In the aftermath of the momentous Greek elections, this seems worth mentioning: the Greek exit polls were fairly close to the mark, and the last pre-election polls approached the actual election results almost as closely as the exit polls.*

The latter, especially, seems impressive, not least because their performance was hardly a given. Ahead of the May 2012 elections, the pollsters entirely failed to capture the dynamic of the electorate. (Admittedly, it’s not easy to poll a watershed election which all but broke up the entire Greek party system, and they did approach the results of the June 2012 elections much more closely.)

The 2015 elections: how the pollsters did

Chart: How closely did the last pre-election polls approach the actual election results?

This time, a Pro Rata poll that was in the field 5-6 days before the elections came so close to the actual results that it was off by an average of just 0.6 percentage points per party. Even the “worst” poll was off by an average of only 1.2 points per party.
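The “average deviation by party” metric here is just the mean absolute difference between a poll’s party shares and the official shares. A minimal sketch in Python, using illustrative placeholder numbers rather than the real Pro Rata figures:

```python
def mean_abs_deviation(poll, result):
    """Average absolute error, in percentage points, over the
    parties that appear in both the poll and the result."""
    parties = poll.keys() & result.keys()
    return sum(abs(poll[p] - result[p]) for p in parties) / len(parties)

# Illustrative poll shares (not the actual Pro Rata numbers)
poll = {"Syriza": 35.5, "ND": 28.5, "XA": 6.0, "KKE": 5.0}
# Rounded actual 2015 vote shares for the same parties
result = {"Syriza": 36.3, "ND": 27.8, "XA": 6.3, "KKE": 5.5}

print(round(mean_abs_deviation(poll, result), 3))
```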

Interestingly, though, to the extent that the polls in the last few days before the elections did miss the mark, they did so in a distinct pattern. Take the average of each pollster’s final poll: it turns out to have underestimated every anti-bailout party, left and right alike (Syriza, XA, KKE, ANEL), and overestimated every pro-bailout party, again on both left and right (ND, Potami, PASOK, KIDISO). Some of the deviations were tiny (a tenth of a percentage point), but it’s still a striking pattern.

In particular, the average of the final polls had the incumbent government party, New Democracy, 1.8% higher than the share of the vote it eventually received, while it had the Independent Greeks 1.1% lower.
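The sign pattern described above can be checked mechanically: average each party’s share across the pollsters’ final polls and subtract the official result, so a positive number means the party was overestimated. A sketch with made-up poll figures (only the election results are real, rounded shares):

```python
# Illustrative final-poll numbers from two hypothetical pollsters
final_polls = [
    {"ND": 29.0, "Syriza": 35.0, "ANEL": 4.0},
    {"ND": 30.0, "Syriza": 34.5, "ANEL": 3.5},
]
# Rounded actual 2015 vote shares
result = {"ND": 27.8, "Syriza": 36.3, "ANEL": 4.8}

def signed_errors(polls, result):
    """Per-party signed error of the average of the final polls:
    positive = overestimated, negative = underestimated."""
    errors = {}
    for party in result:
        avg = sum(p[party] for p in polls) / len(polls)
        errors[party] = avg - result[party]
    return errors

for party, err in signed_errors(final_polls, result).items():
    print(f"{party}: {err:+.2f}")
```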

Those averages are worth dwelling on. There is always a fair amount of debate about whether it’s a good idea to simply average the polls from different pollsters, or even to apply more sophisticated aggregations the way Nate Silver does. In this case, at least, it worked: the simple average of every pollster’s final pre-election poll came closer to the actual result than any single individual poll did.
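That claim is straightforward to verify mechanically: compute the simple average of the final polls, then compare its mean absolute error against each individual poll’s. A sketch, again with illustrative poll numbers (the two actual shares are real, rounded figures); averaging helps here because the polls scatter on both sides of the result:

```python
# Rounded actual 2015 vote shares
result = {"Syriza": 36.3, "ND": 27.8}

# Illustrative final polls, erring in different directions
final_polls = [
    {"Syriza": 35.0, "ND": 29.5},
    {"Syriza": 38.0, "ND": 26.5},
    {"Syriza": 35.5, "ND": 28.5},
]

def mae(poll):
    """Mean absolute error of one set of shares vs. the result."""
    return sum(abs(poll[p] - result[p]) for p in result) / len(result)

# Simple average of the final polls, party by party
average = {p: sum(poll[p] for poll in final_polls) / len(final_polls)
           for p in result}

print("average-of-polls MAE:", round(mae(average), 2))
print("best individual MAE: ", round(min(mae(p) for p in final_polls), 2))
```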

Take a look at the Google spreadsheet with all the data I used on the final pre-election polls (either by awkwardly navigating the scrollbars below or by clicking this link).

The exit polls, meanwhile, were on balance too pessimistic for New Democracy, and a tad too optimistic for Syriza. Here’s the spreadsheet on those:

* To source opinion poll results I relied entirely on the seemingly exhaustive listing maintained by the Wikipedia editors. It’s important to note that they adjusted every pollster’s numbers, using a simple rule of three, to disregard respondents who were undecided or said they would abstain (either by staying home or by casting a blank ballot).
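That rule-of-three adjustment is just a rescaling so the remaining party shares sum to 100% once the undecided and abstaining respondents are dropped. A sketch with illustrative raw numbers:

```python
# Illustrative raw poll shares, summing to 100% including non-voters
raw = {"Syriza": 30.0, "ND": 24.0, "XA": 5.0, "KKE": 4.0, "Other": 17.0,
       "Undecided": 12.0, "Abstain/blank": 8.0}
excluded = {"Undecided", "Abstain/blank"}

# Total share held by actual party choices
valid = sum(share for party, share in raw.items() if party not in excluded)

# Rule of three: rescale each party's share to a 100% base
adjusted = {party: share * 100.0 / valid
            for party, share in raw.items() if party not in excluded}

print(adjusted)
# → {'Syriza': 37.5, 'ND': 30.0, 'XA': 6.25, 'KKE': 5.0, 'Other': 21.25}
```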