The number of published opinion polls into British general election voting intention hit a twenty-plus-year high in 2009, with 141 polls carried out during the calendar year.

That is the highest figure since at least 1987 (when my records commence)* and more than completes the polling industry’s recovery from its post-1992 nadir. You can see the quarterly trend in this graph:

The 1992 general election was a bad one for the British political polling industry. During the campaign, the vast majority of polls put Labour ahead and, of the final round of polls, three put Labour ahead, one had Labour and the Conservatives neck-and-neck and only one – Gallup – gave the Conservatives a lead, and even that was a mere 0.5%. The actual result? A Conservative lead of 7.6%.

The response of the polling industry was a series of post-mortems and experiments with changes in methodology. Amongst those who commissioned polls, though, the response was also one of greater scepticism about the value of commissioning them. Add in first the economic pressures of the 1990s and then the widespread sense that a Labour general election victory was inevitable after Tony Blair became Labour leader, and it is no surprise that during the 1992-97 Parliament the number of opinion polls was consistently lower than in 1987-1992.

The prospect of an election more keenly contested than any since 1992, the declining cost of polling (thanks first to phone and then to internet polling) and the wider range of outlets commissioning polls (including, in 2009, Political Betting) have resulted in this bumper year of polls.

Whether politics or political commentary is the better for this is another matter, particularly when you bear in mind that the standard margin of error on polls is +/- 3 percent. A sequence of two polls, the second showing a party one point higher than the first, really tells us very little about any actual change in party support, because the two margins of error overlap so heavily.
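Where that +/- 3 percent figure comes from can be sketched in a few lines. The sample size of roughly 1,000 respondents is my assumption here (it is typical for British voting intention polls but is not stated in the post):

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for an estimated share p from a random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# A party on 50% in a poll of 1,000 people: the worst-case margin
print(round(margin_of_error(0.50, 1000) * 100, 1))  # ~3.1 points
# A party on 35% has a very slightly tighter margin
print(round(margin_of_error(0.35, 1000) * 100, 1))  # ~3.0 points
```

The margin shrinks only with the square root of the sample size, which is why pollsters settle for samples of about a thousand rather than paying four times as much to halve the margin.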

That’s a point often lost in commentary where changes well within that overlap are breathlessly described as party support rising, nose-diving, shifting and responding to events.

But of the 141 polls across 2009, only 13 showed a statistically significant shift** in Conservative support from the previous poll by the same pollster and only 16 showed such a shift for Labour. Just 11 showed a statistically significant shift in Liberal Democrat support.
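That counting exercise is simple to reproduce. A minimal sketch, using an entirely made-up sequence of one pollster's shares (the real 2009 data is not in the post) and the post's 4.7-point threshold from footnote two:

```python
# Hypothetical sequence of one pollster's Conservative shares (%), for illustration only
shares = [40, 41, 39, 44, 43, 38, 40]

SIGNIFICANT = 4.7  # poll-to-poll change needed for significance, per the post

# Compare each poll only with the previous poll by the same pollster
shifts = [abs(b - a) for a, b in zip(shares, shares[1:])]
significant = [s for s in shifts if s > SIGNIFICANT]

print(shifts)             # [1, 2, 5, 1, 5, 2]
print(len(significant))   # 2 of the 6 changes clear the threshold
```

Note that comparing like with like – each pollster against its own previous poll – matters, because different pollsters' methodologies produce systematically different levels of support.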

Headlines along the lines of “Sorry, no news from our latest poll as the changes are all too small to be statistically significant” though are not exactly common.

As the number of polls looked at grows, conclusions can be drawn with more confidence about changes in party support which would not be significant if they were present in only one poll. But that requires reports to put polls in their proper context alongside other recent polls, something which the traditional media is still very poor at. Hopefully 2010 will see that become the norm, with the traditional media starting to catch up with the standards of political poll reporting which are common amongst political bloggers.

* Although there are many records of public opinion polls published before then, none that I’ve found provide enough information across all the polling companies to replicate this information in earlier years, even when combining different sources. It’s therefore most likely true that these are the highest figures ‘since records began’ 🙂 … but if you know otherwise, do let me know.

** Although an individual poll usually has a margin of error of +/-3% (assuming no systematic errors), for the change between two polls to be statistically significant it has to be greater than approximately 4.7%. That is because, for example, if a party’s support has stayed static at 35%, the first poll may show the party at 33% and the second at 37% – an increase of 4% which does not actually signify a statistically significant shift. An explanation of the maths is here.
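The footnote's larger threshold for changes can be sketched as follows. Under the simple assumption that the two polls are independent samples, their margins combine in quadrature, giving roughly 4.2 points; the post's 4.7% figure presumably follows the linked maths and is a little more conservative, so treat this as a rough sketch rather than the author's exact calculation:

```python
import math

single_moe = 3.0  # +/- margin of error on one poll, in points

# Variances of independent samples add, so margins combine in quadrature
change_moe = math.sqrt(single_moe**2 + single_moe**2)
print(round(change_moe, 1))  # ~4.2 points under this simple assumption

# The post's example: true support static at 35%, polls read 33% then 37%
change = 37 - 33
print(change > 4.7)  # False - a 4-point swing is not significant
```

Either threshold makes the same point: a poll-to-poll movement has to be substantially bigger than the single-poll margin of error before it means anything.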

3 Comments

Also don’t forget that “statistically significant” usually implies a 95% confidence interval, i.e. 1 in 20 polls will show a significant change when there really is none. So around 10% of polls showed a significant shift for each party – about half of those are Type I errors – which leaves only 5% of individual polls showing genuine significant shifts.

I really don’t know where to post this as I have no access to any forums anymore and, not surprisingly, none of the people I’ve stayed in touch with are from Somerset or in the party. So here goes:

I have a whole bag of campaigning stuff that I want to see the back of – I have decided not to throw it away or recycle it – I thought it would be good to donate it to a Lib Dem campaigner. I’ve decided that’s the lesser of two evils.

I have in the bag:

4 membership notebook things
2 A3 Lib Dem posters
A pack of LDYS cards with all the MPs on
A three tiered rosette
Britain after Blair
That Charles Kennedy book (hardback)
various badges plus tons of Lib Dem bits and bobs which I don’t want to fill landfill with…

It sounds weird but if you live in Somerset and want some stuff email me at jo_anglezarke[at]yahoo[dot]co[dot]uk

Ian: I think you’re doing some double-counting there, because the +/- 4.7% test for statistical significance when comparing two polls already takes into account that one poll may have been a rogue poll. Hence it’s 4.7% rather than 3%.
