The Maine Heritage Policy Center is always wrong about polling.

The Maine Heritage Policy Center and its news organ The Maine Wire are wrong about a lot of things. I could fill up a great deal of space on this blog discussing their shoddy research, crazy policy proposals, or farcically awful political cartoons. I don’t, however, because that would involve an investment of far more time than I’m willing to spend on the conservative think tank/reactionary weblog.

This week, however, they strayed into an area of particular interest for me and got things so horribly wrong that I feel compelled to respond.

In an article on The Maine Wire with no byline (they stopped attributing posts to individual authors shortly after a previous Maine Wire “reporter” was outed as a white supremacist), they claim that the results of a recent poll by Public Policy Polling showing Congressman Mike Michaud leading in the 2014 race for Governor should be discounted because the firm’s staff have made contributions to Democratic causes and because, in June of 2012, polling analyst Nate Silver calculated that the firm had a small “house effect” in favor of Democratic candidates. They also claim, in an echo of the “unskewed polls” nonsense from the 2012 election, that these alleged biases mean “LePage may actually be the frontrunner by more than 2 points.”

Information about the sponsor of a poll is important, as I’ve noted previously, but these objections are completely without merit.

In particular, MHPC seems to fundamentally misunderstand how Silver’s house effect calculations work and how they should be used. A house effect rating in this case is simply a measure of how far a polling organization’s surveys deviate from the average results of all surveys at a single point in time, and it is used to calibrate his prediction model. As Silver explains: “It is not necessarily correct to equate a house effect with ‘bias’ – there have been certain past elections in which pollsters with large house effects proved to be more accurate than pollsters without them – and systematic differences in polling may result from a whole host of methodological factors unrelated to political bias.”
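To make the definition concrete, here is a minimal sketch of the kind of calculation Silver describes. This is my own illustration, not Silver’s actual model, and the poll margins below are invented numbers: each of a firm’s polls is compared to the consensus of all polls taken at the same point in time, and the house effect is the average signed deviation.

```python
def house_effect(pairs):
    """pairs: list of (firm_margin, consensus_margin) taken at the same
    point in time, with margins expressed as (Dem - Rep) in points.
    Returns the average signed deviation; a positive value means the
    firm's polls lean more Democratic than the field as a whole."""
    deviations = [firm - consensus for firm, consensus in pairs]
    return sum(deviations) / len(deviations)

# Illustrative, made-up margins (not real PPP numbers):
effect = house_effect([(5.0, 3.5), (2.0, 1.0), (4.0, 3.0)])
print(round(effect, 2))  # average deviation from the field, in points
```

Note that nothing in this number says the firm is wrong: if the consensus itself is off, a firm with a nonzero house effect can land closer to the actual result, which is exactly Silver’s point.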

House effects measured in this way can change a great deal over time, based on the assumptions made by the pollster and the results of other polls in the field. For example, Silver judged PPP to have a house effect of 1.5 points toward the Republican side in March of 2010.

House effects (especially those measured months before) also don’t necessarily have much to do with how close these pollsters will come to actual election results. If we look at the final Presidential vote, PPP came very close to the actual results and in fact overestimated Romney’s standing compared to Obama by 1.6 points, according to Silver’s rating methods. It’s very strange that MHPC would point to a single house effect rating as a signal of bias while ignoring PPP’s solid track record of accuracy when compared to actual election results both nationally and in Maine.

PPP’s track record is also a clear defense against MHPC’s claims that the principals’ contributions to Democratic candidates influence their results.

In 2012 PPP was the most accurate of any pollster that surveyed the three major races in Maine (President, U.S. Senate, Question 1). They had a combined adjusted error of just 11.83 points off the margins of victory in these three races, compared to 12.86 for MPRC, 20.78 for Critical Insights and 22.93 for Pan Atlantic SMS. (I’m proud to say that MPRC, for which I work, had the best overall record if the Congressional races are also included).
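For readers who want to see how a comparison like this works, here is a minimal sketch, assuming the “combined error” is the sum of a pollster’s absolute misses on the margin of victory across the races surveyed. The race margins in the example are invented for illustration and are not the actual 2012 figures.

```python
def combined_error(polled_margins, actual_margins):
    """Sum of absolute errors, in points, between a pollster's final
    polled margins and the actual margins of victory across races."""
    return sum(abs(p - a) for p, a in zip(polled_margins, actual_margins))

# Hypothetical example with three races (margins in points):
# polled the races at +5, -2, and +10; actual results were +3, +1, +12.
print(combined_error([5, -2, 10], [3, 1, 12]))  # prints 7
```

Summing across races this way rewards pollsters who are consistently close rather than those who happen to nail one race while missing badly on another.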

Sadly, this isn’t the first time MHPC has been completely wrong on an issue of public opinion research. In fact, the more involved they have attempted to become in polling, the more spectacularly they have failed.

In 2009 they released a poll they had commissioned through Critical Insights a week before the election showing TABOR, an initiative they supported, ahead by a two-point margin. (PPP’s poll that same week showed it losing by an 18-point margin.) On Election Day the initiative lost by 20 points.

Similarly, in 2011 they released a poll that they conducted themselves through Pulse Opinion Research using some very sketchy wording and methodology showing the People’s Veto of the elimination of same-day voter registration (a referendum they opposed) losing by a six-point margin. On Election Day it passed by 20 points.

With a record like this, I’m surprised they would even attempt to criticize others. MHPC shouldn’t be trusted to report accurate polling results or to fairly evaluate pollsters. In leveling unfounded accusations against PPP, they have only once again exposed their own biases.

About Mike Tipping

Mike is Maine's longest-writing political blogger and explores state politics and policy with a focus on analysis and explanation. He works at the Maine People's Alliance and Maine People's Resource Center.