Political surveys survive response fall-off, Pew finds

The Pew Research Center’s latest detailed study of survey quality paints a worrisome picture of survey response rates, but again shows that a well-done telephone poll — even one with response rates in the high single digits — accurately represents the U.S. population on a broad range of political and demographic measures.

The apparent accuracy of results from a survey where fewer than one in 10 people completed the interview parallels previous studies showing a weak connection between response rates and survey quality. A 2008 study of more than 100 surveys, including many media polls, concluded that “lower response rates do not notably reduce the quality of survey demographic estimates.”

The centerpiece of the new Pew report was a specialized version of its standard five-day national survey, which dials randomly selected landline and cellphone numbers. Instead of asking a series of questions about national politics, Pew included a battery of questions comparable to those on high-response-rate surveys conducted by the federal government.

Pew also compared these results to voter and consumer databases, as well as to results from a “high-effort” approach that made at least 15 attempts to interview initial non-respondents and employed cash incentives, letters of encouragement and interviewers specially trained to coax responses from people disinclined to answer. Even after all that effort, the survey achieved a response rate of just 22 percent.

The standard Pew survey came close to government estimates of citizenship, voter registration and a variety of other demographics (though it overestimated educational attainment), and was similar to voter databases on party registration and consumer databases on economic indicators.

One key miss in the Pew standard survey — with implications for the accuracy of many non-political surveys — is on measures of political activism. Respondents in the Pew survey were three times as likely to say they had contacted a public official in the past year as those in the government’s Current Population Survey (31 vs. 10 percent). The Pew survey also vastly overestimated the number of people who volunteered and talked regularly with neighbors.

Because voting is itself a gauge of activism, the effect on election polls might be limited, but the potential bias in other types of surveys is evident. Moreover, to the extent that political campaigns motivate previously disengaged adults to the ballot box, political surveys might be skewed as a result.

Possible error in political polls is further limited because the overestimation of community activism is unrelated to partisanship or ideology. Democrats outnumbered Republicans by five points in the overall survey and by six points when the results were adjusted to match government estimates of volunteering.

The full report is a must-read for poll watchers, as such detailed reports on survey quality are few and far between.

(Calculations of response rates vary, but by AAPOR’s RR3 — the American Association for Public Opinion Research’s third standard response-rate formula — Washington Post-ABC News polls this year average 20 percent for landline samples and 16 percent for cell-only samples.)
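For readers curious how a figure like AAPOR RR3 is produced, here is a minimal sketch of the formula. The function follows AAPOR’s published definition; the disposition counts in the example are invented for illustration and do not come from any poll discussed above.

```python
def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """AAPOR Response Rate 3.

    Complete interviews (I) divided by all interviews plus all
    presumed-eligible non-interviews, with unknown-eligibility
    cases counted at an estimated eligibility rate e.
      I  - complete interviews      P  - partial interviews
      R  - refusals/break-offs      NC - non-contacts
      O  - other eligible non-interviews
      UH - unknown if household     UO - unknown, other
      e  - estimated share of unknown cases that are eligible
    """
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

# Hypothetical disposition counts, for illustration only:
rate = aapor_rr3(I=800, P=50, R=1200, NC=1500, O=100, UH=400, UO=600, e=0.5)
print(f"{rate:.1%}")  # prints 19.3%
```

The choice of e is why "calculations of response rates vary": two analysts with identical call records can report different rates depending on how many unknown-eligibility numbers they assume were working, eligible households.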

Scott Clement is the polling director for The Washington Post, conducting national and local polls about politics, elections and social issues. He began his career with the ABC News Polling Unit and came to The Post in 2011 after conducting surveys with the Pew Research Center's Religion and Public Life Project.