As online surveys become more prevalent, researchers face many challenges associated with this shift to increased web-based interviewing. We are often asked by clients how, if at all, data collected online differs from a more traditional telephone survey. Last month, Pew Research Center released the results of an experiment comparing data from two replicate national samples, with one collected over the telephone and the other self-administered on the web (link to the full report here).

In particular, the Pew report examines mode differences, that is, differences in responses attributable to the mode in which a question is administered (in this case, telephone versus web). The highlights of this study and the conclusions drawn by the Pew team provide valuable insights for researchers as they weigh the relative benefits and drawbacks of these two dominant opinion research methodologies.

In general, the Pew team found that differences in responses by survey mode are common, but typically not large. In fact, many of the variations between the phone and web samples were not statistically significant and fell within the study’s margin of error.
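For readers unfamiliar with how a survey's margin of error is derived, here is a minimal sketch of the standard calculation at 95% confidence. The sample size of 1,000 is a hypothetical illustration, not a figure taken from the Pew report.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the 95% confidence interval for a proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) for a hypothetical 1,000-respondent sample:
moe = margin_of_error(0.5, 1000)
print(f"{moe * 100:.1f} percentage points")  # roughly 3.1
```

A phone-versus-web gap smaller than this half-width could plausibly be sampling noise rather than a true mode effect, which is why many of the observed differences were not treated as meaningful.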

However, on three distinct types of questions sizable mode effects were observed:

Political Measures

The experiment found that ratings of political figures were significantly more negative among the web-based sample. As the table below shows, web respondents were much more likely to give political figures a “very unfavorable” rating.

The differences grew even larger when respondents rated political figures from the opposite party. Among Republicans, 53% described Clinton as “very unfavorable” on the web versus 36% on the phone. Similarly, on the web-based survey 63% of Democrats rated Palin as “very unfavorable,” compared to 44% in the phone sample.
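To see how researchers judge whether a gap like 53% versus 36% reflects a real mode effect, here is a sketch of a standard two-proportion z-test. The percentages come from the comparison above, but the per-mode sample sizes of 500 are hypothetical placeholders, not figures from the Pew report.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)          # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Web 53% vs. phone 36%, with hypothetical n = 500 per mode:
z = two_prop_z(0.53, 500, 0.36, 500)
print(round(z, 2))  # |z| above ~1.96 is significant at the 5% level
```

Even with these modest assumed sample sizes, a 17-point gap produces a z statistic far beyond the conventional 1.96 threshold, which is consistent with Pew flagging these political measures as showing sizable mode effects.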

A similar pattern emerged on questions that asked survey participants to rate the political parties using a feeling thermometer ranging from cold/negative to warm/favorable.

Mode differences also occurred on image questions for lesser-known political figures. The Pew team found statistically significant differences in the number of respondents who said they had “never heard of” a figure.

The research team hypothesized that the presence of an explicit “never heard of” option on the self-administered web survey drove this particular mode difference. Interviewers on the phone study were instructed to accept “never heard of” as a response, but because it was never read aloud as an explicit choice, phone respondents were less likely to volunteer it.

Discrimination Measures

The Pew experiment uncovered significant mode differences on questions related to discrimination against different groups. Respondents in the phone sample were more likely to agree that groups faced “a lot” of discrimination than were people interviewed via the web-based survey.

Happiness and Life Satisfaction Measures

The third area where the research team found significant mode differences was on questions related to happiness and life satisfaction. In general, respondents being interviewed over the phone were more likely to give a positive/favorable response to these types of questions.

Similar trends emerged on questions related to the community where a respondent lives, with those interviewed over the phone more likely to give a positive/favorable response on issues ranging from traffic conditions to the area being a good place to live.

The researchers found some mode differences on questions asking respondents to describe their own community involvement and engagement, with telephone respondents reporting higher participation in several of these metrics.

On issues related to personal finances, web respondents were more likely to reveal that they have faced financial challenges in the past year compared to the phone sample.

The mode differences on these financial questions were even more pronounced among subgroups such as low-income respondents and non-white respondents.

Key Takeaways

In general, the mode differences found by the Pew team are consistent with the theory that some telephone respondents give answers that make themselves look more favorable, or that are less likely to produce an uncomfortable interaction with the interviewer. By this logic, it makes sense that differences emerged on questions about divisive political figures, discrimination, and sensitive personal issues. However, the Pew team stressed that while a handful of questions showed significant differences by mode, an equal number showed small or nonexistent differences.

So… what exactly does this information mean for researchers?

There are strengths and weaknesses to both phone and web methodologies. While internet-based surveys may elicit more candid responses from respondents, phone-based studies continue to provide access to a more representative sample. It is up to researchers to examine the goals of each project and determine which methodology best suits the client’s goals and needs. Moving forward, the Pew research team hypothesized that mixed-methodology surveys may become more prevalent as researchers strive to balance cost-effective interviewing, encouraging honest and thorough responses, and achieving a representative sample.
