10 Things Pollsters Won't Tell You

Why you should think twice about all those survey results.

1. "The way we ask the questions can determine the answers."

As voters around the country cast their ballots Tuesday, they may feel their decision is pretty clear cut. But when pollsters ask questions, the answer isn't always as simple as checking a box. Pollsters "can influence the results with artful word choices," says Michael Cornfield, acting director of the Political Management Program at George Washington University in Washington, D.C. Pollsters who work for or are paid by particular causes or candidates can easily skew results in their clients' favor based on the way the questions are worded, adds L.J. Shrum, professor and chair of marketing at the University of Texas at San Antonio College of Business. "But this wording is often not made clear when the results are reported."

Two questions on the same issue can draw very different responses. Compare "Do you favor cutting government spending to balance our budget?" with "Do you favor cutting Medicare for the elderly and Social Security for the poor?" What's more, politicians often take their lead on issues from polls, experts say. If, in the example above, a good percentage of respondents answered yes to the latter question, that could in theory lead elected officials to believe that voters were aggressively seeking cuts to Medicare and Social Security, Shrum says.

The placement of a question can have an even greater impact on poll results than the wording, particularly when it comes to politics and social issues, says Scott Keeter, director of survey research at Pew Research in Washington, D.C. "If I ask you about the financial troubles the country is facing and then ask you about the job that Obama is doing as president you may be less likely to give him a favorable rating," says Keeter. "But if I ask you about targeting senior figures in Al-Qaeda and the killing of Osama Bin Laden first, you may rate the president better based on his security record."

Pollsters say they avoid this problem by asking the same question in different ways to reduce bias and error. "It simply isn't as easy as asking, 'If the election for president were tomorrow, for which of the following candidates would you vote?'" says Brandon M. Macsata, managing partner at The Macsata-Kornegay Group, a consulting firm in Washington, D.C.

This story has been updated; it originally ran on Jan. 10, 2012.

2. "Our results are manipulated or just plain wrong."

Marketers love to get consumer opinions on everything from the cheapest plane tickets to the most popular brands of the moment. But some consumer questionnaires are backed by companies that have their own agenda. Pollster Peter Graves, for example, conducted a survey for a business magazine about the best employers in a particular state. "The editor of the magazine massaged the results so that his best advertisers were the best employers to work for," Graves says. "He wanted to make sure his buddies came out on top of the list." Graves, who was shocked by the incident, explains that his poll was crucial in highlighting the salary, working conditions and financial benefits received by employees. The results could potentially encourage readers to work for a particular company on the list without realizing that the results are not what they seem, he adds.

Poll results can also be subject to misinterpretation and errors, experts say. Both Continental and American Airlines complained after they received bad ratings on a customer service survey released after Hurricane Irene hit last year. Turns out the survey, which was conducted over the phone and on Twitter, included results from defunct Twitter accounts. An American Airlines spokesman says the company handled over 100,000 calls on Aug. 26, 2011, the Friday before the hurricane, and customers waited an average of 21 minutes -- not the 1 hour and 32 minutes quoted in the survey. Consumer advocate and author Christopher Elliott says the airlines had good reason to complain. Consumers often choose their airlines based on service as well as price, he says.

3. "People lie to say what they think is acceptable."

Survey respondents tend to stretch the truth when they are asked questions they deem to be taboo, analysts say. This can lead to polls under-representing how often people might, say, cheat on their taxes or pad their work expenses. The phenomenon even has a name. "It's called social respectability bias," Cornfield says. Here's a classic example. When asked something like, "Would you vote for a Mormon (or an African American, or a gay person) for president?" people know what the respectable answer is and they're inclined to give it, says Cornfield. "But they don't necessarily do what they say," he adds.

Good pollsters can manage this problem by attempting to normalize undesirable behavior, Keeter says. Before asking someone if they've registered to vote, for example, they may say something like, "A lot of people say they have yet to register. Are you one of them?" This makes it easier for people to admit to something that they may not be proud of.

And sometimes stretching the truth leads to a good end. "People tend to over-represent how often they give to charity," says Keeter. He says this could -- in theory -- encourage people to give more if they genuinely believed their friends and neighbors were doing the same. According to an annual poll conducted by the International Charities Aid Foundation, the U.S. ranked as the most generous in the world in terms of time and money in 2011, up from No. 5 in 2010.

4. "The frontrunner should be afraid, very afraid."

President Obama's shifting fortunes in the polls over the months leading up to Tuesday's election are a textbook example of the pitfalls of an early lead. Frontrunners often fall under the unwelcome spotlight of the media, and often have more ground to lose than an upstart challenger, analysts say. Obama and his Republican rival Mitt Romney are now deadlocked, polls show. As voting began, Obama led 48% to 47% -- a difference of just seven voters among a pool of 1,475 surveyed, according to the latest Wall Street Journal/NBC poll, which has a margin of error of plus or minus 2.55 percentage points. But back in July, Obama had a clear lead of 49% to 43%. Keeter says polls can't change public opinion in a direct way, but they can create a "bandwagon" or "underdog" effect.
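The margin of error quoted above follows from the standard formula for a 95% confidence interval on a sample proportion: 1.96 times the square root of p(1-p)/n, with p = 0.5 as the worst case. A minimal sketch (assuming a simple random sample, which real polls with weighting only approximate):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a sample proportion.

    p = 0.5 is the worst case, which pollsters conventionally report.
    Assumes a simple random sample; weighted polls only approximate this.
    """
    return z * math.sqrt(p * (1 - p) / n)

# The WSJ/NBC poll's sample of 1,475 respondents:
moe = margin_of_error(1475)
print(f"{moe * 100:.2f} percentage points")  # ~2.55, matching the figure above
```

Note that a 1-point gap sits well inside a ±2.55-point margin, which is why such a race is called a statistical dead heat.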

Romney reaped the rewards of trailing in the polls after his strong performance during the first debate. That Obama held his ground in the second and third debates did little to reverse the damage done in the first one, experts say. "The performance on the first debate made the anti-Obama intensity factor more relevant," Macsata says. "The more conservative elements of the Republican party started to find Romney a little easier to stomach."

Romney has benefited from being the underdog throughout this campaign, experts say. During the Republican primary season, Herman Cain was in the lead before accusations of sexual harassment, which he publicly denied, brought down his campaign. And, back in July, Rep. Michele Bachmann overtook then-leading candidate Mitt Romney, 25% to 21%, among likely Iowa caucus-goers, according to a poll commissioned by TheIowaRepublican.com. Oops. As we all know, Romney squeaked by with eight votes to win Iowa and Bachmann dropped out of the race.

5. "I conducted this poll in my mother's basement."

Every Tom, Dick and Harry seems to be carrying out cheap and often cheerful online polls paid for by marketers that don't adhere to industry standards. When those polls are quoted in magazines or newspapers, few consumers know the difference, says social psychologist Matt Wallaert. The Council of American Survey Research Organizations and the American Association for Public Opinion Research require their members to adhere to strict standards, such as being transparent about who paid for the poll and not pressuring people into answering certain questions.
