Polling Controversy Raises Questions of Disclosure

My print column this week examines a polling controversy. The American Association for Public Opinion Research publicly criticized Strategic Vision LLC for not cooperating with an Aapor review of 2008 primary polling. Blogger Nate Silver analyzed the firm’s numbers and said he found statistical irregularities. And the firm, though it didn’t respond to Wall Street Journal requests for comment, has defended its work in comments to other publications. “I stand by our work,” David Johnson, the firm’s chief executive, told Washington, D.C., paper the Hill last month. “We’ve done the work, and we can prove that we’ve done it.”

Mathematicians said the Silver analysis — finding that certain digits showed up far more often than others in Strategic Vision polls — was troubling, but they want to see more evidence. Jordan Ellenberg, a University of Wisconsin, Madison, mathematician, blogged that the case isn’t as persuasive as investigations into possible fraud in the Iranian election. “It’s not so substantial that I would have gone public with it, if it were me,” Ellenberg said, but he does think it merits further investigation.

“To strengthen the argument that Strategic Vision’s (or any other polling group’s) numbers seem unusual, the next step would be to assess the observed variation across a number of similar polling organizations and see where various groups fall,” said Lance Waller, a biostatistician at Emory University.

Strategic Vision is a Republican firm, but its critics — Aapor and the National Council on Public Polls, which also issued a statement calling for disclosure — aren’t partisan, and the firm’s polls haven’t been notably favorable to Republican candidates.

Two think tanks that are clients of Strategic Vision also are seeking more details on the firm’s methods in light of Silver’s analysis. The Goldwater Institute, which calls itself a free-market think tank, and the Oklahoma Council of Public Affairs hired Strategic Vision to test high-school students’ civic knowledge in Arizona and Oklahoma, respectively.

After Silver questioned the Oklahoma results as being too bleak, both think tanks sought verification from Strategic Vision. “Although I find it very unlikely that Strategic Vision manufactured this data, I have asked for receipts from the marketing firm from which they purchased the contact data just to make certain,” said Matthew Ladner, vice president of research for the Goldwater Institute.

Brandon Dutcher, vice president for policy for the Oklahoma group, isn’t making up his mind just yet. “I have requested voluminous survey data from them, as well as answers to some methodological questions — all of which I expect they can and will provide so that they can go about defending their firm and I can go about defending this survey,” Dutcher said. “If not, however, then of course I would want my money back and wouldn’t hire them again.”

Requests for further information from Strategic Vision in the past have met with positive results. Several polling aggregators, including Tom Silver of the Polling Report (no relation to Nate Silver), said that the firm has answered their questions.

But Nate Silver remains skeptical. “I think I’ve moved the ball a lot and built a pretty persuasive case,” he said, pointing to calculations posted on his site that suggest the probability of such a digit distribution arising by chance alone is about one in 5,000. “In an intellectual sense, the burden of proof is on their side.” Silver said he’d like to see the firm disclose the name of the call center it has hired and other details about its polls.
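Silver’s digit analysis rests on a standard statistical idea: the trailing digits of a large batch of poll percentages should be distributed roughly uniformly, and a heavy skew toward particular digits is unlikely to arise by chance. A minimal sketch of that kind of check — not Silver’s exact method, and using made-up poll numbers for illustration — is a chi-square test of last-digit frequencies against a uniform expectation:

```python
# Illustrative sketch only (not Nate Silver's actual calculation):
# compare the trailing digits of a batch of poll percentages to the
# roughly uniform distribution expected by chance, via chi-square.
from collections import Counter

def trailing_digit_chi_square(results):
    """Chi-square statistic for the last digits of poll percentages,
    measured against a uniform expectation over the digits 0-9."""
    digits = [r % 10 for r in results]
    counts = Counter(digits)
    expected = len(digits) / 10.0
    return sum((counts.get(d, 0) - expected) ** 2 / expected
               for d in range(10))

# Hypothetical poll percentages (a real analysis would use hundreds
# of numbers drawn from the firm's published polls).
sample = [47, 43, 51, 38, 47, 57, 47, 33, 47, 67,
          48, 42, 47, 37, 57, 47, 52, 47, 44, 47]
stat = trailing_digit_chi_square(sample)
# With 9 degrees of freedom, a statistic above about 16.9 is
# significant at the 5% level; this lopsided sample far exceeds it.
suspicious = stat > 16.92
```

The larger the statistic, the less plausible it is that the digit pattern is mere chance — which is the logic behind Silver’s roughly one-in-5,000 figure, though his posted calculation should be consulted for the details.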

Other pollsters argued for more disclosure. “As an AAPOR member, I’ve pledged to disclose basic methodology in any poll I release, and I think the public has an absolute right to know the basics,” said Clay Richards, retired assistant director of the Quinnipiac University Polling Institute. Added Aapor president Peter Miller, “We are asking for information that any public-opinion researcher should provide so that study findings can be interpreted and replicated.”

Pollsters are unsure of how to identify bad polls. “I’m not sure there’s a gatekeeper mechanism that works,” said Mark Blumenthal, a former Democratic pollster who now covers the industry at Pollster.com. “I propose we do a better job of scoring disclosure to create incentives for pollsters to disclose more.” His colleague at Pollster.com, Charles Franklin of the University of Wisconsin, Madison, said, “If [Bernie] Madoff could fool the SEC, not to mention investors, unscrupulous pollsters could also hide details or even fake some things in a far less scrutinized and regulated industry.”

Some pollsters do take steps to check their surveys. The Centers for Disease Control and Prevention’s National Center for Health Statistics monitors its phone surveys, and recontacts 5% to 10% of respondents to in-person surveys, according to a spokesman.

Aapor plans to spotlight pollsters that use sound methodology and disclose their methods. The thinking, said Miller, is, “How can we make this a positive for organizations, as opposed to a club with which to beat organizations when they fail to be transparent?”

Tom Jensen, communications director of the firm Public Policy Polling, said the news media bears some responsibility for reporting questionable polls. “I don’t think polls should be reported if basic information to see if the numbers are valid is not given, such as party breakdowns,” Jensen said.

About The Numbers

The Wall Street Journal examines numbers in the news, business and politics. Some numbers are flat-out wrong or biased, while others are valid and help us make informed decisions. We tell the stories behind the stats in occasional updates on this blog.