Why Do Voters Believe Lies?

Neil Malhotra is associate professor of political economy at the Stanford Graduate School of Business.

“There is no such thing as a federal personhood bill.” Or so said Colorado Rep. Cory Gardner, the Republican candidate currently locked in a tight Senate race against Democratic incumbent Sen. Mark Udall, in an interview a few weeks ago. It was a surprising statement—not only because the federal personhood bill, otherwise known as the Life Begins at Conception Act, does in fact exist but also because Gardner himself co-sponsored it. “This is all politics,” he added, blaming Udall for spreading untruths about him.

It was, indeed, all about politics. Gardner’s strong support of personhood legislation might have bolstered his popularity among conservative Republicans. But after declaring his Senate bid, Gardner found himself having to appeal to a more moderate electorate (Colorado voters have repeatedly rejected a personhood ballot measure) and changed his position on the issue. So far, his equivocation hasn’t hurt him.


If Gardner wins on Election Day, he certainly won’t be the only politician to get away with not being totally transparent, which prompts the question: Why do voters fall for misinformation? A common refrain these days is that there is a plethora of “low information” voters. If only those citizens knew more about politics, the argument goes, the problem would be solved. But in fact, the problem is much more complex: It is often the people who are most interested in and informed about politics who are most likely to adopt false beliefs. After all, there has been an explosion of “fact-checking” organizations, and they don’t seem to be breaking through. PolitiFact gave President Barack Obama’s “if you like your health plan, you can keep it” statement about the Affordable Care Act a “Pants on Fire” rating, but that hasn’t stopped liberals from defending his claim.

This is because human beings have a tendency to engage in what is called “motivated reasoning”: they view evidence and make decisions not to better understand “the truth” but, rather, to achieve self-serving goals. One such goal is the reduction of cognitive dissonance—the discomfort of holding two incompatible opinions in one’s mind at the same time. Suppose you supported the Iraq War because of President George W. Bush’s argument that Iraq possessed weapons of mass destruction. When those weapons were not found, you would be forced to reckon with two incompatible ideas: (1) The Iraq War was good policy based on an accurate assessment of the threat; (2) There were no weapons of mass destruction found in Iraq. The dissonance has to be resolved somehow. You could change your position on the Iraq War, but the easiest thing to do is simply to believe that weapons of mass destruction were in fact discovered.