It's one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. "Whenever the people are well-informed, they can be trusted with their own government," Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it's an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won't it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It's this: Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

Make sure to read the rest. Along with Shankar Vedantam's 2007 and 2008 articles in the Washington Post, Keohane's article is the best summary to date of what we know about political misperceptions and the difficulty of correcting them.

This is a very interesting study. I recently posted an entry on my blog about the same subject. I've now posted an update linking to your study and the Boston Globe story.

After reading your report, I needed to fact-check your survey questions, and I've found that the "corrections" are all technically correct. They are indeed factual, even though they don't always tell the complete story.

For instance, tax revenues after the Bush tax cuts did not increase, but they didn't significantly decrease either. And while no stockpiles of WMDs were found in Iraq, there is photographic evidence of a suspicious 50-truck convoy (contents unknown) from Iraq to Syria days before "Shock and Awe."

I am slightly concerned that your highly politically educated subjects may have balked at the corrections for that reason. Despite those concerns, I believe that your results do reasonably support the psychological phenomenon you have described.

You know why people don't believe what they read in the newspapers? Because reporters like Keohane, writing a story with full access to all principals and under no deadline pressure whatsoever, can't manage to keep straight whether the researchers about whom he wrote were at Michigan or Duke. Admittedly it makes no difference to the gist of his article, but if a reporter can't get the simple facts right, why should anyone have confidence he'll accurately summarize the scholarly articles or understand their conclusions?

I think I remember reading something about that in my psychology class. I must say, if you ever move to a place where most people don't share your political views, you will see it immediately. It was a good summary of the research that I have seen, though.