It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. ... If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight. In the end, truth will [win] out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information.

12 comments:

This study shows why skepticism is so important. If we know we are likely to deceive ourselves, we are encouraged to subject our beliefs to the scientific method as often as possible, to open up our own beliefs and assumptions to as much scrutiny as possible, to play the devil's advocate on ourselves... That is what I try to do. I try to always ask myself whether I am being consistent in the way I treat the evidence for and against my beliefs, and have always found that very helpful.

The problem is that when we are dealing with life's ultimate issues there's no agreement among the professionals. The facts are there, but there are different interpretations of those facts. Both sides must punt to mystery.

There is the fact that there is no successful naturalistic explanation for the origins of the universe, life, human consciousness, logic, the moral law, the resurrection, etc.

On the other side we have the problem of suffering, which has not been successfully answered without punting to mystery or "I don't know why."

Cole, what are you going on about? Can you define "professionals"? What do you mean by "naturalistic explanation"? Before we can agree on facts, we need to have some definitions of the terms you are using. What do you mean by "human consciousness"? To me it means the opposite of being unconscious, and unconsciousness can be classified into 3 or more stages, medically speaking. Animals can also be conscious or unconscious, so how does "human consciousness" differ?

But, the adage that "the facts have a liberal bias" is a great demonstration of just this very phenomenon. It is not simply Christians or "conservatives" that hold articles of faith dear. The idea that "liberals" are honest people without bad faith has been exposed by the recent "Journolist" scandal, but I hardly think this will change the minds of the most hardened "progressives," any more than exposure to the despicable actions of Rove and company would for neo-con counterparts.

Isn't this an important argument against using ridicule to persuade? Especially if Christians in general are already trapped by guilt and shame?

From the article:

Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

This article relates to the developmental psychology term "disequilibrium" as described by Piaget. He found that we learn when we find our understanding of the world isn't helping us solve a problem, because we are then motivated to try something new. Teachers take advantage of this, not by piling on more facts, but by asking questions students can't answer to increase their curiosity and to let them experience disequilibrium.

Teachers also know that the level of disequilibrium has to be optimal. If there isn't much, students aren't motivated to change; if there is too much, the experience is so anxiety-provoking that they don't try to figure it out. They just retreat to what they already know.

In my last comment I should have added that stories of conversion and deconversion are often good examples of disequilibrium. One has a nagging unanswered question or some type of personal or emotional experience that gets one thinking and questioning in a way one hasn't before. For example, my own spiritual crisis began when I started questioning why people would be sent to hell for not believing in Jesus, especially when they hadn't been taught about him or when they were raised in a culture that would make it nearly impossible to believe in Jesus. My faith hasn't given me good answers to that question, and I have strong emotional feelings about it. That has motivated me to reevaluate the facts.