I’ll See It When I Believe It

I have a friend who, as a feminist, is quick to criticize any comment or behavior that denigrates women. So when Steelers quarterback Ben Roethlisberger was arrested for his seedy behavior with a coed in a Georgia bar, you would think she would have been all over it. But she is also a lifelong Steelers fan, and she was actually one of his most ardent defenders. She simply did not see enough evidence that he had actually meant to do anything wrong.

My friend exemplifies one of the most common shortcomings in our critical thinking: confirmation bias, the tendency to overlook or ignore any evidence that does not accord with the conclusions we have already reached. I'm not talking about people forming a conclusion and deliberately ignoring contradictory information; when you're advocating a position, that's often exactly your job. What I am referring to is when you overlook information in spite of your best efforts or intentions. It may be harmless when it comes to judging a public figure, but it can have costly repercussions in more important areas.

Confirmation bias can cause us to ask questions in such a way that we get the answers we expect to hear, which is a common problem in market research. Motorola employed over 100 people and several consulting groups to conduct market research before launching its Iridium satellite phone service, and its survey questions indicated that there would be enthusiastic acceptance. One sample question began: “There will soon be a new personal telephone service which at reasonable cost will provide you with the capability to be reached or to place calls anywhere in the world using satellite technology, which is not limited in coverage like a cellular phone. To access the service you would have a small handset that fits in your pocket…” The survey did not mention that “reasonable cost” meant a handset price of $3,000 and usage fees of $3 per minute, or that “anywhere” required a line-of-sight view of an orbiting satellite, which meant the phone could not be used indoors. (Billion Dollar Lessons, Carroll and Mui) In reality, only 3,000 handsets were sold, and the service folded within a year of launch, costing $5 billion in the process. (Please see Iridium’s side of the story, at the end of this post.)

Motorola’s example demonstrates that if you try hard enough, you can usually find enough data to support your point of view. As psychologist J. Edward Russo says, “If you torture the data long enough, it will confess.”

How many people have been unjustly convicted through confirmation bias? 60 Minutes once ran a show in which three professional polygraphers were each asked to test a subject who, they were told, was suspected of stealing a camera. There had been no theft, and each polygrapher had a different suspect, but all three judged their subject to be deceptive.

Confirmation bias is one of the reasons that first impressions can be so important: we tend to notice only information that confirms that first impression. Sometimes it can even lead to self-fulfilling prophecies. Because we mostly notice evidence that supports our initial impression, people tend to live up, or down, to our expectations.

It can also close us off from diverse viewpoints that would make us better informed. An Ohio State study, for example, showed that people spend 36% more time reading essays that support their positions. The bias is also one of the principal reasons that social prejudices can be so hard to eradicate. If I suddenly got it into my head that people in red cars drive aggressively, you can bet that I would see aggressive red cars everywhere and would not remember the safe ones.

What can you do about it?

By yourself:

Always solicit disconfirming evidence before making a decision. Ask yourself: what would be true if my hypothesis were not correct? Then try to find instances of that.

When you do come across disconfirming information, do as Darwin did and write it down immediately, before you forget it!

Write down reasons why your assumptions and hypotheses might be wrong and try to prove them so.

With others:

Get others to try to poke holes in your ideas. As a side benefit, you will have much more confidence when you do present your ideas for final approval.

Try to frame your questions to seek disconfirming evidence. One successful industry analyst credits this habit for his success: if he thinks price competition is decreasing, for example, he will ask, “Is it true that price competition is increasing?”

Hold a PreMortem meeting. With some colleagues, imagine that it is some future time and your idea has failed. Try to figure out all the ways it could have happened. (The Power of Intuition, Klein)

If you’re a manager:

Ask your subordinates what other hypotheses they have considered or what evidence they have found that contradicts their point of view. If they say they haven’t found any, that could be a red flag.

Generate disagreement. Alfred Sloan, the longtime chairman of GM, once asked whether everyone present agreed with a decision that had just been made. When they did, he said, “If we are all in agreement on the decision – then I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.” One way to generate disagreement without generating defensiveness is to appoint a designated devil’s advocate. When someone is explicitly assigned that role, it can reduce personal tensions; just be sure to rotate the position frequently.

Foster an environment where it is OK to bring bad news and to dissent. In its after-action reviews following training exercises, the US Army allows the lowliest private to criticize the commander’s performance, and everyone gets better as a result.

In case you’re wondering, I did try to find data that disconfirms the existence of confirmation bias.

The rest of the story: After this post first ran, I received a gracious note from an Iridium spokesperson, who, while not disputing the facts I wrote, informed me that Iridium relaunched its service in 2001 and now has over 413,000 subscribers and 2009 revenues of $318.9 million. In addition, its technology has been a crucial help in many natural disasters. It goes to prove the old saw that sometimes failure is simply an invitation to begin again more intelligently.

As we’ve discussed below, I think that many people grow up with tons of influences that cause them to become confused about the difference between their “core self” and their beliefs. It’s been my experience that when these concepts get enmeshed, evidence that disconfirms our current beliefs can seem like threats to our core selves.

Because of this, I believe that helping people to change beliefs based on evidence (including overcoming confirmation bias) often requires that we first help people untangle the experience of one’s self from the experience of one’s belief system. If we fail to accomplish this step, all the evidence in the world can be wasted on someone who is terrified of being convinced that their beliefs are incomplete or invalid. Important stuff – thanks again for bringing topics like this up, Jack!

Brian, you bring up an interesting point, which has also made me think back to my article on identity. I wonder whether those catastrophizing (I can’t believe I’ve actually typed that word) thoughts stem from the belief that if something is not true, then our identity is somehow different as a result.

Good points all around – and Scott’s quote from The Boxer is icing on the cake! Much of what we believe does turn out to “lie in jest.”

I might add that it can often be quite helpful to identify ideas that we want to be true so badly that it creates anxiety for us to imagine them to be false. Many cognitive psychologists have found this is due to “catastrophizing,” or believing that if something is not true, it will be “the end of the world,” so to speak.

These “end of the world” beliefs can drive us to overlook disconfirming evidence because it can feel like our life as we know it will end if we absorb evidence on the other side of our current belief. Replacing deep-seated thoughts like “if this isn’t true, my life is ruined” with less intense thoughts like, “if this isn’t true, it will be disappointing, but I’ll live,” can be helpful to overcoming this bias.

Thanks again, Jack – more rock solid stuff that I’ll be passing along to others for their consideration.