Because I Said So! Why it pays to think about how we think.

Recently, while reading a highly controversial article, I was struck by people’s impulse to believe and rally behind things simply because someone tells them to, or because someone says “it’s a scientific fact.” This made me think about all the ways I interact with information, and how easy it is for me to find supporting evidence for almost anything I wish to believe is true.

Wanting to understand how I come to judge information as true or false, I decided to go on a hunt for enlightenment into how our minds reach the conclusions that eventually become our beliefs. What I found was a plethora of information on a concept called confirmation bias.

Confirmation bias (or myside bias) is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities (Wikipedia). Basically, we tend to pay attention to a selective set of information that supports what we already believe while ignoring or rejecting evidence that points to a different conclusion, sometimes dismissing conflicting alternatives entirely. Oh, you mean the way scientists create a hypothesis and then set out to find all of the information they can to support it? Hmmm…

One explanation for confirmation bias is wishful thinking combined with our limited human capacity to process information (basically, the “you don’t know what you don’t know” syndrome). Another is that people weigh the “cost” of being wrong rather than investigating in a neutral, scientific way. Some studies describe myside (confirmation) bias as the absence of “active open-mindedness”: the deliberate search for reasons why an initial idea may be wrong.

Here are some well-documented examples of how we habitually use confirmation bias to form our beliefs:

Confirmation bias has led investors to be overly confident, ignoring evidence that their (confirmed) strategies will lose money.

People suffering from phobias and hypochondria have also been shown to use confirmation bias to verify threatening information.

Many times in the history of science, researchers have resisted new discoveries by selectively interpreting or ignoring unfavorable data.

In social experiments, when people were given feedback that conflicted with their established self-image, they were less likely to pay attention to it, or to remember it, than when given self-verifying feedback.

So, what to do with this conundrum? Well, my “belief” is that in order to be the best and most well-rounded me I can be, I must let go of the concept that I can unequivocally or objectively know much of anything with any certainty. Rather, since my opinions and beliefs are much more a result of what I pay attention to, perhaps it’s time to start looking at that which hides just beyond, beneath, and behind my preconceived notions. After all, isn't that where possibility lives?

Will this take work? Yes. It also requires an open mind and the willingness to sometimes be proven wrong. But, if we look at it as a fun science experiment, then we can face this challenge knowing that time is on our side. After all, look at how many scientific “truths” have been born, evolved, and re-birthed throughout history. Disproving one toward the benefit of another is not a step in the wrong direction. Not at all. It’s called progress.