
One factor in the spread of misinformation is the power of mere repetition.
How often have people heard or read it: Climate change is a hoax. Islamic terrorism is a grave threat to the United States (never mind that, of 230,000 murders since 9/11, only 123 have been perpetrated by Muslims; moreover, Islamic terrorists have killed many orders of magnitude more fellow Muslims than Christians).
Mere repetition makes statements easier to process and remember. The power of familiar, hard-to-erase falsehoods and fake news is appreciated by political manipulators, from those in Orwell’s “1984” to those running today’s presidential campaigns. Unfortunately, rebuttals sometimes backfire because they repeat the myth.

The power of confirmation bias.
In a May 2016 national survey, those favorable to Trump believed Obama was Muslim rather than Christian by a 65% to 13% margin. Those unfavorable to Trump believed the reverse by a mirror-image margin of 64% to 13%.

The power of cognitively available anecdotes.
A brutal crime may make the world seem more violent than it actually is. Do not believe anecdotes. Insist upon data and statistics. Remember, the plural of anecdote is not data.

The power of group polarization. Groups tend to have similar beliefs. Indeed, that is likely why the group formed in the first place. However, do not blame this on the internet. (See the healthy memory blog post, “The Truth About the Internet.”)

To understand why misinformation spreads, as well as how to counter it, Kahneman’s Two Process View of Cognition can be quite helpful. System 1 is named Intuition. System 1 is very fast, employs parallel processing, and appears to be automatic and effortless. Its processes are so fast that they are executed, for the most part, outside conscious awareness. Emotions and feelings are also part of System 1. Islamophobic responses are essentially System 1 responses. Learning is associative and slow. For something to become a System 1 process requires much repetition and practice. Activities such as walking, driving, and conversation are primarily System 1 processes. They occur rapidly and with little apparent effort. We would not have survived if we could not do these types of processing rapidly. But this speed of processing is purchased at a cost: the possibility of errors, biases, and illusions. System 2 is named Reasoning. It is controlled processing that is slow, serial, and effortful. It is also flexible. This is what we commonly think of as conscious thought. One of the roles of System 2 is to monitor System 1 for processing errors, but System 2 is slow and System 1 is fast, so errors do slip through.

Moreover, System 1 is the default mode of processing. So when you read or hear something, the default mode is to believe it. Further repetitions then serve to consolidate this belief. Remember that one of the roles of System 2 is to monitor System 1 for errors. However, to do so requires cognitive effort, thinking. So it pays to examine your beliefs carefully to see whether they are justified and whether they should be discarded or modified.

This post is largely based on an article by David G. Myers titled “Misinformation, Misconceptions, and our Teaching Mission” in the Association for Psychological Science publication “Observer,” September 2017.

“Denying the Grave: Why We Ignore the Facts that Will Save Us” is the third of three books to be reviewed from an article titled “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

The authors of this book are a psychiatrist, Jack Gorman, and his daughter, Sara Gorman, a public health specialist. They probe the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs which are not just demonstrably false, but also potentially deadly, like the conviction that vaccines are hazardous.

The Gormans argue that ways of thinking that now seem self-destructive must at some point have been adaptive. They dedicate many pages to the confirmation bias, which they claim has a physiological component. This research suggests that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. They observe, “It feels good to ‘stick to our guns’ even if we are wrong.”

The Gormans do not just want to catalogue the ways we go wrong; they want to correct them. Providing people with accurate information does not seem to help; people simply discount it. They write that “the challenge that remains is to figure out how to address the tendencies that lead to false scientific belief.”

“The Enigma of Reason” by Hugo Mercier and Dan Sperber is the first of three books to be reviewed from an article titled “That’s What You Think: Why reason and evidence won’t change our minds” by Elizabeth Kolbert in the 27 February 2017 issue of “The New Yorker.”

Ms. Kolbert notes that since research in the nineteen-seventies revealed that we humans can’t think straight and that reasonable-seeming people are often totally irrational, the question remains: How did we come to be this way? “The Enigma of Reason” is the first book to be discussed that attempts to address this question. The argument of Mercier and Sperber is that our biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and also difficult to sustain. They argue that reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; instead it developed to resolve the problems posed by living in cooperative groups.

Mercier and Sperber write, “Reason is an adaptation to the hypersocial niche humans have evolved for themselves.” Habits of mind that seem weird or goofy or just plain dumb from an intellectual point of view prove shrewd when seen from an “interactionist” perspective.

They use confirmation bias to further their argument. This is the tendency we have to embrace information that supports our beliefs and to reject information that contradicts them. “Confirmation bias” is the subject of entire textbooks’ worth of experiments. One of the most famous was conducted at Stanford. Researchers rounded up a group of students who had opposing opinions on capital punishment. Half of these students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

These students were asked to respond to two studies, which the students did not know had been made up. One of these studies was pro and the other was anti, and each presented what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing. The students who’d originally opposed capital punishment did the reverse. At the end of the study, the students were again asked about their views. This time, the students were even more committed to their original views than they had been at the start.

To further their point, Mercier and Sperber suggest what would happen to a mouse that thinks as we do. If such a mouse were bent on confirming its belief that no cats were around, it would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats, it’s a trait that should have been selected against. The fact that we have survived, Mercier and Sperber argue, proves that it must have some adaptive function, and they maintain that that function is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias” to “confirmation bias.” They point out that we humans are not randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. In an experiment illustrating this point by Mercier and some European colleagues, participants were asked to answer a series of simple reasoning problems. Then they were asked to explain their responses, and were given a chance to modify them. Only fifteen percent changed their minds in step two.

In step three, participants were shown the same problems, along with their answer and the answer of another participant who had come to a different conclusion. However, the responses presented to them as someone else’s were actually their own, and vice versa. Only about half the participants realized what was going on. Among the remaining half, people suddenly became much more critical. Almost 60% rejected the responses they’d earlier been satisfied with.

The title of this post is identical to the title of a column written by David Ignatius in the 5 August edition of the Washington Post. Ignatius began his column by asking, “How did Donald Trump win the Republican nomination despite clear evidence that he had misrepresented or falsified key issues throughout his campaign?” Also read or reread the healthy memory blog posts “Donald Trump is Bending Reality to Get Into the American Psyche” and “Trick or Tweet or Both? How Social Media is Messing Up Politics.” Trump makes outrageous statements, contradicts himself, betrays a woeful ignorance about government and international relations, and claims that he is going to fix problems without providing any plans as to how he is going to fix them. Nevertheless, people say that they are going to vote for him. When pressed, they say that they are unhappy with current politics and that the country is going in the wrong direction. To this HM asks: so the bridge is crowded and slow moving; does that mean you are going to jump off the bridge, even though you don’t know that you’ll survive the jump or whether you might be eaten by the crocodiles in the water?

There have been prior posts about the confirmation bias and the backfire effect. The confirmation bias refers to our bias toward believing statements or facts that are in consonance with our existing beliefs. The backfire effect occurs when efforts to correct misinformation actually strengthen belief in the misinformation. Ignatius is referencing an article by Christopher Graves in the February 2015 issue of the Harvard Business Review. Research by Brendan Nyhan and Jason Reifler showed the persistence of the belief that Iraq had weapons of mass destruction in 2005 and 2006, after the United States had publicly admitted that they didn’t exist. They concluded, “The results show that direct factual contradictions can actually strengthen ideologically founded factual belief.”

Graves also examined how attempts to debunk myths can reinforce them, simply by repeating the untruth. This study in the Journal of Consumer Research is titled “How Warnings About False Claims Become Recommendations.” It seems that people remember the assertion and forget whether it’s a lie. The authors wrote, “The more often older adults were told that a given claim was false, the more likely they were to accept it as true after several days have passed.”

Graves noted that when critics challenge false assertions, say, Trump’s claim that thousands of Muslims cheered in New Jersey when the twin towers fell, their refutations can threaten people rather than convince them. And when people feel threatened, they circle their wagons and defend their beliefs. Ego involvement generates large mental efforts to defend erroneous beliefs. Not only does the Big Lie work, but small lies work as well.

Social scientists understand why the buttons that Trump’s campaign pushes are so effective. When the GOP nominee paints a dark picture of a violent, frightening America, he triggers the “fight or flight” response that is hard-wired in our brains. For the body politic, it can produce a kind of panic attack.

So attempts to correct misinformation can backfire and have the opposite effect. So what can be done? Some possible approaches will be found in the next HM post.

Mindware is a term Stanovich uses to refer to specific skills or knowledge that have been acquired through learning.1 You can think of it as software or an application program that the mind has acquired through learning. Scientific reasoning is one kind of mindware. Stanovich writes of “mindware gaps,” which can refer either to a lack of knowledge or to knowledge that is not used. One mindware gap is a failure to consider alternative hypotheses. This failure is quite evident in police work. Consider the case involving the missing Chandra Levy and the Congressman Gary Condit. The police homed in on Gary Condit as the chief suspect in the disappearance and later the death of Chandra Levy. The alternative hypothesis that someone else did it was not actively considered. During the investigation, other women were murdered in the same park by the same individual who was eventually convicted of Chandra Levy’s death. I hope this mindware gap was due to cognitive miserliness rather than to a true knowledge deficit. The need to consider alternative hypotheses should be central to investigative techniques.

Consider the following data regarding medical treatments:

200 people were given the treatment and improved

75 people were given the treatment and did not improve

50 people were not given the treatment and improved

15 people were not given the treatment and did not improve

Do you think the treatment was effective?

Many people think that the treatment was effective, since 200 people given the treatment improved, whereas only 75 people given the treatment did not improve. However, a conclusion regarding the effectiveness of the treatment requires comparison with the control group, in which people were not given the treatment. Of the 65 people in the control group, 77% improved. Of the 275 people in the treatment group, 73% improved. As you can see, there was improvement in both the treatment and control groups, and the improvement rate was actually slightly higher without the treatment. There is no support for the treatment being effective.
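The arithmetic behind this conclusion can be checked in a few lines. This is a minimal sketch, using only the four counts given above, that computes the improvement rate within each group rather than comparing raw counts:

```python
# Counts from the example above
treated_improved = 200
treated_not_improved = 75
untreated_improved = 50
untreated_not_improved = 15

# The relevant comparison is the improvement RATE in each group,
# not the raw number of improvers.
treatment_rate = treated_improved / (treated_improved + treated_not_improved)
control_rate = untreated_improved / (untreated_improved + untreated_not_improved)

print(f"Treatment group: {treatment_rate:.0%} improved")  # 73%
print(f"Control group:   {control_rate:.0%} improved")    # 77%
```

Focusing only on the 200 improvers in the treatment group ignores the base rates: once both groups are expressed as percentages, the control group actually did slightly better.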

1Stanovich, K. E. (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven: Yale University Press.