Confirmation Bias: Workplace Consequences Part I

We continue our exploration of confirmation bias, paying special attention to its consequences in the workplace. In this part, we explore its effects on our thinking.

Global temperature departures from the 1951-1980 mean. The plot shows four data sets, from NASA, NOAA, the Climatic Research Unit (CRU) of the University of East Anglia, and the Japan Meteorological Agency. All are very similar, and although there are significant overlaps among the data sources of these four agencies, each applies its own analytical methods to that data. The CRU is the focus of what is often called "Climategate" — an allegation that climate researchers there (and presumably elsewhere) used their scientific skill to conspire to produce a conclusion that matches their preconceptions, abusing their positions and violating their scientific integrity. Advocates of the Climategate position allege, among other things, that confirmation bias is at work in the scientific community and that global warming, if it exists at all, is not due to human activity. Scientists, including recently a scientist who had joined the global warming deniers, have repeatedly debunked these allegations.

Interestingly, it is the advocates of Climategate who may provide the clearest examples of confirmation bias. One example supporting this assertion relates to what is called below "Hear no evil, see no evil." Let us concede that, across any given field of scientific research, scientific integrity is rarely upheld perfectly, and that in any area of scientific research there are instances of scientific fraud. Two critical questions then become: "What is the incidence of scientific fraud, by field of research?" and "What is the expected incidence of scientific fraud in climatology?" Finding a single instance of scientific fraud does not in itself invalidate all research in a given field, nor does it invalidate the research done by others in that field, nor does it prove that the research performed in that field is any less credible than the research performed in other fields. Any conclusion as to the credibility of climatological research must rest on some estimate of the integrity of the field itself relative to others. But Climategate advocates have done no such studies of which I am aware, or, at least, they have not publicized this element of the argument. For them it appears to be sufficient to call into question one group's work without establishing any kind of context for their subsequent conclusions. This could be explained as an unwillingness to test their conclusions, which is a hallmark of confirmation bias. Illustration courtesy The New Republic.

Confirmation bias is a cognitive bias that causes us to seek confirmation of our preconceptions, while avoiding information that might contradict them. It also inclines us to overvalue information that supports our preconceptions, and to undervalue information that conflicts with them.

Last time, we explored indicators that confirmation bias might be taking place. Let's now explore how confirmation bias affects our thinking. Here are five ways people use confirmation bias — often outside their awareness — to reinforce their preconceptions.

Hear no evil, see no evil

We use many techniques for avoiding information that conflicts with preconceptions. In meetings, we deprive people of the floor if we regard their positions as threats to our preconceptions, or we're inattentive when they speak, or we place their agenda items last, hoping to run out of time. We ignore what they write and we distract others from paying attention to their contributions. For conferences, we use peer review to exclude them from programs altogether; we assign them to small, undesirable, or out-of-the-way venues; we schedule them for undesirable time slots; or we schedule them opposite events that we expect to be heavily attended.

Consider carefully the spectrum of information sources to which you do pay attention. When others make choices for you (as happens in conferences), think about what has been excluded from the program.

False memories

Most research about false memories relates to the recovered-memory techniques in common use a decade ago. But false memories also play a role in the workplace. When we recollect what someone said or wrote, we risk confirmation bias.

Take extra care when retrieving memories that support positions you hold — memory tends to provide data that supports what we believe, and it tends not to provide data that conflicts with what we believe.

False reasoning

When reasoning is subtle, we sometimes make mistakes. But when reasoning isn't especially challenging, we can still make errors, especially when we hurry or we're under stress.

Because of confirmation bias, these errors tend to favor our preconceptions. Examine carefully any reasoned argument that supports preconceptions.

False accusations

Accusations subjected to testing against evidence are more likely to be valid. By limiting such testing, confirmation bias tends to increase the likelihood that accusations are false.

Accusations need not be verbalized. Even when they're merely thoughts, they affect how we interact with the accused. Such accusations are especially prone to confirmation bias, because they are rarely subject to confrontation with evidence.

Tiptoeing around the elephants

The "elephant in the room" often takes the form of information that contradicts what we hope to be true — information that contradicts our preconceptions. Confirmation bias helps us defend against these contradictions.

When you sense the presence of an elephant in the room, check for contradictions of the group's hopes, dreams, and preconceptions.


Related articles

Diversity of perspectives is one of the great strengths of teams. Ideas contend and through contending
they improve each other. In this process, criticism of ideas sometimes gets personal. How can we critique
ideas safely, without hurting each other, while keeping focused on the work?

Groups sometimes make mistakes based on faulty reasoning used in their debates. One source of faulty
reasoning is the ad hominem attack. Here are some insights that help groups recognize and avoid this
class of errors.

Toxic conflict in virtual teams is especially difficult to address, because we bring to it assumptions
about causes and remedies that we've acquired in our experience in co-located teams. In this Part II
of our exploration we examine how minimizing authority tends to convert ordinary creative conflict into
a toxic form.

Preventing toxic conflict is a whole lot better than trying to untangle it once it starts. But to prevent
toxic conflict, we must understand some basics of conflict, and why untangling toxic conflict can be
so difficult.

Forthcoming issues of Point Lookout

Coming March 21: Narcissistic Behavior at Work: III

People who behave narcissistically tend to regard themselves as special. They systematically place their own interests and welfare ahead of anyone or anything else. In this part of the series we consider how this claimed specialness affects the organization and its people. Available here and by RSS on March 21.

And on March 28: Narcissistic Behavior at Work: IV

Narcissistic behavior at work is more damaging than rudeness or egotism. It leads to faulty decisions that compromise organizational missions. In this part of the series we examine the effects of constant demands for attention and admiration. Available here and by RSS on March 28.


