Cognitive Dissonance

Have you ever wondered why people get angry when they’re wrong? Not just feel uncomfortable or awkward, but pissed?

Last week, a customer came up to my makeup counter and demanded we honor a coupon that promised a free mini mascara. After I explained that the coupon was expired and offered her a mini lip gloss instead, she threw the lip gloss back at me (a really poor choice of ammunition, considering it’s barely the size of my pinky finger), told me it was a “gross pink”, and added that we are “a crappy ghetto store anyway”.

What this lady experienced was a form of cognitive dissonance. According to the American Psychological Association, cognitive dissonance is the mental discomfort caused by holding contradictory beliefs, or by a clash between what a person believes and what turns out to be true: in other words, it’s the awkwardness, the collar-loosening, and the throat-clearing silence (or the lip-gloss-chucking anger) we experience when we’re wrong.

There are a few different categories within the broad heading of “cognitive dissonance”, and what my customer experienced falls under the “effort-justification paradigm”: when a person can’t get what they want, they decide they never wanted it anyway. Aesop’s fable of “The Fox and the Grapes” is the most famous example: the fox really wants some grapes, but when he can’t reach them, he decides they were probably sour anyway.

There are three other main paradigms within the theory of cognitive dissonance: the belief disconfirmation paradigm, the induced-compliance paradigm, and the free choice paradigm. Without going into overly scientific detail, here is a basic description of each:

The “belief disconfirmation paradigm” is probably the most familiar to us: it’s what happens when a person is confronted with proof their previous beliefs are wrong. This is most commonly seen during family gatherings with your racist uncle, who cannot handle statistical proof that minorities are not evil and will simply ignore the facts.

A famous experiment by Leon Festinger and James Carlsmith in 1959 showed that, in the absence of an external motivator (a positive one like money, or a negative one like the threat of punishment), people needed to internally justify actions they were required to perform that they wouldn’t normally do. This type of justification (knowing that something is actually X, but being required by your job to tell people it’s really Y) is called the “induced-compliance paradigm”.

The “free choice paradigm” has to do with justifying our own choices. In a classic study of this effect, psychologist Jack Brehm found that when participants had to choose between two products to take home, they consistently rated the chosen product higher afterward than they had before the choice was presented. This phenomenon helps explain things like brand loyalty, or always taking the same roads home, regardless of how good the brand actually is or whether there’s a faster route.

Naturally, one of the most interesting parts of this theory is how people respond to situations that create cognitive dissonance. From what we’ve seen so far, cognitive dissonance is uncomfortable. It makes us feel dumb and insults our ego. So how do we handle it?

There are essentially two ways: we run away, or we fix it.

Running away is probably the most common reaction, especially with a wealth of slanted sources at our fingertips. The “running away” phenomenon can be seen in those who choose to listen only to their preferred bias: conservatives who watch nothing but FOX News, or liberals who watch nothing but MSNBC, hurling insults at the other side without knowing what’s really over there.

Sometimes we “run away” without even realizing it. Search Google for “was the Iraq war a failure or a victory?”, and the results you receive will be shaped by your search history, the news sites you frequent most, and other data the company has collected about you. Google personalizes results to fit what you want to hear, and it’s certainly not the only company using cookies to track what it should show you to keep you happiest.

You can choose to deliberately run away, as well: you can drop the friends who don’t share your worldview, you can choose a university that caters only to your political affiliation, and you can isolate yourself from anything that challenges the way you see things.

According to social psychologist Leon Festinger, this type of “running away” is natural: our evolutionary goal is to pick the environment that hurts us the least. And while this, of course, includes things like a warm and sunny living space, easy access to food, and plenty of opportunity for socialization, we are also wired to seek out the most mentally comfortable environments we can find. By running away, we surround ourselves with more and more “evidence” that we’re right, and keep ourselves away from anyone who would challenge it.

There is a slightly less cowardly (and, one could debate, more evolutionarily advantageous) option as well: research. Some people will naturally respond with a desire to know more about a certain topic, according to Festinger. If, for instance, your political debate with your uncle gets heated and he gets out his phone to do some Googling—out of anger or out of curiosity—that would be a great example of Festinger’s theory of a research response to cognitive dissonance.

The big one, of course, is the practice of actually changing your mind when you come across new information: “fixing” cognitive dissonance. If you hear something that gives you cognitive dissonance, that familiar hum of awkwardness and uncertainty, you can embrace it, because it’s an opportunity to learn something new about the world. You can fact-check the arguments presented and use some of the best tools humans have (logic, reasoning, and a capacity for understanding truth) to determine which view best aligns with reality. You can use the plethora of information at our fingertips not to find what you want to hear, but to find accurate, unbiased fact.

So, as you can probably tell, I personally favor one of these responses over the others. But I’ll leave it up to you to decide: when you present someone with new knowledge that proves their worldview wrong, whom do you respect most? The one who runs away? The one who immediately whips out his phone to find statistics from a biased source to “prove you wrong”? Or the one who calmly works through the evidence and arrives at a (new or old) worldview that takes all the facts into account?