The Introspection Illusion

We’re great at spotting biases in others, but remarkably bad at finding them in ourselves. Even when we know exactly what to look for and bring genuine intellectual humility to the task, noticing the effects of our own biases on our own thoughts is like looking for colored glasses while wearing them.

The introspection illusion is a cognitive bias in which we wrongly believe we have direct insight into the origins of our own mental states, while judging other people’s introspection as unreliable. It’s a kind of backwards justification: we have a feeling about something, and then rationalize why that feeling must be justified. We mistake our intuitive, irrational impressions for conclusions we reached through deep, careful thought.

When we feel like we’ve got a strong, rational argument for our own position, any differing opinion appears obviously ill-reasoned, or even sinister.(1) For example, you hear a claim from a field outside your own area of study that sounds ludicrous to you. At this point, you don’t realize that you’re building a strawman out of your own misunderstanding of the claim. Stripped of context, it appears unsupported and simply crazy. It doesn’t feel like a hasty generalization to assume that a whole field, if it believes something this crazy, must be fundamentally flawed. Clearly, they have not thought this through. We, on the other hand, have.

Here, the introspection illusion took hold. It started with a gut reaction, and as the brain built its own support for the conclusion it had already reached, the feeling snowballed into a rationalization. Our own thoughts seem clear and justified, so everyone else must simply be crazy. Perhaps we should pity them, because they are victims of a terribly devious indoctrination into this totally bogus field of Gend– *ahem* I mean, this totally bogus field in this purely hypothetical example.

Recent atheists’ missteps aside, the introspection illusion seems to be a key culprit behind a number of problems that have plagued the skeptics’ movement for years, as well as a reason some students reject any hint of critical thinking in the classroom out of hand. Consider the poorly constructed arguments against evolution that are repeated even today, like dogs turning into cats or the continued existence of monkeys. These topics carry an implication that some people don’t like. The negative feeling towards that implication immediately transforms into a backwards justification aimed at a massive strawman, and the introspection illusion amplifies the sense that we’re right and others are wrong until we can’t even begin to pick apart the tangle our minds are in. The worst part is that, despite our belief otherwise, we can’t actually find the starting point of all this. Whatever started the snowball is buried so deep in the middle of it that we have no way to see it.

(1) Assuming that others know they are wrong and promote false information out of evil intent sounds a lot like what religious extremists, presuppositionalists, anti-vax advocates, and conspiracy theorists say, doesn’t it?