The most obvious relevance is this: For those of us who don’t believe in it, religion clearly looks like a prime example of rationalization and justification of a mistaken belief. Religious apologetics especially. Since there’s no hard evidence in the world to support the beliefs, the entire exercise — all the explanations and defenses, all the “mysterious ways”es and “this part isn’t meant literally”s and “you just have to take that on faith”s — it all looks from the outside like one gigantic rationalization for a mistaken belief. It looks like a well-oiled mechanism for refusing to accept that you hold a belief — and have based your life and your choices on a belief — that is illogical and unsupported by evidence.

And it looks like a classic example of a social structure built to support one another in maintaining these rationalizations: supporting one another in rejecting alternatives, and repeating the beliefs to one another over and over until they gain the gravitas of authoritative truth.

(This is what I was trying to get at when I called religion a self-referential game of Twister. I dearly wish I’d read this book when I wrote that piece; it would have given me much clearer language to write it in.)

And the more contrary a belief is to reality, the more entrenched this mechanism becomes. The non-literal, science-appreciating, “God is love” believers are usually more ecumenical, better able to think that they don’t know everything and that different beliefs may have some truth and validity. It’s the literalists, the fundamentalists, the ones who deny well-established realities like evolution and the sanity of gay people and the geological age of the planet, who have the seriously entrenched rationalizations for their beliefs… and the powerful institutional structures for deflecting questions and evidence and doubt. (“Those questions come from Satan” is my current favorite.)

So that’s the obvious relevance.

But there’s a less obvious relevance as well. This is an important book for believers… but it’s also an important book for atheists. And not just as a source of ammunition for our debates.

It’s an important book for atheists because of its ideas on how to deal with people who are entrenched in rationalization — and how really, really not to. One of the most important points this book makes is that there are useful ways to point out other people’s rationalizations to them… and some not-so-useful ways. And screaming at someone, “What were you thinking? How could you be so stupid?” is one of the not-so-useful methods. In fact, it usually has the exact undesired effect — it makes people defensive, and drives them deeper into their rationalizations.

Now, many atheists may decide that screaming, “How could you be so stupid?” is still a valid strategy. And in a larger, long-term sense, it may well be. If religion is the emperor’s new clothes, having an increasingly large, increasingly vocal community of people chanting, “Naked! Naked! Naked!” may, in the long run, be quite effective in chipping away at the complicity that religion depends on, and making it widely known that there is an alternative. Especially with younger people, who aren’t yet as entrenched in their beliefs. And it’s already proven effective in inspiring other atheists to come out of the closet.

In one-on-one discussions and debates, though, it’s not going to achieve much. And we need to be aware of that. If we’re going to be all rational and evidence-based, we need to accept the reality of what forms of persuasion do and don’t work.

But it’s not just important for atheists to read this book to learn how to deal with believers’ fallibility. It’s important for atheists to read it to learn how to deal with our own.

Atheists, oddly enough, are human. And we therefore share the human tendency to rationalize and justify our beliefs and behavior. No matter how rational and evidence-based we like to think of ourselves as, we are not immune to this pattern.

And of particular relevance, I think, is one of the book’s main themes: the human tendency to reject any and all ideas coming from people we disagree with. The more entrenched we get in a belief, the more unwilling we are to acknowledge that our opponents have any useful ideas whatsoever, or any valid points to make.

And I’ve definitely seen that play out in the atheosphere. I’ve seen an unfortunate tendency among some atheists to tag all believers as stupid; to reject religion as having nothing even remotely positive or useful to offer; to explain the widespread nature of religious belief by saying things like, “People are sheep.”

I donât exempt myself from this. I think I’ve mostly been good about critiquing ideas rather than people; but I have gotten my back up when I thought someone was being unfair to me, and have refused to acknowledge that maybe I was being unfair as well. And I’ve definitely fallen prey to the error of thinking, “give ’em an inch and they’ll take a mile”; of thinking that any concession at all is the first step to appeasement, and I have to stick to my guns like a mule. A mule with guns.

But this tendency isn’t helpful. The issue of religion and not-religion is already polarizing enough on its own, without us artificially divvying the world into Us and Them.

If I’m right, and religion really is (among other things) an elaborate rationalization for hanging on to a mistaken belief… well, that doesn’t make believers ridiculous and atheists superior. It puts us all in the same human boat. It puts religion in the same category as hanging onto ugly clothes and shoes that gave me blisters, for years, because I didn’t want to admit that I’d made a mistake when I bought them. It puts it in the same category as going through with a disastrous marriage, because I didn’t want to admit I’d made a mistake when I got engaged. It puts religion into a particular category of human fallibility… a fallibility that we all fall prey to, every day of our lives.

I’m not saying religion is okay. Let me be very clear about that. I think religion is a mistake; I think it’s a harmful mistake; and I’m not going to stop speaking out against it. And I’m not asking anyone else to stop speaking out against it.

But for my own peace of mind, I’m making a sort of New Year’s Resolution about cognitive dissonance. I’m resolving to be better about acknowledging when I make mistakes, and correcting them. I’m resolving to be better about acknowledging when people I disagree with make good points. And when I’m in one-on-one debates with people, I’m resolving to think, not just about why I’m right and they’re wrong, but about what kind of argument is likely to persuade them.



23 thoughts on “Defensiveness, Rationalization, Mulishness… What Does That Have To Do With Religion? Mistakes Were Made, Part 2”

So what did the book change your mind about?
“I’m resolving to be better about acknowledging when I make mistakes, and correcting them. I’m resolving to be better about acknowledging when people I disagree with make good points. And when I’m in one-on-one debates with people, I’m resolving to think, not just about why I’m right and they’re wrong, but about what kind of argument is likely to persuade them.”
I think that I try and do those things as much as I can anyway, but I haven’t read the book. (Not that I’m not tempted to.) I guess what I mean is, didn’t you think that those things were good ideas before you had read the book? I would guess your answer is probably along similar lines to eating or drinking unhealthily. You already know it is bad, but you need someone to point out just HOW bad before you actually change your mind about doing it.

I don’t know if the book you read got into it, but the issue of relativistic belief systems, individual reality tunnels, and quantum psychology have a lot to do with this topic.
Most atheists I have encountered presume that there is a single, fixed, ‘etic’ (external) reality, often fully reducible to entirely materialistic properties. This can never be proven, of course; since each of us filters reality through our individual neurolinguistic grid, we all experience it in unique ways.
While it is fine to talk about the scientific method and how it is the greatest tool we have for discovering what we can best, collectively, share as knowledge about the Universe, it does not, in and of itself, define the limits of that reality.
At a certain point you have to acknowledge that people who have an experience, unique to themselves, have a perfectly rational and logical reason to modify their beliefs to match their experience.
The simplest example of this kind of thing would be a UFO experience. Now, personally I have never seen a UFO and, to my knowledge, UFOs have never been reproducible on demand under laboratory conditions. This would suggest then that UFOs ‘are not real’.
However, if I were to have witnessed a UFO (not some distant blurry blob but let us say a structured craft, a flying saucer if you will, and it had not only been witnessed visually but had been touched, felt, and left a residual physical effect on the environment), then I would be crazy, illogical, and irrational not to believe in them.
Given this specific example we have two people; one who thinks it would be an irrational and illogical leap to believe in UFOs and another who would feel, justifiably so, just the opposite.
Once we fully embrace the implications of quantum psychology and try, more often, to employ E-Prime in our speech, I believe we will take a major step forward in how we communicate as a species.
John
(References to Quantum Psychology, etc. come from the works of Robert Anton Wilson.)

“So what did the book change your mind about?”
Good question. Here’s what it changed my mind about:
a) It pointed out a number of specific ways that this process works that I wasn’t aware of before — thus, I hope, making me better able to recognize it.
b) It pointed out a number of specific strategies for dealing with the process that I hadn’t known about — thus, I hope, making me better able to cope with it.
And it also pointed out the degree to which this process is usually completely unconscious. That *was* news to me. I’d tended to think of rationalizations as the more conscious, deliberate variety. Knowing just how powerfully resistant the process is to consciousness and introspection is making me more vigilant about trying to recognize the signs of it. And maybe more importantly, it’s making me less critical, and more empathetic, when other people do it.
To some extent, it is, as you suggested, a matter of learning just how harmful this problem is, and just how difficult it is to address. But it’s also got many specific, pragmatic pointers on dealing with it that I hadn’t had before.

John:
I don’t deny that a person can have such an experience, what I deny is that their interpretation of said experience is correct. If something is entirely subjective and unverifiable, then it is useless. That person may wholeheartedly believe in it, but it is still useless.
Also, particularly relevant: if said claim defies what we know about the universe (I’m not talking about stuff that is remotely possible, I’m talking about claims that are simply ridiculous), then we have all the reason in the world to dismiss it as rubbish.
Thus the rub to your little tirade: if it is entirely subjective, and unverifiable, then skeptics are quite justified in dismissing it as nonsense. To do otherwise is to fall prey to all sorts of woo and bullshit.

“A mule with guns” … and coffee on the keyboard.
So without giving away the whole book, what is an example of the sort of strategy you’re referring to for dealing with rationalization in others?
I believe I have gotten pretty good at spotting at least some of my own rationalizations (though I’m not at all averse to having additional strategies). However, by and large my success at helping other people recognize their own rationalization has been pretty hit and miss. I’m not talking about deconversion – just getting someone to see that their premises aren’t necessarily givens – in any number of situations. Sometimes I have been very effective, sometimes quite ineffective.
So I intend to pick up the book soon (next time I’m near a bookshop if possible), but for right now I’m curious to understand at least something of what you’re talking about.

Another great post! I must read that book.
Explaining religion as a defense strategy against cognitive dissonance when beliefs don’t match reality is a useful way to go, but only so far. It can explain a lot, but like all theories of religion, it cannot explain everything. It is too complex a phenomenon for that.
I haven’t read it, but a book from several decades ago by a chap named Festinger (sp?) addressed cognitive dissonance when apocalyptic predictions proved unreliable.
One guy on my thesis examination committee, Robert Carroll (now deceased) took up this idea and used it to explain religious change that resulted in rewriting early versions of what are now biblical prophetic books: The earlier “prophecies” did not come true, so the texts were edited, supplemented and reinterpreted to preserve their “reliability” and to project fulfillment onto the future.
I think what Greta Christina is pointing to is a frequent strategy reinforcing belief, but also reinforcing communities when they are threatened by “outside” ideas. I’m not sure it is an explanation for religion per se. Religion seems to me to be more of a symbolic projection of community identity and values, one that is constantly evolving and changing while affirming its own timelessness. There are many processes of group boundary definition and maintenance. The bigger the perceived threat, the higher the walls surrounding the group get, and the more desperate the need to maintain solidarity even in the face of “reality”.
Anyway, Greta Christina, well done, with a thoughtful post to start a cold Canadian morning, and now for more coffee.
Cheers,
Jim

Greta,
I’m surprised at your response. The theme of this post, I thought, was about facilitating communication between people who hold differing belief systems.
I feel it is important to acknowledge the simple fact that each individual human being forms a belief of reality in a distinctly unique way.
As wonderful as the scientific method is, it does not sustain every belief that an individual holds. Their beliefs are always going to be formed from their personal experience first and the consensus view second.
Simply because someone has an experience that cannot be reproduced on demand under laboratory conditions does not make it any less ‘real’ to that individual.
I believe it is important to acknowledge this and grant that each individual is going to interpret reality in their own unique, and quite relativistic way.
Given your previous statement it seems to me that you believe strongly that reality is confined to what the scientific method can, and has, revealed about it. I might remind you that reality is under no specific restrictions to abide by these rules.
Last I checked, given our best understanding of modern physics, there is still no such thing as a thing (see “The Matter Myth” by Gribbin and Davies).
Who collapses the quantum wave function, or, is that an illusion in and of itself?
John

Very interesting commentary. This is why I so appreciated Sam Harris’s talk that he gave to the Atheist Alliance. When I made my break with religion I found it very hard to embrace atheism, because to me it seemed like another mule (your great metaphor) of a different color.

John, that was my response. I’m well aware that people form their subjective interpretation of reality from what they experience, but, like I said, a subjective experience (a misfiring of neurons, irrational interpretation of an experience, whatever), that is not verifiable is likely not real. Not to say that it *cannot* be real, but, it likely is not. There’s a reason why there are skeptics, and I feel that skepticism is a necessary thing.
Hard-core skeptic, here.

Based on your recommendation, I picked up this book last night. Reading it, I was struck by how much their theory of cognitive dissonance and rationalization applies to political discourse on the web. In particular, I’ve noticed that people will trash someone on the “other side” for certain actions/behavior, while ignoring or giving a free pass to someone on their own side that does the same thing. Conversely, when someone attacks someone on the other side for a certain action/behavior, a defender from the other side will invariably answer with “how come you weren’t saying the same thing when your guy did it.” The attacker will either ignore this, or will come up with some reason why what their guy did wasn’t so bad/wasn’t the same thing, and the defender will often not actually defend the behavior of their own guy.
Of course, I’m not saying this always happens: there are some people who hold the people on their side to the same standards, and who don’t brush off attacks with counter-attacks or weak justifications. But I do see this a lot, and it very neatly follows the patterns laid out in the book.

David: I totally agree with you. And it’s not just discourse on the Web. It’s discourse, period. Heck, it’s life, period. I have definitely found myself cutting people a lot more slack if I like them, and cutting people no slack at all if I don’t. When I’m ragging on someone I don’t like for doing something that annoys me, I often have to stop and ask myself, “Would this have bothered me if anyone else had done it?”

“So without giving away the whole book, what is an example of a sort of strategy are you referring to for dealing with rationalization in others?”
Well, the authors explain this better than I can, and in more depth. But to give an example… Well, let me just quote. Here, they’re talking about what to do — and what not to do — if you have a relative who’s fallen victim to a con artist and is rationalizing themselves into believing that it’s not a con:
“Therefore, says Pratkanis, before a victim of a scam will inch back from the precipice, he or she needs to feel respected and supported. Helpful relatives can encourage the person to talk about his or her values and how those values influenced what happened, while they listen uncritically. Instead of irritably asking ‘How could you possibly have believed that creep?’ you say ‘Tell me what appealed to you about the guy that made you believe him.’ Con artists take advantage of people’s best qualities — their kindness, politeness, and their desire to honor their commitments, reciprocate a gift, or help a friend. Praising the victim for these worthy values, says Pratkanis, even if they got the person into hot water in this particular situation, will offset feelings of insecurity and incompetence.”
They also talk a lot about teaching children that it’s okay to make mistakes… and teaching them that making mistakes doesn’t reflect on their character. They have a whole section on how in America, we tend to think — and to teach our children — that intelligence and ability are natural, inherent character traits. So when we make mistakes, we tend to take it personally, to see it as reflecting on our innermost character. Other cultures see intelligence and ability more as something you acquire through hard work… so they’re more likely to see mistakes as part of that hard work, instead of a personal failing. So we can encourage each other (and ourselves) to see mistakes in this more positive light.

Strongly agree with your idea that atheists should be skeptical of their own beliefs and behaviours. I am a fan of Karl Popper’s idea that hypotheses can never be proven but only falsified. We atheists have the hypothesis that no god exists. Therefore we should take easy opportunities to look for evidence for god(s), trying to falsify our hypothesis, and this requires discussion with theists being taken seriously. Of course, falsification will fail. You might consider this a waste of time, but I think it will pay off in the form of better-founded reasoning and a better ability to draw the undecided to our side.

Thanks, Greta.
Good example, because I see right away how that connects directly to reasons for someone rationalizing in the first place.
I guess if you understand what feelings led to the need to hold to a given rationalization, you can support the feelings independently of the rationalization, making it easier to examine and hence perhaps let go of the rationalization.

Great book — I read it about a month ago. The thing that made the most impression on me was how we become invested in our choices. How we can rationally compare several, quite similar items — but once we choose one we start recasting it in our minds as the BEST choice, oh so much better than those other inferior choices.
Books like this challenge you to be on guard against your own “instinctive” reactions. Another good one is “Don’t Believe Everything You Think.”

Incidentally, a lot of these ideas also appear in “Stumbling on Happiness” by Daniel Gilbert. His book is more about how we’re really bad at predicting the future because of the irrational mistakes we make when remembering, perceiving, or predicting… but it’s a lot of these same topics. If you haven’t read it, I’d recommend it.

Hmm, yes, you’ve talked me into going out and getting a copy of this too.
Cognitive dissonance is so insidious, and it’s easy to take a holier-than-thou approach as many atheists do (should that be rationaller-than-thou?).
I think admitting your own mistakes and confusions and weaknesses can be a really powerful way of getting others to look at the extent to which they do the same — often much more effective than shouting at them.
(I particularly like the World Question Centre’s collection of people talking about how they’ve changed their minds- http://www.edge.org/q2008/q08_index.html)
Anyway, thanks – must add you to my blogroll as there’s lots of interesting stuff here!

Greta,
Have you read Robert Cialdini’s “Influence — Science and Practice”? That was one of the life-changing books for me, the first that comes to mind when I think of “books everyone should read”. The full text is available online here. The book you described reminded me very powerfully of the chapter “Commitment and Consistency”. Actually, the reason I’m hesitant to buy “Mistakes Were Made” is because in your whole review I haven’t found something that they say and Cialdini doesn’t… while he actually suggests one technique for fighting rationalization that I find very useful. It goes like this: try earnestly to distance yourself from the decision — ask yourself the question “If I were put back in time, would I honestly do the same thing again?” and then listen carefully to your first response, the one that comes from the heart of hearts.
It’s not a perfect idea, but a damn good one…
Concerning how to speak to someone who’s been rationalizing something for a long time, I think you’ll enjoy Steven Hassan’s book “Releasing the Bonds”, about helping people who were brainwashed by destructive cults. Excerpts are available on his site, http://www.freedomofmind.com — a great site about cults and fighting cult mind control.


The Orbit is a diverse collective of atheist and nonreligious bloggers committed to social justice, within and outside the secular community. For more information, please see our About Us page.

All content is copyright the authors except where otherwise noted. Contact the authors individually for further information.