Psychology & Neuroscience Stack Exchange is a question and answer site for practitioners, researchers, and students in cognitive science, psychology, neuroscience, and psychiatry.

$\begingroup$Thanks, I've saved it on my Amazon wishlist. Have you read Social Intelligence by Daniel Goleman?$\endgroup$
– Revious Mar 12 '14 at 18:47


$\begingroup$If you're looking for a layman's understanding of this phenomenon, try reading The Believing Brain by Michael Shermer. I found that reading it actually helped me formulate better arguments.$\endgroup$
– Baldrick Mar 13 '14 at 1:31


$\begingroup$Try thinking about the inverse of this question: isn't it amazing that we, as a species of apes, can (sometimes, with effort, on a good day) use abilities evolved through hunting, foraging, fighting, feeding, etc. to perform logic and rational argument? And sometimes, we humans can even follow rational logic when it risks conflict with powerful group leaders, goes against our individual interests, or costs valuable energy re-assessing assumptions that have worked fine until now?$\endgroup$
– user56reinstatemonica8 Mar 13 '14 at 10:00

5 Answers

You may be thinking of the Backfire Effect. When presented with logical and rational evidence disputing a strongly-held belief, most people's natural tendency is to hold on even tighter to those beliefs rather than to reassess their position.

As for why it happens... that's a matter of some debate (surprise, surprise), but the general thinking seems to be that it's a mental reaction to avoid cognitive dissonance. The page I linked contains references to several studies on the subject, though as you might expect, the results vary.

$\begingroup$I haven't had time to thoroughly digest all the answers, but yours really resonates with me, not to mention the fact that it's so concise. I'm a political activist who has long tried to figure out what makes people "sheeple." I have personally experienced the pain that comes with adjusting to reality after growing up in a world dominated by Walt Disney movies and government/media propaganda. However, I wasn't aware of the term "backfire effect," which I'm going to check out right now.$\endgroup$
– David Blomstrom Jul 28 '17 at 1:47

I'm not a cognitive science expert, but I happen to have some experience in change management / trying to convince people. There are in fact a lot of logical reasons for refusing logical and rational arguments.

Suppose someone tries to share an idea with you.

I will pass over the very obvious problems of an idea "not serving my interests" or "triggering guilt".

A VERY common problem is the fallacy that "if I am wrong, then my opponent is right." If you hate your opponent (or whoever the perhaps impolite person is who is trying to tell you to change your ideas, e.g. your "bad way of living", or who arrogantly sees you as a miseducated inferior), then you'd rather die than accept the idea, which would $\rightarrow$ (fallacy) prove him right and thus $\rightarrow$ (fallacy) superior to you.

Another common problem (colleagues!) is "I must show I'm the most intelligent." If someone tries to challenge one of your ideas, then it $\rightarrow$ (fallacy) makes him more intelligent than you, since $\rightarrow$ (fallacy) you were wrong. You will then try to find whatever counter-argument you can to prove him wrong, $\rightarrow$ (fallacy) thus reasserting yourself as the more intelligent one.

Another common but more subtle problem is that we all unconsciously build a representation of the world so that we can think about how to adapt to it (evolution). And of course we stabilize this representation when it fits correctly $=$ when we are in no immediate danger, and also when we are in a comfortable position ;) Example: "The rules of society are that people who adhere to the dominant opinion are more likely to succeed" (adaptation to environment) $+$ "People who don't share this dominant opinion are fascists/integrists/extremists, etc., who are a minority anyway" (rationalization of why I adhere to it, advantageous position). Now suppose that an idea challenges your representation of the world. Then it implies $\rightarrow$ that you are no longer in a superior position, and $\rightarrow$ that the rules of the world may not be what you thought, and that $\rightarrow$ you may be bad at understanding the world, and thus in a weak position for planning corrective actions ($=$ adapting). Such a calling into question is incredibly painful! To move on from this situation, one needs to experience full grief (as in the "Five Stages of Grief").

I speak from experience. I once had the mindset "the rules of the world follow religion X" $+$ "those who don't follow religion X are pagans/heretics/whatever bad people" (but it's the same even outside religion). The challenge came when I realized, after an event, that $1)$ I was not that good, and $2)$ the "pagans/heretics" were actually better than me on this subject. This triggered a very painful five-stage grief process lasting more than 6 months, in which I suffered a lot before re-establishing a new, more correct vision of the world that fit those new pieces. I still needed a few years to fully unwrap things. From my experience, it seems I was lucky to be able to do that, since it's not a given (cf. all the possible failures of the five stages). And looking back, it was not the fault of religion X at all; I had built this simplistic and bad representation myself to avoid facing some unpleasant truths.

Another common problem is the incorrect mixing of opinions and culture, which fallaciously leads to: idea challenged $\rightarrow$ (fallacy) culture challenged $\rightarrow$ (fallacy) our identity challenged $\rightarrow$ (fallacy) our ancestors' work and worth challenged. Who would want that?

What you should understand is that the mind needs some security and self-confidence. The mechanisms I described are mostly subconscious, and argument rejection results mostly from the mind defending itself, though we still have partial control over it. (I'm sure you can think of examples of people who honestly face the truth and others who never want to face it.)

If you encounter people who have realized all those points, discussion is easy even if you are not polite or your ideas are not well refined. They will look at the good points, accept the ideas, and even help you refine them. This is how everyone should be!

For other people, I've found that challenging ideas is better done when:

- you are polite and non-threatening;

- you first outline the strengths and qualities of the people you are speaking to, their culture, and their ancestors;

- you clearly separate the challenged ideas from the culture, and show similar changes of ideas by good/respected people within their own culture;

- as much as possible, you make the new idea look like it comes from themselves.

Oh, and one last thing: humility. Everyone trying to spread an idea obviously thinks of it as "logical and rational", so if it's not accepted $\rightarrow$ (wrong!) it means people are not rational $\rightarrow$ (wrong!) people need to be re-educated $\rightarrow$ (wrong!) there are some horrible opponents fighting my ideas and I must suppress them. Such thoughts have led, and are still leading, to atrocities. Maybe it's just you who is wrong, illogical, and irrational ;)

$\begingroup$4. reminds me of Feynman's "What I cannot create, I do not understand." I wonder if there is a vast misunderstanding on this topic, in the form of a conflation between "make the other person think that they discovered the idea and not you" and "help the other person generate the idea from their own belief system". Thoughts? It's fascinating because, from a Christian perspective, this comes out in the difference between doing what God says because he said so, and doing what God says because he is describing what will lead to truly good futures. I find the latter has more normative power.$\endgroup$
– labreuer Mar 26 '14 at 23:51

Because Enlightenment-era rationality, with its values of liberalism and progress, has a very bad (read: hypocritical) historical track record of irrational, inegalitarian, maximally exploitative colonialism. Phrenology and Orientalism are among its crown jewels. The first chapter of Immanuel Wallerstein's World-Systems Analysis: An Introduction, dedicated entirely to post-Enlightenment knowledge-structures, is a very good read on this.

Because all too often in relationships, "rationality" is a fantastic card to play for manipulative, abusive boyfriends/girlfriends everywhere. Captain Awkward is a fantastic blog for this, search its entries for "rational" sometime.

Because rationality is often a function of education, and education is often a function of wealth.

Because the true rationalist and the true narcissist both claim to be completely rational. Rationality is a very widely held virtue, for better and for worse.

Because the Newtonian/Euclidean/Cartesian foundations of our understanding of the world are in serious question right now, in this era of quantum mechanics and chaos theory. What if the universe itself were irrational? Nonrational? Extrarational?

People aren't stupid. An intuitive distrust of the institutions associated with "rationality" isn't necessarily irrational.

I am not an expert, but I have given this question a fair bit of thought. I would suggest that the tendency to oppose rational arguments in favor of maintaining beliefs depends upon what beliefs we are speaking of. There are many beliefs that do change, but these tend to revolve around mere fact.

When you look at deeper belief, much of it is based upon self image. People often decide what type of person they are (or want to be) and then adopt the beliefs they think that type of person is supposed to have, complete with the standard rationales.

So, when you challenge their belief, you are not just challenging some rational thought, but rather challenging how their world works and who they are. When that happens, there is considerable cognitive dissonance, and people start doing mental gymnastics to find a way out of the situation.

Words are vague. People can call something investment even if no money is invested.

It's much more effective to read people's motives than to read their words.

Humans are self-seeking, rational assholes. Reasoning is often done in the service of moral reasoning, but the thing is, none of us is all that moral. We perform moral acts for the indirect benefit of avoiding punishment. Saying something is wrong does absolutely nothing to motivate a person unless he sees a disadvantage in it.

Let me share an experience of how a lawyer managed to sell me insurance.

I was already wary of buying insurance. So many people say that insurance agents are liars.

However, I got a lawyer, and I only wanted to buy a small policy.

I told the lawyer I wanted a small insurance policy and that the rest of the money should be invested. They agreed.

I asked them, "How much of the money is for investment and how much is for insurance?" The lawyer said it was all investment.

One year later I learned that I had lost half of the money to "fees". The lawyer simply said it was all investment; a unit-linked policy is a long-term investment.

So basically I ended up buying a small insurance policy at 100 times the normal price. The money I thought was being invested was gone to that irrational fee. I filed a lawsuit over it.

He is, in a very stretched sense, technically correct. The insurance is small. The benefit is small. Even the insurance company claimed the same thing: a unit-linked policy is a long-term investment. It's an investment where 100% of the money is spent on fees, but technically it's still an investment.

Here, the "reasoning" that "it's all investment" is absolutely useless. It doesn't mean what I thought it meant, and it made me throw money away.

And that leads to the typical problems we have with "reasoning".

Let me give another example.

Bob sees money on a table. I say, "Don't steal; it's wrong to steal."

What really happens is that humans are rational assholes: Bob will steal if and only if he won't get caught. However, humans like to pretend that they are "moral" people, that they wouldn't steal not because they might get caught, but because they are moral persons.

Here, any rational argument about whether stealing is wrong or not will be totally irrelevant. You can say stealing is wrong; Bob will say that it's the government's responsibility to prevent corruption. Most of us would break the law and find some justification for doing it. So what's the point of reason?