Topic: Our brains fail (Read 577 times)

Since I've come to this site I've observed many ways our brains fail and I've become interested in understanding them. A recent study found that when we do math in the context of deeply held beliefs, we fail at a much higher rate if the answer contradicts those beliefs.

Everybody knows that our political views can sometimes get in the way of thinking clearly. But perhaps we don't realize how bad the problem actually is. According to a new psychology paper, our political passions can even undermine our very basic reasoning skills. More specifically, the study finds that people who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.

I assume it would be exactly the same if religion were the field of belief instead of politics.

My conclusion, as always, is there is no hope for us. This is how we evolved. The only solution is to evolve different brains.

I wonder how much it is "political stance will prevent you giving a considered correct answer" as opposed to "political stance will lead you to jump to a conclusion without doing the calculations"?

Like the study said, "This is no easy problem for most people to solve: Across all conditions of the study, 59 percent of respondents got the answer wrong". I didn't have the interest to do the calculations - if I had a horse in the race I might "see" what I wanted without actually trying to compare at all. I need to read the study in a bit more detail to see whether that is addressed - I'm a bit pushed for time right now!
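For what it's worth, the arithmetic itself is only a ratio comparison. Here's a minimal sketch in the style of the study's 2x2 skin-cream problem - the numbers below are invented for illustration, not taken from the study:

```python
# Hypothetical 2x2 outcome table (invented numbers): did the skin
# cream help, or did patients do better without it?
improved_with_cream, worsened_with_cream = 223, 75
improved_without, worsened_without = 107, 21

# The intuitive shortcut is to compare the big raw counts (223 vs 107),
# which favors the cream. The correct move is to compare *rates*.
rate_with = improved_with_cream / (improved_with_cream + worsened_with_cream)
rate_without = improved_without / (improved_without + worsened_without)

print(f"improved with cream:    {rate_with:.0%}")
print(f"improved without cream: {rate_without:.0%}")
print("cream helped" if rate_with > rate_without else "cream did not help")
```

With these made-up figures the raw counts point one way and the rates point the other, which is exactly the trap: the "horse in the race" decides which comparison you bother to make.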

I'm lucky. I can't do math at all. So when I'm wrong, it's for the right reason. My evolution is complete!

The list of things that we depend on our brains for that we shouldn't is quite long. But we can't yet abandon brains, so we need to find another way to stop fooling ourselves. Which is probably not possible. But hey, at least we have our memories...

Hm, looking at the problems they used, I can actually see why people might fall for this. I've had to deal with a lot of people who seem reasonably intelligent overall but who let their biases get in the way of their reasoning.

For that matter, I could see myself falling prey to such assumptions, at least some of the time. I don't think I would have gotten the math wrong - except where I would have legitimately gotten it wrong[1] - but I would be more likely to double-check a result that I disagreed with than one I agreed with. That's almost exactly what they say in the study.

So, I guess the conclusion is that we should strive to check answers we agree with as fervently as we check answers we disagree with.

EDIT: In relation to the memory thing, I got into a somewhat heated argument with my roommate the day before yesterday regarding something that we remembered differently. We both agreed on the actual events, but disagreed on when they happened. As it happened, I was the one not remembering correctly (though I had to check with someone else to make sure). Which is not especially surprising, since it isn't the first time that I've gotten something like that mixed up in my head. It's been a stressful five months, too, which doesn't help.

Organized religion is simply tribalism with a side order of philosophical wankery, and occasionally a baseball bat to smash the kneecaps of anyone who doesn't show proper deference to the tribe's chosen totem.

For that matter, I could see myself falling prey to such assumptions, at least some of the time. I don't think I would have gotten the math wrong - except where I would have legitimately gotten it wrong[1] - but I would be more likely to double-check a result that I disagreed with than one I agreed with. That's almost exactly what they say in the study.

So, I guess the conclusion is that we should strive to check answers we agree with as fervently as we check answers we disagree with.

Emphasis mine. That is probably the best idea to take from this. I would even go one further and say that we should be more diligent about checking claims or conclusions that we agree with. The lies that feel like lies are the ones we're most likely to question, expose, or even get angry about, and thus, are also the ones that pose the least real threat to our grasp of reality. The truly dangerous lies are the ones that fit our existing beliefs so well that we don't suspect. Those are the lies that will fool us into believing false things. Sometimes even monstrous things. And this can be as true for honest errors as it is for deliberate lies.


Funny--I just put up a link to a study on why people reject science, and you sound like you're talking about that article!

http://whywontgodhealamputees.com/forums/index.php/topic,25422.0.html



You know, I don't think that the article suggests that our minds are a failure. Our minds are complex. When we process information, we don't see that information in a vacuum. Everything else that we "know" influences the input.

Many centuries ago, people looked at the cycles of the moon and saw a disk that got smaller or bigger every day. When I look at the moon, I see a huge celestial body that is usually partly unlit by the sun. I SEE the shadowed portion because I know it is there. Previous generations SAW the bite marks from the serpent who was chewing away at the disk.

I spend a lot of time looking at data. At work, I look at similar sets of data year after year, and I pretty much know what I expect to see. If there is an anomaly, my first assumption is that there is a data entry error. So I make my staff crazy going over the data again, and running reports to find the error. There must be some SET of data somewhere that just didn't get input! There must be something wrong with a formula somewhere. If, after having made my staff's life miserable, the anomaly is still there, I need to accept it, and try and figure out what it means. What was different this year? What factors did I fail to take into account?

When the data "looks right" I don't put it under the same sort of scrutiny.

Of course our brains process information in the context of everything else that we know and understand.

Of course our experiences and beliefs and knowledge base shape how we interpret the information our brains receive. If they didn't, we would wake up every morning testing gravity, like toddlers who throw cereal bowls off their high chairs, just to see what is going to happen.

Political beliefs are deeply engrained. As are religious beliefs. And repeated sensory experiences, for that matter.

This is not a flaw. We are programmed to build on what we "know."

It is, however, the best argument for peer review. And it should serve as a reminder to us that we need to reassess our deeply held beliefs when the evidence contradicts them.

I think the moral of this story is, when working with data, always give it your full scrutiny even if it seems to agree with what you're saying. Doesn't mean you have to be critical of everything, but when you think you're right about something, it's much easier to miss a mistake.


A lot of people suffer from a form of solipsism. They believe that their view of reality is the only viable one and that everyone wants to be like them. They argue that if only those other idiots really understood, then they would agree with them and the world would run like clockwork.

Occasionally, one of them gets into power, and that is why we have North Korea and the Vatican.

The problem goes beyond confirmation bias and cognitive dissonance: our brains are wired in a way that makes us feel comfortable in ourselves. Anything that requires us to move out of the comfort zone also requires effort and discomfort - and nobody likes that, do they? It's much easier to persuade others... surely if I explain, they will agree? (Let's disregard the fact that they might not like the effort and discomfort.) I can explain, and that's a lot easier than my changing, isn't it?

So if all you cretins will just listen up and agree with me, we can cure the world's problems by Friday, OK?


Nobody says “There are many things that we thought were natural processes, but now know that a god did them.”