Last month, Brendan Nyhan, a professor of political science at Dartmouth, published the results of a study that he and a team of pediatricians and political scientists had been working on for three years. They had followed a group of almost two thousand parents, [...snip...] The goal was to test whether facts, science, emotions, or stories could make people change their minds.

The result was dramatic: a whole lot of nothing. None of the interventions worked.

...but there is hope (of sorts) if you read on, plus an understanding that basic prejudices empower faulty memories that stoke said prejudices. Pretty obvious, you may say (and I agree), but for those not convinced, here's scientific data to further fail to convince you.


"...but on a lighter note, demons were driven from a pig today in Gloucester." Bill Bailey

You can't make people change their minds. They have to decide to do it.

None of those leaflets were geared to do that. They just presented various facts, which were then interpreted by the recipients based on what they already believed they knew. Honestly, it was almost bound to fail.

I'm surprised they didn't try to share positive stories about children who had vaccinations and lived a healthy life thereafter. It wouldn't have gotten to everyone, but it can't have had a worse effect than what they actually did.

EDIT - and sure enough, later on in that article, they talk about affirmation as a way to effectively combat false beliefs (even if they're totally unrelated to the affirmation, although it presumably works better if they are related). For example, they suggest that advocates of pasteurization talk about how children have benefited from drinking pasteurized milk, rather than emphasizing the misconceptions about raw milk. That's almost exactly what I suggested after reading the first five or six paragraphs from the report.

« Last Edit: May 21, 2014, 11:31:24 AM by jaimehlers »


Nullus In Verba, aka "Take nobody's word for it!" If you can't show it, then you don't know it.


yes jaim, thanks, I really should've included what was the essential revelation. That doesn't mean I don't resent the idea that facts are not enough to get people to change their minds; it's just more of our subtle monkey-cousin relationship processes at work. It seems the gentle manipulation of offering potential positive outcomes is made more readily accessible/attractive by fulfilling the "what's in it for me?" criterion.

« Last Edit: May 21, 2014, 11:39:59 PM by kin hell »



I resent it a bit too. People should be able to recognize the benefits of something without having to be sweet-talked into it. More to the point, they should be able to recognize the detriments of something, especially after someone goes to the trouble to point them out. However, resenting it won't change anything.



You can't make people change their minds. They have to decide to do it.

Okay, but how do we make people decide to change their minds?

Sodium pentothal?


"When we landed on the moon, that was the point where god should have come up and said 'hello'. Because if you invent some creatures, put them on the blue one and they make it to the grey one, you f**king turn up and say 'well done'."

False beliefs, it turns out, have little to do with one’s stated political affiliations and far more to do with self-identity: What kind of person am I, and what kind of person do I want to be? All ideologies are similarly affected.

This touches on something I've observed with religious believers. It seems that they've become convinced that their chosen set of beliefs is what good people believe, and since they want to be good people, they want desperately to believe it. The motivation for their belief thus becomes entirely disconnected from whether the belief is true, but is driven instead by their perception that it is good.

Yes, I think that's true. There may be a sort of transference of trust involved. They started believing because they trusted the person(s) who taught them, and perhaps also feel a sense of being trusted by the same person(s) in return. A betrayal of the beliefs is then seen as if it is a betrayal of the source of those beliefs.

The reasons that people believe in religion (and absurd conspiracy theories, and insane political ideologies) are many, and those reasons can be woven together in complicated ways. This is why getting free from such beliefs can be a long, difficult, and painful process, if one gets free from them at all. And it's why such beliefs can seem oddly impervious to reason in an otherwise reasonable person.

A few years ago, I was working on a project under my carport when I was approached by three people who were walking around the neighborhood inviting people to their church. They were friendly and seemed well meaning, even if misguided, so I engaged them in conversation and politely told them that I do not believe in any of that, and that I have already been down that path and have no interest in revisiting it. They tried several different arguments to convince me to return to the fold. But every argument they offered centered on some supposed benefit of believing, in and of itself. The promise of Heaven. The threat of hell. The comfort of believing in life after death. Finding a sense of meaning. Being part of a community of believers. Etc. They offered not a single argument, not even a faulty one like the creationist bunkum that gets posted here, in support of their beliefs actually being true. Even after pointing this out, they still failed to come up with any argument (again, not even a bad one) favoring the truth of their beliefs, but only more arguments for the benefits of believing. It was as if truth was completely absent as a factor in why they believe what they believe. As if the thought hadn't even occurred to them.

This is why arguments based on hard evidence and sound reasoning so often fail to convince. Truth is not among their reasons for believing. It just doesn't factor in to it.

I sometimes think that the only way for skeptics to be effective at changing minds on a large scale may be to resort to using bad reasoning (emotion-loaded anecdotal stories, associating ideas with positive feelings, positive self-image, etc.) to lead people to correct, or at least rationally defensible, conclusions. But then I think that this is probably a bad idea, because if you change only the conclusions, not the mode of thinking, then people remain vulnerable to having those good conclusions displaced by bad ones by continuing with the same faulty thinking. It would be a never ending battle.

Maybe the real winning trick is to use bad reasoning to persuade people not of specific conclusions, such as evolution or that vaccines don't cause autism, but of the idea that truth should be the first and foremost factor in determining one's beliefs, and that evidence and sound reasoning are the best tools for evaluating truth. It may be that even if one reaches that point for wrong reasons, once one gets there, he/she can then use proper reasoning to stay there, thus breaking the chain of credulity.