Doomsday Messages About Global Warming Can Backfire

Monday, November 22nd, 2010

Dire or emotionally charged warnings about the consequences of global warming can backfire if presented too negatively, making people less amenable to reducing their carbon footprint, according to new research from the University of California, Berkeley.

“Our study indicates that the potentially devastating consequences of global warming threaten people’s fundamental tendency to see the world as safe, stable and fair. As a result, people may respond by discounting evidence for global warming,” said Robb Willer, UC Berkeley social psychologist and coauthor of a study to be published in the January issue of the journal Psychological Science.

“The scarier the message, the more people who are committed to viewing the world as fundamentally stable and fair are motivated to deny it,” agreed Matthew Feinberg, a doctoral student in psychology and coauthor of the study.

The first white-coat-pocket-protector-egghead, Robb Willer, has a page over here with lots of other studies and research works listed, courtesy of some curious FARK member.

I don’t mean to attack the propeller-beanie-egghead, but I’ve been noticing this trend with “research” material and I think it’s a measurable one. Let me state it by means of example: I am doubting like the dickens anybody ever woke up some morning, be he a propeller-beanie-pocket-protector type or some normal person…and said to himself…”Self? I wonder how social order might be possible when individuals are tempted to behave selfishly? Maybe I should look into that and write a research paper about whatever it is I manage to dig up.”

I think papers like these begin with the finger-waggling lecture, and then all else is filled in around it. Can’t prove it, but not exactly going out on a limb on that one. And I think research papers like these represent a fairly new trend. Not an instantaneously new one; if you’re old enough to be my parent, and you’ve been working in the university system your whole life, you might have found this situation when you started: Papers written by eggheads who knew exactly what they wanted the paper to say, before they “discovered” a single thing. Exercises in confirmation bias.

And I’m no spring chicken myself. This seems to be a sixties thing. Maybe it’s all Rachel Carson’s fault. Just a thought; agree with her work or not, it’s definitely an example of what I’m talking about and I haven’t been able to find too many examples before Silent Spring. But who knows, maybe it goes clear back to Freud.

It certainly is a problem we should engage. All kinds of higher-education professionals like to dish out that tired old bromide, “I’m not here to teach you what to think, I’m here to teach you how to think.” That is a pledge, a promise of sorts, and it seems to me the rest of us are doing a fairly lackluster job of holding them to it. Whenever an egghead makes up his mind what a paper is supposed to conclude, and then starts gathering the data, that promise is being violated.

Oh, and as an aside: Tuning out, when someone starts trying to sell you on the idea that “the planet is doomed unless you do what I say”…is what intellectually healthy people do. Yeah, even when the facts aren’t all in yet. There’s such a thing as figuring out when someone’s trying to bullshit you.

As someone who worked in academia a long time, I can assure you that the line between “[p]apers written by eggheads who knew exactly what they wanted the paper to say, before they ‘discovered’ a single thing” and “most master’s theses, dissertations, and journal articles” is so thin you can read a newspaper through it. In the humanities, at least, which is where the majority of these “studies” come from…

PS: this is anecdotal, but at the rather prestigious university where I got my first grad degree I had a lot of friends in the soft sciences, and I participated as a research subject in lots of their “studies” — they paid well, and in cash, for an hour’s work. You could tell exactly the conclusion they were trying to reach by the way they phrased the questions. Example: they’d ask about your political orientation, with choices ranging from “very liberal” to “conservative.” The next question would be something like: “Some races are simply inferior to others.” Any guesses as to the correlation they were shooting for?

PPS: I know that “the rather prestigious university where I got my first grad degree” is horribly pretentious, but I need to establish some street cred here.

It’s interesting to consider the error rate of Paul Ehrlich’s projections in his Population Bomb apocalyptic fantasy of global starvation. As noted in a recent Instapundit post/review: “Ehrlich’s predicted holocaust, which assumed that the 1960s global baby boom would continue until the world faced mass famine, didn’t happen. Instead, the global growth rate dropped from 2 percent in the mid-1960s to roughly half that today, with many countries no longer producing enough babies to avoid falling populations. Having too many people on the planet is no longer demographers’ chief worry; now, having too few is.” Why is Paul Ehrlich considered anything but a foolish crank, now that his thesis has been proven wrong? He should only be known for his work as a butterfly chaser.

IMO one of the greatest examples of confirmation bias is demonstrated by Marxists as they plug everyone into the social hierarchies of their 19th-century game.