The Neuroscience Of (Climate) Catastrophism

Can we explain the widespread fondness for climate catastrophism as the unfortunate by-product of the mechanisms of our brains and societies? Apparently, we can: as the consequence of judgment extremism, simplistic dichotomy, and confirmation bias.

[Of course I am and always will be totally against the many attempts at pop-psychology aimed at finding out why people don’t just sheepishly follow everything they’re told about climate change catastrophes. But what I am going to mention below is actually a solidly scientific point of view]

In the IHT’s printed edition of Jul 31, 2010, “Greece and the Power of Negative Thinking”, Thierry Malleret and Olivier Oullier express some thoughts about the recent euro economic crisis in Greece that could be applied almost as-is to AGW catastrophism as well:

there are lessons to be learned from neuroscience on what distorts our cognitive abilities […]

For a start, “judgment extremism” pays dividends. The neural system used when anticipating rewards is active long before the one in charge of evaluating risks and losses. Most academics and opinion-makers know that the rate of return on postulating extreme outcomes is far greater than that of simply establishing facts: A columnist is much better off predicting a dire outcome than being caught up with the facts that lead to a complex and uncertain one.

Therefore, an outlandish prediction (albeit, perhaps, inadequately grounded) […] is likely to be rewarded by editorial success and intellectual kudos; and by the time it may be proven wrong, it might well go unnoticed.

A narrow framing of choices is another problem. […] Although our brains do not function biologically in a dichotomous fashion, binary thinking is the brain’s favored method, as it is easier to categorize events in terms of success/failure, cooperation/competition, rational/irrational, etc.

This is why, regardless of the context, a careful “it depends” explanation will never be as convincing as a clear-cut opinion. Accordingly, we tend not to paint a nuanced picture of a complex reality, but to stick to the version constituting extreme outcomes. This leaves very little room for analyzing the nuts and bolts of the [response] process, with its inevitable successes and failures.

Then there is another bias: We tend to form opinions by falling back on intuitions or hunches and then look for confirmatory evidence to reinforce these. When someone confirms our views, they are reinforced in our brain’s reward system. Hence, we seek further evidence to validate those views, shutting off contradictory information.

This is not exactly exciting news. If true, the above means we will always have to deal with some form of climate catastrophism. On the other hand, it will also be another line of defense against the onslaught of “it’s worse than we thought” news pieces.