A few months ago, Dan Kahan, Donald Braman, and Hank Jenkins-Smith published an important article titled "Cultural Cognition of Scientific Consensus" (available here). The article looked at several risks, two of which are quite relevant to today's news and environmental law: the risks associated with climate change and those associated with storing nuclear waste.

While I do not want to rehash all of the article’s findings, consider one of the article’s major conclusions:

When mechanisms of cultural cognition figure in her reasoning, a person processes information in a manner that is equivalent to one who is assigning new information its probative weight based on its consistency with her prior estimation. Because of identity protective cognition (Sherman and Cohen 2006; Kahan et al. 2007) and affect (Peters, Burraston, and Mertz 2004), such a person is highly likely to start with a risk perception that is associated with her cultural values. She might resolve to evaluate the strength of contrary evidence without reference to her prior beliefs. However, because of culturally biased information search and culturally biased assimilation (Kahan et al. 2009), she is likely to attend to the information in a way that reinforces her prior beliefs and affective orientation (Jenkins-Smith 2001).

In other words, one’s worldview alters risk perception in ways we would never anticipate. Our minds seek out and give greater weight to information that harmonizes with our worldviews. Interestingly and disturbingly, even when risks are very complex and difficult for non-experts to assess, non-experts often trust their worldviews even if it means disagreeing with the experts.

While the article presents a very convincing case, a few items in the news this past week seemed to reinforce the article’s findings for me. (Granted, the article has left me uneasily wondering if I am just reinforcing my own priors.)

First, a post on a political science blog called Monkey Cage (discussed in an excellent post by Dan Farber earlier this week) suggests that as education increases, liberals are more likely to see climate change as caused by human activity whereas, surprisingly, conservatives are less likely to do so.

Both the Monkey Cage and Farber’s post provide the following helpful graph to illustrate this finding:

Second, CBS News recently released a poll showing, among other things, that support for nuclear power is in decline. This is not too surprising given the recent failure at the Fukushima Daiichi Nuclear Power Station. Interestingly, however, CBS’s data show that Democrats are more worried than Independents, and much more worried than Republicans, about the risks associated with nuclear power.

It is a bit humbling to recognize that, in addressing major problems like climate change, even our perceptions of risk can be divisive.