Skeptics discount science by casting doubt on scientists' expertise

Most surveys of the US public indicate that scientists are popular, trusted figures. The same, however, cannot be said about some of their conclusions, as topics like climate change and evolution remain controversial with many segments of the population. A recent Pew survey gives an indication of why: even though the scientific community's opinion is largely unified on these topics, the public thinks that there is significant dispute among the researchers. A study published in the Journal of Risk Research attempts to explain why this might be the case.

The people behind the new study start by asking a pretty obvious question: "Why do members of the public disagree—sharply and persistently—about facts on which expert scientists largely agree?" (Elsewhere, they refer to the "intense political contestation over empirical issues on which technical experts largely agree.") In this regard, the numbers from the Pew survey are pretty informative. Ninety-seven percent of the members of the American Association for the Advancement of Science accept the evidence for evolution, but at least 40 percent of the public thinks that major differences remain in scientific opinion on this topic. Clearly, the scientific community isn't succeeding in making the public aware of its opinion.

According to the new study, this isn't necessarily the fault of the scientists, though. The authors favor a model called the cultural cognition of risk, which "refers to the tendency of individuals to form risk perceptions that are congenial to their values." This wouldn't apply directly to evolution, but would to climate change: if your cultural values make you less likely to accept the policy implications of our current scientific understanding, then you'll be less likely to accept the science.

But, as the authors note, opponents of a scientific consensus often claim to oppose it on scientific, rather than cultural, grounds. "Public debates rarely feature open resistance to science," they note; instead, "the parties to such disputes are much more likely to advance diametrically opposed claims about what the scientific evidence really shows." To get there, those doing the arguing must ultimately be selective about what evidence and which experts they accept—they listen to, and remember, those who tell them what they want to hear. "The cultural cognition thesis predicts that individuals will more readily recall instances of experts taking the position that is consistent with their cultural predisposition than ones taking positions inconsistent with it," the paper suggests.

They're not the first to reach this conclusion—they cite extensive literature, by both themselves and others, indicating it's in operation—but they do a good job of attempting to demonstrate it in action. To do that, they focus on three issues that tend to break down on political/cultural lines, but have been addressed on an empirical level by the National Academy of Sciences. They break the cultural groups into two camps: hierarchical individualists, who could loosely be described as pro-business and anti-regulation, and egalitarian communitarians, who are more likely to favor government intervention.

The scientific issues that were examined are:

Climate change, and whether it's anthropogenic. The NAS says yes on both counts; egalitarians are expected to accept this, while individualists are likely to reject it due to its regulatory implications.

The safe handling and storage of nuclear wastes. Again, the NAS says it's possible, and the authors expect that egalitarians will reject this conclusion on environmental grounds.

The safety implications of concealed carry laws. Individualists would be expected to favor these on individual liberty grounds, while egalitarians would oppose them. In this case, they'd both be wrong; the NAS has concluded that there's insufficient evidence to tell.

They looked at the public's take on these issues via a survey of 1,500 US adults, conducted in July of last year. A series of questions broke down where each participant fell on the individualistic/hierarchical and egalitarian/communitarian scales. Then, they were given a series of statements about the scientific issues above, e.g., "Global temperatures are increasing," and asked whether expert scientists agreed with the statement.

As expected, the participants who were more accepting of government regulations overwhelmingly accepted that global temperatures are increasing (78 percent) and that humans were likely the cause (68 percent). In contrast, a small majority (56 and 55 percent) of the pro-business participants felt that the experts were divided on these topics; a significant faction actually felt that most experts disagree with the consensus position.

A similar breakdown came with concealed weapons laws. Among individualists, half thought the experts agreed that these laws were good for public safety, and 40 percent thought the experts were divided. Roughly the same number of egalitarians agreed that the experts were divided, but nearly half thought the exact opposite of the individualists: that the experts disagreed that such laws decrease violent crime.

Oddly, a statement on the safe storage and disposal of nuclear waste left the audience very divided, with the same percentage of both groups (45 percent) thinking that the experts were divided. The expected differences did show up in the agree/disagree categories, but were far less pronounced.

Explaining the discrepancy

So, how do these differences arise? To get at that, the authors created fictitious biographies for expert scientists, portraying them as members of the NAS. They then provided each with a fictitious book excerpt that placed them at one of the two extremes of opinion (again, for climate change and nuclear waste, one extreme represents the consensus; for concealed carry laws, both extremes are wrong).

Here, the differences became even more apparent. Egalitarians accepted the expertise of someone suggesting climate change was high risk at a rate of 88 percent; the low-risk author was rated an expert a bit less than half the time. For individualists, in contrast, the low-risk author was considered an expert 86 percent of the time; the high-risk author was rated an expert by less than a quarter of participants. A nearly identical pattern prevailed for concealed-gun laws, while nuclear waste expertise was rated as mixed.

So, it's not just a matter of the public failing to understand the expert opinions of bodies like the National Academy of Sciences; people simply discount the expertise associated with any opinion they'd rather not hear.

The authors suggest that this information can be useful for scientific organizations that are interested in reaching a broader audience; by phrasing matters in a way that would appeal to both individualists and egalitarians, and providing experts that can speak in the appropriate terms, they suggest, different cultures can be reached with the same scientific information. In short, they favor a framing approach.

We're not entirely certain this would work. On some of these issues, there are organizations (pro-coal and pro-gun groups, for example) that can provide people who are just as adept at speaking in these terms, and who are therefore likely to be accepted as experts. In short, this seems like a recipe for convincing the public that the scientific community is divided, even when it's not. Figuring out how to overcome the public's tendency to accept only what it wants to hear (assuming that's possible) seems to be the only way to avoid this situation.