I recently discovered this interesting paper by Yale Law Professor Dan Kahan: What is the “science of science communication”? He introduces the concept of the science communication paradox: “Never have human societies known so much about mitigating the dangers they faced but agreed so little about what they collectively know.” This figure demonstrates the science communication paradox, showing strong disagreements about different risks for people with different political beliefs:

According to Kahan, resolving the science communication paradox is the main goal of the new science of science communication. He lays out two potential explanations for the science communication paradox:

Public Irrationality Thesis (PIT): Advocates of this position believe that the public, on the whole, is not scientifically literate and does not think about risk the way scientists do.

If PIT is correct, then as people acquire more scientific literacy, their views on different risks should align more with scientists’ views. This is not what actually happens:

With increasing science comprehension scores, people actually become more polarized along party lines in their beliefs about the risk of climate change (shown in more detail in an earlier publication by the same group). Scientific literacy does not necessarily produce views aligned with scientists’.

Cultural Cognition Thesis (CCT): The alternative explanation for the science communication paradox suggests that our group identities fundamentally shape how we think about risk. Kahan gives the analogy of fans of two opposing sports teams: the opposing fans are likely to genuinely see a replay of a questionable call differently, each in favor of their own team. Along these lines, when people feel their group’s stance is being threatened, they’re more likely to see evidence as confirming their existing belief, or to discount the source if it contradicts what they want to believe. This account is much more consistent with the real-world data (for example, the previous graph) and with experimental work he and collaborators have done.

It’s important to note that most science issues do not demonstrate the science communication paradox: people with more science knowledge don’t necessarily become more polarized in their beliefs. Here are a few examples of issues for which science comprehension predicts people’s risk assessments better than their political leanings do:

How can we use awareness of the cultural cognition thesis to improve science communication?

Kahan suggests the disentanglement principle: If polarization results from a clash between people’s identities as members of a cultural group and the scientific facts they’re encountering, we should work to separate the two. For example, a true/false question might state: “Human beings, as we know them today, developed from earlier species of animals.” Someone who belongs to a religious group that doesn’t support evolution has to choose between answering this question in a way that’s consistent with scientific consensus OR with their group identity. But rewording the statement to something like “According to the theory of evolution, human beings, as we know them today, developed from earlier species of animals” takes away the conflict. The respondent can now demonstrate scientific knowledge by agreeing with the statement without jeopardizing their identity as part of a group that doesn’t believe in evolution.

While the original statement leads to polarized views (as science knowledge increases, religious and non-religious people’s responses diverge more), the second framing produces converging responses (scientific knowledge, rather than religious belief, becomes the best predictor of correct responses).

In a great blog post for Southern Fried Science, Andrew Thaler shares that he talks about a lot of things when he talks about climate change, but science isn’t one of them. He talks about fishing, flooding, farming, faith, and the future. These are things that his audiences know deeply, and climate change is relevant to anyone with an interest in any of those five f’s. He provides a great example of disentangling people’s identities from the scientific issue, instead actually using their identities to show the issue’s relevance.

The disentanglement principle offers one way that the new science of science communication can begin to resolve the science communication paradox, but it’s just one drop in a huge pond of paradox. We have to keep working on ways to communicate information that conflicts with people’s cultural identities, and as this work shows, jamming information down people’s throats isn’t the way to close belief gaps.