Monday, February 11, 2013

Chemophobia and radiophobia's shared cultural roots

Chemist Michelle Francl has a quite interesting critique of the all-too-common "bobo" conceit of chemophobia - in its distilled form, a variant of the "appeal to nature" fallacy (sometimes mistakenly referred to as the "naturalistic fallacy"). Faced with unfamiliar (and sometimes unpronounceable) chemical names for even mundane medicines like naproxen (more commonly known under its trade name "Aleve"), people find them menacing - despite the fact that naproxen works by much the same pharmacological mechanism as salicylic acid, an extract of willow bark (and a metabolite of the more common compound known by the humble name of "aspirin"). As a result, a booming cottage industry exists for peddlers of "natural" remedies - with the implication that "natural" means "safer." (In the end, chemistry doesn't care whether a compound comes from a lab or from nature - the laws of physics remain the same regardless.)

An important consequence of chemophobia is that it trades established science for an unknown: "traditional" alternative remedies which may or may not be effective, and which fall outside the rigorous quality, safety, and - perhaps most important - dose controls applied to pharmaceuticals. But the one thing it doesn't escape is chemistry itself - for these "natural" remedies to be effective, they must rely upon the same chemical principles as modern medicines. Hence, the chemophobia fallacy.

Francl's discussion of chemophobia (the whole of which is very much worth reading) touches upon an important parallel common to discussions of nuclear technology - radiophobia:

We are a chemophobic culture. Chemical has become a synonym for something artificial, adulterated, hazardous, or toxic. Chemicals are bad—for you, for your children, for the environment. But whatever chemophobics would like to think, there is no avoiding chemicals, no way to create chemical-free zones. Absolutely everything is made of atoms and molecules; it’s all chemistry.

The problems of chemophobia and radiophobia share common cultural cognition roots - particularly a mistrust arising from the origin of perceived risks. Risks that come from large, faceless corporations, or from synthetic origins that are unfamiliar or ill-understood, conspire to heighten perceived risk, particularly for those who identify with more egalitarian / communitarian values. (By contrast, as Francl notes, more natural, friendly-sounding names - think "extract of willow" - sound less threatening by virtue of their familiarity - again, despite the fact that the chemistry is unchanged.)

The same is true for things like radon gas - despite frequent concerns about radiation exposure from nuclear power plants, individuals are far less likely to be concerned about the far more common (and far larger) exposures from naturally-occurring radon gas in their own basements, a product of the decay of uranium found naturally in the soil. More importantly, the largest source of increased radiation exposure in modern times has been medical imaging - the growing use of CT scans is a far larger contributor to the average American's radiation exposure than nuclear energy facilities are. (ANS has a very useful tool for estimating your average exposure based on these kinds of lifestyle factors.)
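To put rough numbers on that comparison, here is a back-of-the-envelope tally of average annual U.S. per-capita dose by source. The figures are approximate values drawn from NCRP Report 160 (2006 data) and should be treated as ballpark estimates, not precise measurements:

```python
# Approximate average annual U.S. per-capita radiation dose by source,
# in millisieverts (mSv). Values are ballpark figures based on NCRP
# Report 160 (2006 data).
doses_msv = {
    "radon and thoron (natural)":  2.28,
    "other natural background":    0.83,  # cosmic, terrestrial, internal
    "CT scans":                    1.47,
    "other medical imaging":       1.53,  # nuclear medicine, fluoroscopy, x-ray
    "consumer products":           0.13,
    "industrial and occupational": 0.01,
}

total = sum(doses_msv.values())
for source, dose in sorted(doses_msv.items(), key=lambda kv: -kv[1]):
    print(f"{source:28s} {dose:5.2f} mSv  ({100 * dose / total:4.1f}%)")
print(f"{'total':28s} {total:5.2f} mSv")
```

Even in this crude tally, radon alone accounts for roughly a third of the average dose, and CT scans roughly a quarter - both orders of magnitude above the tiny fraction of a millisievert typically attributed to nuclear power operations.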

Francl concludes with an astute observation about a common goal - one which should not be to dictate how the risks of various alternatives (in both medicine and energy) are weighed, but to ensure that those risks are discussed and decided on the basis of sound information, rather than falling prey to cultural biases. Chemistry, like energy, is an indispensable component of the modern world, and both face the same communication challenge: fostering a science-based public discussion grounded in a rational evaluation of comparative risks and benefits. This in turn means understanding how cultural biases are formed - and ultimately developing science communication strategies to break through those biases. (In this regard, I once again point to folks like +Dan Kahan of the Yale Cultural Cognition project, who is doing yeoman's work on this topic, as well as David Ropeik, who writes extensively about risk perception issues in an approachable fashion.)

I will say as a postscript that all too often there is a particular intellectual laziness which resists these kinds of strategies - one which (at least in the case of nuclear energy) blames public (mis-)perception of risks solely on petroleum-fueled conspiracies. Aside from being devoid of evidence whenever challenged, such excuses (and they are excuses) unproductively halt any discussion of what to do about the issue - and further, fail to recognize the deeper cognitive science behind how risk perceptions are formed. (For example, cultural polarization occurs even for novel technologies like nanotechnology once individuals are exposed to even limited information about the technology.)

Obviously, there's no doubt that money is to be made preying on fear (in the realms of both alternative "energy" and alternative "medicine"), but there is more than sufficient evidence that how risk perceptions are formed has far more to do with our inherent values and cultural affiliations than with the stock price of ExxonMobil. Thus, for these kinds of science-informed discussions to occur, nuclear professionals must also stop hiding behind convenient, ephemeral excuses of conspiracies and actually begin understanding the science of how risks are perceived (and likewise, the science of risk communication).