Informed Decisions: Sense About Science and Helping People Understand Evidence

“Are heavy metals in lipsticks linked to cancer?” “Is there a diet that can treat epilepsy?” “Can goats’ blood help beat MS?”

With so many vivid, competing, scientific-sounding claims—often presented without context or background by the mass media—what is a layperson who wants to live healthfully to do? And how do we in the scientific, technical, and medical (STM) community help people separate good science from pseudoscience? How do we help media professionals in their roles as communication intermediaries? Well, Leonor Sierra, US coordinator of the charitable trust Sense About Science (SAS) (www.senseaboutscience.org) and press officer at the University of Rochester, would like people first to look for the scientific community’s “Good Housekeeping Seal of Approval”: the phrase peer review.

False, sensationalistic pseudoscientific claims can not only harm public health but also cost desperate people a great deal of money and offer false hope. SAS’s mission is to work with scientists, media professionals, and the public to equip people to make sense of science and evidence. One recent successful SAS initiative is the pamphlet “I Don’t Know What to Believe”, a short educational guide to the concept of peer review. The phrase peer review in an article or television spot lets a reader or viewer know that the science has been evaluated by experts in the field and has stood up to their scrutiny. The phrase affirms for the layperson that there is evidence to support the scientific assertions being made.

The “I Don’t Know What to Believe” guide has been a successful initiative for SAS: more than 500,000 copies have been distributed in the United Kingdom, and a US version was launched this past winter. Sierra reports, however, that there was surprising pushback from some of the scientists who were consulted about the guide. They worried that the curtain was being drawn back on their work (“Why does the public need to know?”), and some confused the initiative with the push for open access. But people were hungry for a tool to help them know what to believe: among those who requested copies of the guide were members of the news media and representatives of nongovernmental organizations, medical charities, and libraries—even politicians!

It turns out that you can change the discussion. Before the guide came out, news articles in the UK most often did not mention the journal in which a study had been published; now the reverse is true: most articles do cite the journal, and that citation gives the reader further assurance that the information in the article is credible. In fact, in 2011, the BBC conducted a review of its processes for disseminating scientific information and determined that reports should include the name of the source journal whenever appropriate.

Another important SAS initiative is the “Ask For Evidence” campaign, in which laypersons and scientists contact companies making dubious claims about their products (for example, “This conditioner restores hair at a cellular level!”) and ask for evidence for the claims. It turned out that many companies were shocked to be asked to support such claims—no one had ever asked before. In several cases, the companies wound up abandoning the claims.

We all benefit from a world in which companies do not profit from unproven claims, in which the mass media disseminate scientific information responsibly, and in which laypersons have the right tools to differentiate often-dangerous pseudoscience from valid research—in short, a world in which we make sense about science.