In 2011, Petroc Sumner of Cardiff University and his colleagues published a brain imaging study with a provocative result: Healthy men who have low levels of a certain chemical in a specific area of their brains tend to get high scores on tests of impulsivity.

When the paper came out, thousands of people across England were rioting because a policeman had shot a young black man. “We never saw the connection, but of course the press immediately saw the connection,” Sumner recalls. “Brain chemical lack ‘spurs rioting’,” blared one headline. “Rioters have ‘lower levels’ of brain chemical that keeps impulsive behaviour under control,” said another.

“At the time, like most scientists, we kind of instinctively blamed the journalists for this,” Sumner says. His team called out these (shameful, really) exaggerations in The Guardian, and started engaging in debates about science and the media. “We quickly began to realize that everyone was arguing on the basis of anecdote and personal experience, but not evidence. So we decided to back off, stop arguing, and start collecting data.”

And the data, published today in BMJ, surprised Sumner. His team found that more than one-third of academic press releases contain exaggerated claims. What’s more, when a study is accompanied by an exaggerated press release, it’s more likely to be hyped in the press.

Because press releases are almost always approved by a study’s leaders before being distributed, Sumner’s findings suggest that scientists and their institutions play a bigger role in media hype than they might like to acknowledge.

“We’re all under pressure as scientists to have our work exposed,” Sumner says. “Certainly I think a lot of us would be quite happy not to take responsibility for that — just to say, ‘Well, we can’t do anything about it, if they’re going to misinterpret that’s up to them but it’s not our fault’. And I guess we’d like to say, it is really important and we have to do something more about it.”

Sumner and his colleagues looked at 462 health- or medicine-related press releases issued by 20 British universities in 2011. For each press release, the researchers also analyzed the scientific study it was based on, as well as the news articles that described the same findings.

The researchers limited the analysis to health and medicine partly because (as I’ve written about before) these stories tend to influence people’s behavior more than, say, stories about dinosaurs or space. They focused on three specific ways that press releases can distort or exaggerate: by implying that a study in animals is applicable to people; by making causal claims from observational data; and by advising readers to change their behaviors (“these results suggest that aspirin is safe and effective for children,” say, or, “it’s dangerous to drink caffeine during pregnancy”).

More than one-third of the press releases did each of these things, and the misinformation showed up in the media, too. For example, among press releases that gave exaggerated health advice, 58 percent of subsequent news articles also contained exaggerated health advice. In contrast, among press releases that didn’t make exaggerated recommendations, only 17 percent of news articles did so. The researchers found similar trends for causal claims and for inferring that animal work applies to people.

“We certainly don’t want to be blaming press officers for this,” Sumner says. “They’re part of the system. The academics probably don’t engage as much as they should.”

I called Matt Shipman, a science writer and public information officer at North Carolina State University, to ask what he thought of the findings. Shipman has been a press officer for seven years, and before that he was a journalist. “The numbers are very powerful,” he said, and they underscore the importance of press releases at a time when reporters often don’t have the time or resources for thorough reporting. (Shipman has just signed on with Health News Review to rigorously evaluate the quality of health-related press releases.)

Shipman also brought up an important caveat. Because this study is observational, it doesn’t prove that press releases are themselves the cause of hype. “If a researcher is prone to exaggeration, which leads to exaggerated claims in a news release, the researcher is likely to also be prone to exaggeration when conducting interviews with reporters,” Shipman says. “The news release may be a symptom of the problem, rather than the problem itself.”

When he writes press releases, Shipman says, he almost always begins by meeting with the researcher in person and asking him or her to explain not only the findings, but what work led to them, why they’re interesting, and what other experiments they might lead to. Then Shipman writes a draft of the release and sends it to the researcher for approval. He asks the scientist to check not only for factual inaccuracies, but also for problems of emphasis, context, or tone. Press officers at other institutions, however, write releases using far less rigorous methods, as I have learned by swapping stories with them over the years. And some press officers are judged by the quantity of stories that come out in big outlets, which naturally creates an incentive to make research seem newsworthy, even when it might not be.

“What I think is probably the case is that all of the variables at play here — the researchers, the press officers, and the journalists — are all humans,” Shipman says. “And all of them are capable of making mistakes, intentionally or unintentionally.”

So. Is there any concrete way to reduce those mistakes?

In an editorial accompanying the BMJ study, author and doctor Ben Goldacre makes two suggestions. First, the authors of press releases and the researchers who approved them should put their names on the releases, he writes. “This would create professional reputational consequences for misrepresenting scientific findings in a press release, which would parallel the risks around misrepresenting science in an academic paper.” That seems reasonable to me.

Second, to boost transparency, press releases shouldn’t only be sent to a closed group of journalists, Goldacre writes. “Instead, press releases should be treated as a part of the scientific publication, linked to the paper, referenced directly from the academic paper being promoted, and presented through existing infrastructure as online data appendices, in full view of peers.”

That sounds good, but “would require a significant shift in the culture,” according to Shipman. Press officers would have to be brought into the process much earlier than they are now, he says. And scientists would have to be far more invested in press releases than many of them are now.

I think we journalists need to own our portion of the blame in this mess, too. Let’s go back to Sumner’s 2011 brain-imaging study, for example. His university’s press release didn’t have any wild exaggerations, and it certainly didn’t make a connection between the research and the riots. That came from the journalists (and/or their editors).

“But that actually doesn’t happen very often, it turns out,” Sumner says. “Most of the time, the media stories stay pretty close to what’s in the press release.”

Which isn’t exactly great news, either.

17 thoughts on “The Power of a Press Release”

“His team found that more than one-third of academic press releases contain exaggerated claims. What’s more, when a study is accompanied by an exaggerated press release, it’s more likely to be hyped in the press.”

I see this all the time, subscribing as I do to services that themselves use press releases for feed. And press judgment of what the public will find interesting further distorts things. One egregious recent example is describing the Rosetta comet mission as if it were primarily about the origins of life.

I think this problem will be difficult to fix. Science is social, and so scientific institutions and projects are social entities. The direction of such entities is inevitably political. Therefore, the people who rise to the top of them are not those with the most knowledge, insight, or intelligence, but those with the best political skills. A part of political skill is recognizing that political success requires at least the appearance of success in whatever one is ostensibly doing. Therefore, leaders in science as in every other field are strongly motivated to produce, if not success, at least the appearance of success. Hyped press releases are part of the appearance of success, therefore, press releases will be hyped. Occasionally, they may approximate reality, if reality is agreeable, but it does not appear to be a strict requirement.

Ginny, thank you for the coda. I’ve already seen one report saying that journalists are therefore not “to blame” for exaggeration in news. Here’s a response to that, which I tweeted earlier:

If exaggerations or inaccuracies end up in science/health reporting, then the journalist should always take 100% of the blame, even if the errors originated with scientists or press releases (they can get blamed too; it’s not a zero-sum game). Errors can arise anywhere; they are meant to end with us. We are meant to be bullshit filters. That is our job.

It can be a hard job, with many systemic factors—editorial demands, time pressures, lack of expertise—that stop us from doing it properly. These are reasons for empathy, but they change nothing. If we publish misleading information, and try to apportion blame to our sources, we implicitly admit that we are mere stenographers—and thus, useless. If we claim to matter, we must take the blame.

Virginia Hughes, an acclaimed journalist and medical writer for such well-respected publications as “National Geographic” and “Nature,” cited evidence that medical press releases are not (always) to be believed. Hughes, who admits to being “quirky,” sought comment from various university press officers who attempted to distance themselves from press releases that they acknowledge are “misleading.”

etc.

All this does, Virginia, is to prove how right is my personal motto: “Never believe anything that is not independently verified.”

In round numbers: 58% of the news stories from the one-third of press releases with exaggerated claims contained exaggerated claims, which works out to about 20% of all stories. And 17% of the two-thirds of press releases without exaggerated claims were reported with exaggerated claims, which comes to about 11%.

Is the importance of 20% vs 11% being exaggerated here?
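The arithmetic in that comment can be sketched in a few lines. A minimal check, assuming (as the comment does) a one-third/two-thirds split of press releases and one news story per release; the 58% and 17% figures are the ones reported in the article, the split is the commenter's assumption, not a number from the study:

```python
# Shares of press releases, per the commenter's assumption
exaggerated_releases = 1 / 3   # releases containing exaggerated claims
plain_releases = 2 / 3         # releases without exaggerated claims

# Conditional rates of exaggeration in news stories, from the study
hyped_given_exaggerated = 0.58  # stories hyped, given a hyped release
hyped_given_plain = 0.17        # stories hyped, given a sober release

# Expected share of ALL news stories carrying exaggerated claims,
# split by which kind of release they came from
from_exaggerated = exaggerated_releases * hyped_given_exaggerated
from_plain = plain_releases * hyped_given_plain

print(round(from_exaggerated, 3))  # 0.193 -> roughly 20%
print(round(from_plain, 3))        # 0.113 -> roughly 11%
```

So under those assumptions the two paths contribute about 19% and 11% of all stories, which is the comparison the commenter is making.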

Actually, given that the researchers, press officers and journalists are all under pressure to get the story out there, it looks as if the system is working better than might be expected. That’s not to say that it could not be better, but when the Daily Express, for instance, routinely headlines health stories on its front page, there is always going to be a substantial spin on health news, no matter what comes from the researchers.

I genuinely think that scientists should shoulder the responsibility when it comes to inaccuracies in science reporting. We talk a lot about drives toward Open Access and making research more accessible. That’s not much use if it’s full of hype or misrepresentation (deliberate or otherwise). It strikes me that scientists might seem to have an easy way out here in passing the buck to POs or journalists. I see where Ed is coming from in saying that journalists should be acting as BS filters, but that doesn’t absolve scientists from any responsibility to not produce BS in the first place. I guess the take-home, really (obviously), is that everyone, at all points in the process, needs to do better.

Peter – there are things they can do, sure. Make sure the press releases are as unambiguous as possible (Chris Chambers came up with a great idea for this a while back – a ‘gist’ section in PRs that tells you in a couple of bullet points what the study does, and does not, show). And if there’s still exaggeration in the final news stories, complain more.

Pete – this is going to sound like a cop out, but I think that we have to take seriously the idea that scientists should spend their time doing science and not running around fact-checking the tabloid press.

Complaining only makes a difference if somebody cares, and there is precious little evidence that the tabloids do.

Excellent piece. One additional dimension that is not brought up here is the role that scientists (either the original researchers themselves, if their work has been misrepresented by others, or other experts not involved in the study) can play in calling out hyped claims. In particular, via social media (blogs, Twitter, Facebook, etc., can all be effective). Being called out for hype should, one hopes, dampen the willingness of all parties involved to contribute to the hype. Also, I think this finding and Ed’s comments make a strong case for having *science writers* covering science, not just general-purpose reporters who may not have the expertise necessary to recognize and correct hyped claims from press officers or researchers. Finally, and Virginia and Ed are both very good about this, it can help to contact other experts besides the authors who may have a more skeptical or realistic take on the implications of the findings.

If an academic press release claims that this or that observation, measurement, discovery, etc., is the “first” of its kind, it’s highly likely to be wrong. In planetary science, for an extreme example, the frequency of claims of the discovery of water on Mars is the subject of many in-jokes.

+1 to requiring that scientists and press officers put their name on their press releases. Pretty much great all-around. 1) Identifies bad actors, 2) encourages scientists to be more cautious in their claims, 3) including credit acknowledges that this sort of thing is outreach, allowing scientists to tick a CV box, and 4) will probably result in a smaller number of (signed) press releases, since the back-and-forth required for an academic to put their name on something adds a lot of overhead.

About Virginia

I'm a freelance journalist who writes about neuroscience, genetics, behavior, and medicine for the likes of Nature, Popular Science, and Slate. Before coming to Phenomena, I contributed to the delightfully quirky science blog The Last Word on Nothing. I live in Brooklyn, New York, land of artisanal basketball stadiums and rich dog walkers, with my husband.
