1) Tout “potential.”

We see this word a lot: “potential.” In medical science, it’s pretty meaningless. I can safely say that my beloved mutt has potential to star in a movie some day. (After all, she looks exactly like that poopy dog in “Roma.”) Unfortunately, it’s unlikely that she will.

Claims are especially tenuous when they come from animal studies. A subhead in this release says this “new mechanism could be a ‘huge leap’ for immunology,” with no mention until the 13th paragraph that this was only tested in mice.

2) Skip the data.

This release doesn’t give any data to support the idea that compression socks will help you (or your kid) perform better on the soccer field. In fact, the study involved testing just 20 players during a single soccer match and measuring whether the 10 who wore compression socks performed better on agility, heel-rise, and endurance tests after the game.

A researcher is quoted saying the study “confirmed that there is a protective effect with compression stockings that may be crucial for performance in soccer matches.” But where are the numbers to show the scope of this purported benefit?

3) Pose a question.

As we’ve written, question-mark headlines can be used to skirt accountability for reporting an unproven claim.

This study showed that Facebook was a “good mechanism” to convey messages about cancer screening. A researcher states: “Engagement was high with Facebook ads, and those who viewed ads clicked through to the sign-up page, an indication of intent to enroll.”

In the last paragraph we learn: “Technically, the answer to whether Facebook advertising can prevent cancer remains unanswered – it’s impossible to tell how many patients who otherwise would have developed cancer were caught early due to the group’s text-based information program.”

4) “Suggest” something.

The word “suggest” is a handy way to obscure that a claim isn’t proven. As we learn in the second-to-last paragraph, the drug — metformin — appeared to have an effect on mice with heart failure symptoms, not humans.

Unmentioned is that previous research has similarly indicated metformin may help people with heart failure, but it’s an idea that has yet to be shown conclusively in a randomized clinical trial.

5) Use “may.”

We’ve often advised readers that whenever they see “may” in a claim about a study result, they can substitute “may not.” Chances are good that they’d end up being right.

I don’t want to be too hard on this news release, which can be credited with explaining how this randomized trial at Duke University was conducted and emphasizing that results need to be interpreted cautiously due to the small number of participants.

But in using the word “may” no fewer than five times, it obscures the point — made near the end of the release — that bigger studies involving multiple trial sites are needed to prove that exercise sustainably improves executive function in adults at risk for cognitive decline.

Exaggeration is common

You might be tempted to think that yesterday’s batch of incautious releases — four out of five of which were issued by a university — was an anomaly, and that academic institutions usually take utmost care in communicating medical research to the public.

But check out a 2009 paper by Drs. Steve Woloshin and Lisa Schwartz of The Dartmouth Institute, which scrutinized 200 news releases issued by academic medical centers.

They found that news releases from academic institutions often exaggerated findings:

Only 17% of releases on human research promoted studies with the strongest designs (randomized trials or meta-analyses).

Forty percent reported on the most limited human studies (those with uncontrolled interventions, samples of fewer than 30 participants, surrogate outcomes, or unpublished data), and 58% of these lacked relevant cautions.