Coffee & stroke: what the study didn’t prove and what some stories got wrong

It was a big study, but an observational study. Not a trial. Not an experiment. And, as we have said so many times on this site that you could almost join in with the chorus: observational studies have inherent limitations that should always be mentioned in stories. They can show a strong statistical association, but they can’t prove cause and effect – and so they can’t prove benefit or risk reduction. Stories should say that.

“The problem with this type of study is that there are too many factors unaccounted for, and association does not prove causality,” said Dr. Larry B. Goldstein, director of the Duke Stroke Center at Duke University Medical Center.

“Subjects were asked about their past coffee consumption in a questionnaire and then followed over time. There is no way to know if they changed their behavior,” Goldstein said.

And, he noted, there was no control for medication use or other potential but unmeasured factors.

“The study is restricted to a Scandinavian population, and it is not clear, even if there is a relationship, that it would be present in more diverse populations. I think that it can be concluded, at least in this population, that there was not an increased risk of stroke among coffee drinkers,” he said.

When you don’t explain the limitations of observational studies, or when you imply that cause and effect has been established, you lose credibility with some readers. And you should. Note some of the comments left on the USA Today website:

• “Within a few weeks a new ‘study’ will come out telling us how bad coffee is for us.”

• “Sign…I wish someone would make up their minds! Wasn’t it just a week or so ago there was a study about smog, coffee, etc., being bad for ya?”

• “Remember when “scientific” studies were considered trustworthy and reliable?? How can anyone tell the few pearls of knowledge in a world of pointless studies that flip-flop results and rehash incessantly??”

“Ladies, you knew there was a good reason for that double mochachino you have every morning and maybe that one at lunch too.”

At least they came back later and explained:

“As for your mochachino, no word yet on the benefits of whipped cream and chocolate sauce.”

But why even go there to begin with?

ABCNews.com, by comparison, emphasized this study showed “association, not causation.” Kudos to them.

For anyone – journalist, consumer, or researcher, for that matter – who doesn’t grasp the importance of using the correct language to describe observational studies, please see our primer on this topic.

Comments

Thanks for posting this. It’s refreshing to read.
It’s also worth noting that journalists are not entirely to blame. There is a disturbing cognitive slip that happens when this data is publicized. Take, for example, this quote from the National Cancer Institute:
“There have been no controlled clinical trials on the effect of regular physical activity on the risk of developing cancer. However, observational studies have examined the possible association between physical activity and a lower risk of developing colon or breast…”
Most journalists, and publicists, looking to gain social or economic capital for some cause, would blow this entire section completely out of proportion.
