
29 January 2011

Why nearly everything Newsweek writes about medicine is wrong

Newsweek has an article about Dr John Ioannidis and his work discrediting many medical studies. Incidentally, The Atlantic wrote about him and his work back in November, as did The New Yorker in December. All of these articles highlight Ioannidis' findings that many studies are flawed, but take different approaches, and draw different conclusions from his work. Newsweek's take is by far the worst and most harmful.
In any case Newsweek informs us:

In just the last two months, two pillars of preventive medicine fell. A major study concluded there’s no good evidence that statins (drugs like Lipitor and Crestor) help people with no history of heart disease. The study, by the Cochrane Collaboration, a global consortium of biomedical experts, was based on an evaluation of 14 individual trials with 34,272 patients.

Combining this "fact" with Ioannidis' findings they go on to boldly assert: "Even a cursory glance at medical journals shows that once heralded studies keep falling by the wayside."
Let's take a look at their statin example, since it characterizes the rest of the piece nicely. Newsweek tells us what the Cochrane Collaboration is, yet omits that it does not conduct clinical studies; it conducts reviews, often in the form of meta-analyses. There's a qualitative difference between a meta-analysis and a study: studies examine real patients, meta-analyses examine data. Here is Dr Mark Crislip's explanation of why this difference matters:

The studies included in a meta-analysis are often of suboptimal quality. Many [meta-analyses] spend time bemoaning the lack of quality studies they are about to stuff into their study grinder. Then, despite knowing that the input is poor quality, they go ahead and make a sausage. The theory, as I said last week, is that if you collect many individual cow pies into one big pile, the manure transmogrifies into gold. I still think of it as a case of GIGO: Garbage In, Garbage Out.

It has always been my understanding that a meta-analysis was used in lieu of a quality clinical trial. Once you had a few high quality studies, you could ignore the conclusions of a meta-analysis.

Not any longer, and not if you're Newsweek. Instead they prefer to ignore these substantive differences and merely inform us that one study has invalidated another. In fact they go so far as to make it sound as though the Cochrane review were an actual clinical study by telling us how many patients it "evaluated." In case you're wondering whether meta-analyses can be used as stand-ins for clinical trials, the NEJM published an article that explored just that:

We identified 12 large randomized, controlled trials and 19 meta-analyses addressing the same questions. For a total of 40 primary and secondary outcomes, agreement between the meta-analyses and the large clinical trials was only fair (kappa = 0.35; 95 percent confidence interval, 0.06 to 0.64). The positive predictive value of the meta-analyses was 68 percent, and the negative predictive value 67 percent. However, the difference in point estimates between the randomized trials and the meta-analyses was statistically significant for only 5 of the 40 comparisons (12 percent). Furthermore, in each case of disagreement a statistically significant effect of treatment was found by one method, whereas no statistically significant effect was found by the other.
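For readers unfamiliar with the statistics in that excerpt, here is a minimal sketch of how positive predictive value, negative predictive value, and Cohen's kappa are computed from a 2x2 agreement table, treating the large trial as the reference standard and the meta-analysis as the "test." The counts below are made up for illustration and are not the NEJM study's actual data:

```python
def agreement_stats(tp, fp, fn, tn):
    """Compute PPV, NPV, and Cohen's kappa from a 2x2 agreement table.

    tp: both methods find a significant effect
    fp: meta-analysis positive, trial negative
    fn: meta-analysis negative, trial positive
    tn: both methods find no significant effect
    """
    n = tp + fp + fn + tn
    ppv = tp / (tp + fp)   # fraction of meta-analysis positives the trial confirms
    npv = tn / (tn + fn)   # fraction of meta-analysis negatives the trial confirms
    p_observed = (tp + tn) / n
    # chance agreement, from the marginal frequencies of each method
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return ppv, npv, kappa

# Hypothetical counts for 40 outcomes (not the NEJM data):
ppv, npv, kappa = agreement_stats(tp=13, fp=6, fn=6, tn=15)
```

A kappa of 0.35, as the NEJM reports, falls in the conventional "fair" agreement band, which is the point of the quote: meta-analyses and large trials asking the same question agree only modestly better than chance.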

Beyond this singular lack of understanding, Newsweek devotes many paragraphs to Ioannidis' work on statistical problems, yet only half of one sentence to the fact that his own statistical methods are controversial. In fact the sentence calling his methods controversial is buttressed by Newsweek telling us of Ioannidis' childhood genius and that his mathematical arguments are "abstruse"; evidently this complexity and his childhood genius were sufficient to convince the author of his accuracy and ought to be enough for the readers as well. (Handy rule of thumb: when you encounter a cute vignette from someone's childhood in a piece that isn't a profile, the author is using it to paper over a logical deficiency.)

Newsweek also spends a paragraph on Ioannidis' work discrediting statistical techniques (formerly) used in genetic attribution of disease and fails to tell us how that relates, at all, to medical studies. They do mention that it matters to the results genotyping companies give their customers, but how it relates to the article's thesis is ignored. This is the same as my telling you, "There are many problems with Mitsubishi automobiles," then spending a paragraph on the dismal quality control at Mitsubishi's television factories. It's the same aspersion-by-association-and-implication nonsense that I have written about before.

Perhaps the worst offense of all is that the article's central thesis is, "Everything you hear about medicine is wrong." Except that all of their reasons for discarding previously held findings are... new findings. Here they are telling us about vitamin E: "Two 1993 studies concluded that vitamin E prevents cardiovascular disease; that claim was overturned by more rigorous experiments, in 1996 and 2000." This is like a conspiracy theorist telling you that you can't trust anything a government official says, then quoting Robert Gibbs as evidence for his opinion.

Don't get me wrong: there are plenty of problems with how modern science and medicine interact. Near the top of that list, though, are articles just like this one. Consider that Newsweek has published literally dozens of articles about medical studies that do not meet the article's own central criterion:

[M]edical wisdom that has stood the test of time—and large, randomized, controlled trials—is more likely to be right than the latest news flash about a single food or drug.

That Newsweek has the gall to publish a poorly reasoned article decrying the hype over recently released medical results without addressing, or even mentioning, their own habit of doing precisely that is a problem. A larger problem, though, is that the editors of Newsweek are blind to the hypocrisy of hyping Ioannidis' findings in an article about how often such findings turn out to be wrong.