We Need Investigative Science Journalism But Learning to Investigate Science is Hard–Part 2

Yesterday, I wrote about why many pieces about the need for investigative science journalism don’t acknowledge the factors behind its scarcity. Conversations about investigations in science journalism often seem to assume that reporters don’t see critiquing science as important, but journalists’ individual interests don’t set the tone for coverage all by themselves. In journalism, economics and politics shape our work. It’s basically impossible to disentangle why an article was written the way it was from simply reading the article.

Discussing the economics and politics that shape editorial decisions is a crucial part of addressing the relative absence of investigative science journalism, because for many reporters, quick hits – daily news stories, magazine blurbs, and blog posts – are our bread and butter (…when we get any bread at all…). Most casual readers encounter science via quick takes far more often than through investigative and/or longform articles.

There are definitely books and workshops out there to teach journalists how to chase down leads and verify what sources say, but every investigative reporter I’ve met says that much of the work is simply spending long hours going through interviews and documents, looking for patterns. It’s very, very hard to justify putting in Spotlight-esque hours without a steady salary.

Unfortunately, the economic realities do not prevent verification issues from rearing their heads in short news stories…

The Case of the Mystery Supplements

I learned that fact the hard way when my MIT Science Writing class assigned me to write a 300-word piece about a recent nutrition study on calcium supplements. The researchers had done a meta-analysis, where they reviewed a whole bunch of studies on calcium supplements’ effects, re-crunched some of the numbers, and concluded that calcium supplements don’t protect against osteoporosis. (Which is basically the main reason people are encouraged to take calcium supplements.)

The “competing interests” are pretty innocuous as such things go: one of the study’s co-authors had received honoraria – small amounts of money – from a handful of large pharmaceutical companies. That’s not unusual among M.D.s (a fact that most biology and medical journalists worth their salt know, even if they don’t discuss it as often as they should), and the co-authors say they didn’t use the honorarium funds for the study in question.

However, when I googled the companies listed – searching “[company name] + osteoporosis” – I found that all four (Merck, Amgen, Lilly, and Novartis) had prescription osteoporosis treatments either in development or in the early stages of marketing.

None of that proves that the researchers were biased or omitting key facts from their analysis. All four of those drug companies are humongous and have drugs both on the market and in development for a huge range of diseases. (And since osteoporosis is super-common, it makes sense that they would all want in on it.)

Still, it makes sense that companies peddling prescription anti-osteoporosis meds would have a vested interest in weakening the market for over-the-counter anti-osteoporosis supplements.

Figuring this out took me about five minutes – as a student with very little investigative journalism training and even less practice at it – so I figured that the consumer health journalists who cover topics like this for major newspapers would have caught it, too.

But I couldn’t find a single story about the 2015 calcium papers that mentioned the honoraria. (So I wrote one for MIT SciWrite’s website.)

In the researchers’ defense, the evidence of calcium supplements actually helping looked pretty weak to my had-biostats-class-but-still-non-expert eyes. And since calcium supplements are a $12 billion industry in the U.S., it stands to reason that the supplement companies would be very keen to make sure people think supplements work, regardless of whether they actually do.

Still, I was more than a little confused that none of the news coverage I had found mentioned the competing interests. Conventional journalism wisdom says that consumer health reporters at newspapers are more likely to have investigative chops than science-magazine-oriented types like yours truly (who sometimes get described as “cheerleaders” for science or medicine, often by journalists with newspaper health section backgrounds).

But in order for all of that coverage by allegedly non-science cheerleader-y outlets to miss the disclosure, one of these four things had to happen in each case:

(a) The health and general assignment reporters didn’t know where to look for the conflict-of-interest disclosure.

(b) They knew where to look but skipped it for this paper.

(c) They saw the conflict of interest but didn’t think it was important enough to put in the story or mention to their editor.

(d) They saw it and told the editor, but the editor decided it wasn’t worth including.

Option (a) would vindicate the elitist part of my brain, which loves to cackle, “See? Never send a general assignment schmuck to do a biomedical reporter’s work,” but it’s actually the least likely option. (b) is more likely, since staff reporters are often rushed and finding the disclosure did require some scrolling. (c) is also possible, since lots of doctors get honoraria, and mentioning something like that to an editor can put your story in danger of being killed. (d) seems likely to me, because the possibility of bias makes the story more confusing, less headline-able, and less likely to draw clicks and eyeballs.

The truth is, different outlets probably skipped over the competing interest for different reasons. However, seeing so many journalistic outlets fail to acknowledge an openly disclosed fact like that was eye-opening.

I spoke to my outside source for the MIT write-up in an empty conference room in between panels at the National Association of Science Writers 2015 meeting. All that morning, tensions between NASW’s key demographic groups – the journalists and the public information officers (who write PR for universities & other institutions) – had been simmering. Basically, the PIOs consider themselves science writers, full stop. But journalists have to maintain a thin-but-bright line between themselves and PR writers, which is very hard to do without inadvertently implying (or even outright saying) that institutional science writers are less noble/awesome/crazy/ethical/cool.

So when my source began telling me that bone doctors have serious issues with how the team had gone about publicizing their anti-calcium-supplement results, it was a lot to evaluate. Basically, he thought the main researchers had taken their case against calcium supplements to the press rather than going through medical organizations. (Their papers have actually come out in pretty darn reputable journals, but those journals are far from infallible.) They had also frequently implied that government agencies who recommend calcium supplements are acting under influence from the supplement industry, not on scientific evidence. Pretty serious accusations.

And since I don’t have investigative training, I had no idea what databases to check for hints of influence by supplement manufacturers on governments. (Still don’t, tbh…)

I tried a lean-on-experts maneuver and emailed a couple of potential third sources before scampering away to a PitchSlam. (It’s like speed-dating, but with reporters pitching story ideas to editors.) The potential third sources never answered my emails.

My assignment was to write a 300-word story. My prof gave me permission to run slightly over length and said it might be worth trying to publish the piece at a bigger outlet than our program’s website, but I didn’t want to. I really, really didn’t want to. Even though I had been stressing and gathering evidence for several days, I had neither proof that the study was biased nor reliable counter-evidence that supplements actually do help.

When people start reading a short piece about nutrition, they usually want to walk away with an answer to the “Is this thing healthy for me?” question. Without that answer, I didn’t see much point in trying to remind people that medicine is full of ambiguous results and conflicts of interest. In fact, I think it would have actually been unethical and misleading to publicize the story. (Feel free to disagree with me in the comments, but if there’s a chance that supplements are actually causing more harm than good, I don’t want to be the reporter who made it look like the researchers calling out the supplement guidelines are pharma shills… Maybe that’s just me.)

What Budding Investigative Science Reporters Are Up Against…

I’m sharing the calcium supplement backstory to make three points:

(1) Investigating medicine is time-consuming and confusing. The costs and benefits of many treatments are genuinely ambiguous. That doesn’t excuse science journalists for failing to investigate when lives are at stake, but it does rankle me that so many people in comment sections talk about investigative medical journalism like it’s easy. It’s not.

And if you want to see more investigative journalism on health, environment, and science topics, acknowledging the time and effort that reporters put in helps. Seriously. We appreciate supportive notes and emails, but if you really want to make a big impact, share well-done investigative reports on social media and vote with your dollars to make sure outlets supporting that work stay in business.

(2) Science journalism has genres, and genre audiences go in with expectations. Journalists have to write toward those expectations in order to stay in business. If you’ve ever skipped a movie because you weren’t in the mood for a superhero flick or a romcom or whatever, the same principle probably applies to your news reading habits.

I realize this point is a smidge audience-blamey, but seriously: Retraction Watch, a site which specializes in reporting on scientific misconduct and research papers that have been retracted, has 6,790 likes on Facebook. Not bad for a niche publication, largely followed by scientists and science journalists. However, “I Fucking Love Science” has 24,315,083 likes. IFLS-likers outnumber Retraction-Watch-likers by about 3,500:1.
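For the number-checkers in the audience, that ratio is just the two like counts quoted above divided against each other – a quick sketch in Python:

```python
# Facebook like counts as quoted in this post (2015-era snapshots).
ifls_likes = 24_315_083        # "I Fucking Love Science"
retraction_watch_likes = 6_790  # Retraction Watch

# How many IFLS likes per Retraction Watch like?
ratio = ifls_likes / retraction_watch_likes
print(f"{ratio:.0f} to 1")  # roughly 3,500:1
```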

I know that’s just one example, but given that, are people really surprised that there are more gigs and jobs for science writers working the “Science is awesome!” narrative than the “Science is fraught with politics and statistical sleights of hand” narrative?

When we talk about why there isn’t more investigative science journalism, we need to be clear about whether the problem is journalists not wanting to “narc” on the scientists they admire or whether large portions of the science news audience simply don’t want to hear it.

(And yes, I meant what I said about NYT’s science section; NYT includes lots of investigative environmental and health reporting, but I stopped reading the NYT science page regularly about halfway through undergrad precisely because it doesn’t address social influences on science. It felt like the NYT science section was talking to me like a kid, so I stopped going out of my way to read it.)

Making investigative science journalism more common will require getting in front of the people who don’t usually read the “Yay! Science” news genres. For the “yay, science!” fan base, we’ll need to make investigative pieces feel less like the news-equivalent of “yucky vegetables”. That means changing up science journalism genre norms, something that has to come from the editorial and audience ends of news making.

(3) Investigative findings often get cut or omitted to meet length constraints. As a professional writer, you write pieces at the length and in the style assigned. Investigations don’t fold down into 300-word blurbs very well.

However, short pieces are where the work is for writers in their early 20s. We may seem confused because we grew up as writers in an era when the blogosphere, the journalism sphere, and the PR sphere all feed into the same news feeds and hashtags, but please don’t mistake our attempts to make sense of the digital news ecosystem for evidence that we don’t care about calling out wrongdoing in science.

Back in 2014, when Retraction Watch was a much smaller blog, they decided to hire their first intern. I saw the listing and figured, “Ok, this blog isn’t well-known outside of science communication circles, so maybe a relatively clipless wonder like me will have a shot.” Within a few weeks of posting, the listing drew about 100 applicants.


100 applicants. For an internship at a relatively niche blog devoted to science admitting its fuck-ups.

Millennial science writers care about covering science’s weak points. If you haven’t seen us doing so, it may be because the opportunities for learning and practicing the investigative skillset are limited.

One more example before I go….

The reason I actually made this post a two-parter was because I wanted to share this example of a pitch that didn’t quite land. When I saw its abstract on the Science Translational Medicine section of EurekAlert, it immediately reminded me of the supplement story. See if you can spot why:

Researchers from University Hospital Muenster in Germany found evidence that a strategy which multiple drug companies are testing in clinical trials may actually worsen rheumatoid arthritis, depending on what types of immune system signals are in the mix.
The goal of the treatment was to help osteoporosis-prone bones regain density by blocking a bone-growth-controlling molecule called sclerostin. However, the anti-sclerostin treatment actually sped up knee joint deterioration in rheumatoid arthritis model mice.
Not a good sign.
The researchers traced this effect back to an immune-signaling chemical called TNF-alpha. Since TNF-alpha gets released when the immune system fights bacterial and viral infections, theoretically getting a cold during treatment could dramatically alter the effects of the in-trial anti-sclerostin treatments. It also means that the treatment could hurt people who have chronic inflammation. "Humans are far more mixed [than lab mice]," the study's co-author Thomas Pap told me. "We need to be sure what role the TNF plays in the individual, because it may be very different."
TNF-alpha is rarely measured in clinical settings, so there's no large-scale epidemiology data (to my knowledge) that could predict how prevalent these bone deterioration side effects would be. Pap stressed that the drug companies need to be very careful going forward but also pointed out that his collaborators at Novartis were very open and receptive to the finding.
(The Novartis co-authors simply provided necessary antibodies and lab mice & weren't involved in carrying out the study. Novartis does have an in-house anti-sclerostin antibody project, but Pap says he doesn't know how these results have affected its status.)
Amgen's anti-sclerostin drug has done well in Phase III clinical trials and is expected to debut on the market in 2017 (http://www.pmlive.com/pharma_news/amgen_says_romosozumab_on_track_for_2016_filing_938606). However, joint deterioration is the sort of side effect that could very easily slip through the cracks in short clinical trials.
The result is a great example of *why* it's so difficult to predict side effects in clinical trials, and I'd like to get bioethics or drug development experts to weigh in on what drug companies should do/actually do when a result points to a possible side effect for a sizeable minority of the patient population.

Yup. The treatments being developed by the companies that gave honoraria to the calcium supplement researchers may have a widespread but very hard-to-predict side effect.

As far as immune signal molecules go, TNF-alpha is one of the most common and well-studied ones. I also think this study opens up an interesting question: how many unexplained side effects are due to variation in inflammatory signals?

But y’know, a lot of outlets reserve one-study stories for their staff writers, and this story is pretty far on the clinical insider end of the spectrum, anyway. Besides, plenty of biology studies turn out to be wrong. The anti-sclerostin treatments could be life-savers for a lot of people. Teasing out what this one finding might mean for a whole family of osteoporosis treatments would require an investigation.

But I’m a baby freelancer. To make rent, I gotta get back to looking for cool stories about new dinosaurs.