As noted in previous posts, this analysis stems from a paper I wrote for a graduate science communication course. Having to examine the literature and take a stance, I concluded that embargoes of science news and the use of the Ingelfinger rule place constraints on disseminating science and health information to the public.

Embargoes and the Ingelfinger rule delay the public's access to scientific information needed for personal and democratic decision-making. Some journals even withhold an article's release in order to compete with other journals (Altman, 1996).

When a specific body of work is accepted by a journal for publication, the information may fall under the realm of the embargo system. That is, some journals keep hush about upcoming articles and dictate dates when journalists can release their coverage of a study in exchange for early access. Kiernan (1997) writes that embargoes and the Ingelfinger rule not only hinder the public from receiving information, but they also limit research from reaching policy makers who enact laws and regulations that shape policies on health, science and the environment.

Of course, science news leads obtained beyond the embargo system aren’t constrained by these rules, but they’re becoming increasingly rare with the growth of online journalism and news organizations’ shrinking budgets. Indeed, true leads may be hard to come by for some journalists who have a slew of other responsibilities.

Even then, not everyone enforces the same rules. Science and medical journals are inconsistent in their treatment of violations of the embargo system and the Ingelfinger rule, leaving journalists and scientists unnecessarily cautious about sharing scientific information. Ultimately, this freezes the flow of information, as scientists may avoid the press in order to ensure getting published. Adding to the problem, Altman (1996) argues the Ingelfinger rule “reinforces the medical profession’s long-standing distrust of journalism.” Although six or seven studies were pulled each year during Franz Ingelfinger’s time at The New England Journal of Medicine, the number of scientists found in violation of the Ingelfinger rule remains unclear (Culliton, 1972). A similar situation has evolved for embargoes: treatment of violations remained inconsistent and poorly documented until recent years (the topic now takes center stage on some blogs).

In cases in which a journal decides not to enforce the Ingelfinger rule because of the perceived immediacy of findings, scientists may publicize their results and share them with journalists before they are published (Kassirer and Angell, 1994). Again, much of this is at the discretion of scientists and journal editors — not journalists.

Ivan Oransky, executive editor at Reuters Health, says punishing embargo violations has become increasingly inconsistent in recent years. The movement from print-based empires to online cultures has made it difficult to track and punish embargo breaks. In some cases, whether the embargo was broken by a small or large news organization will factor in, while in others, it depends on whether other publications follow suit. Such inconsistency is frustrating to science journalists who do their best to play by the rules.

At times, the journal cycle increases the amount of time it takes for medical findings to reach patients. Photo by Dvortigirl/Flickr.com

Former Scientific American editor in chief and journalism instructor John Rennie also says inconsistent enforcement of embargoes hampers journalists, who are left unsure of the consequences. In most cases, journals will temporarily revoke a journalist’s access to embargoed content if the person violates an embargo.

In addition, some argue the embargo system and the Ingelfinger rule give rise to misleading portrayals of science by focusing on individual studies rather than bodies of research or scientific trends. Embargoes negatively shape society’s ideas of science and research, Kiernan (1997) writes, distorting how the public perceives science as a process. Rather than acknowledging the amount of time dedicated to a specific study, embargoes emphasize the “newness” of the results. In other words, the public is often left without any sense of how long a given project took or whether it experienced setbacks during the experimental process.

Schuchman and Wilkes (1997) suggest the public might learn more about the research process if journalists focused on “ongoing stories,” which would require them to follow up on topics previously covered. This is especially important if only preliminary findings were available at the time the topic received coverage. Delving deeper into research methodology would also help the public understand the strength of certain studies over others. They also write that medical journals place too much emphasis on single articles through the embargo system. Press releases touting a single study’s results contribute to this problem. Yet the responsibility of ensuring quality coverage lies with the journalist, not the individual drafting the press release. Although journalists recognize the artificial “newness” of science news being published, it is still coveted and emphasized among science journalists and their editors.

As a result, Rennie believes science coverage hypes the newness of ideas to the detriment of other facets the public may deem relevant or fascinating. One reason the public may be receiving mixed messages about the scientific process is journalists’ neglect of retracted research, compounded by journals’ reluctance to publish negative results. Although retractions haven’t garnered considerable attention in the science communication literature, one study by Roy Rada (2007) found that only three of 50 retracted papers and their press releases received attention from news outlets. Rada adds that most journals have inconsistent policies about what ought to be done after studies are retracted: some announce the retraction with a press release, while others print only a small notice in the following issue. The fact that all three covered retractions involved fraud and scientific misconduct suggests what draws journalists’ attention: deception. Studies retracted because of less controversial errors went uncovered, even when their corrections carried information of importance to public health.

This trend, albeit drawn from a small data set, reflects journalists’ failure to follow up on research findings and to present science as a process rather than as fragments of breakthroughs.

In my final post, we’ll look at alternatives to embargoes and Ingelfinger — or at least steps to guide science journalists in the right direction.