Gulf Coast Guessing Game

Fresh wave of articles highlight uncertainty about lingering oil

More scientific criticism of a government report that attempted to calculate the amount of oil left in the Gulf of Mexico spurred a fresh wave of articles this week, which highlighted just how little is known about lingering threats to the ecosystem.

On August 4, the National Oceanic and Atmospheric Administration (NOAA) and the Department of the Interior released an “oil budget,” which found that about 74 percent of the spilled oil had been dealt with by capture, skimming, burning, evaporation, dissolution, and dispersion. The remainder, the report said, is onshore, still in the water, or buried at the bottom of the Gulf. The New York Times, which obtained an early copy of the report, ran a front-page story announcing the ostensible good news, but failed to quote any scientists other than the head of NOAA, Jane Lubchenco. A day later, however, the media was awash with reports of scientists expressing skepticism about the government’s findings.

This week, a handful of reports from independent scientists, as well as an oil-spill hearing on Capitol Hill, led to a new round of criticism, and the federal oil budget looks more dubious than ever. But, as many of the articles have pointed out, competing research from critics is also subject to a high degree of uncertainty, and easy answers may continue to elude authorities and the public for a long time.

“Researchers at the University of Georgia said Monday that more than three-quarters of the oil spilled in the Gulf of Mexico following the Deepwater Horizon drilling-rig explosion could still be in the Gulf threatening fisheries and marine life, disputing government statements that much of the oil had been safely dispersed,” The Wall Street Journal’s Robert Lee Hotz reported Tuesday. The article points out that the scientists from Georgia have been “at the forefront” of investigating underwater oil plumes created by the spill, but cautions, appropriately, that their study hasn’t been published or peer-reviewed, and that, “Both the UGA assessment and the federal calculations it contradicts are estimates based on incomplete information.”

Peter Spotts, at The Christian Science Monitor, had a good explanation of how the calculations by federal authorities and the University of Georgia team differed. But, as Mother Jones’s Kate Sheppard observed two weeks ago, the federal report “doesn’t include much in the way of specifics on the supporting data used to reach these conclusions.”

In a follow-up post on Wednesday, Sheppard reported that she had repeatedly asked NOAA to provide supporting data for the federal “oil budget,” requests that went unfulfilled. She noted that she was not alone in that regard, however, and posted an absolutely shocking chain of e-mail correspondence between a congressional investigator (who asked not to be named) and Michael Jarvis, a congressional affairs specialist at NOAA. The investigator spent over a week trying to get Jarvis to provide the specific data and calculations that NOAA and the Department of the Interior had used for the oil budget, and was eventually told they were unavailable.

The same investigator, it would appear, popped up in a good New Orleans Times-Picayune article on Tuesday, telling the reporter that the “level of obfuscation” surrounding the oil budget would never be accepted if the report were presented for publication in an academic journal. But it gets worse. Mother Jones’s Sheppard followed up again on Thursday, reporting that, “On Wednesday afternoon, during a telephone briefing on the spill for congressional staffers, NOAA scientists said the data might not be available for at least two months.”

In a hearing of the House Committee on Energy and Commerce on Thursday (which Sheppard did an excellent job covering via Twitter – hashtag: #BPhearing), Florida State University oceanographer Ian MacDonald complained, “It’s impossible for someone reading this report to check the numbers, and we have concerns about the numbers.” Likewise, Rep. Ed Markey (D-Mass.) said that his efforts to obtain details about the calculations have also been in vain. According to a story by The Huffington Post’s Dan Froomkin, Markey said that the federal report had given many people a sense of “false confidence” and that NOAA “shouldn’t have released” it if it wasn’t prepared to make its data and calculation methods public.

Meanwhile, scientists from the University of South Florida said Tuesday that they had found evidence that microscopic droplets of oil have collected on the floor of an undersea canyon off the Florida Panhandle that is an important spawning ground for several species of fish. The team discovered unhealthy phytoplankton—which play an important role as part of the foundation of the ocean food web—in the area, blaming oil and dispersants for the damage. The Associated Press, Reuters, McClatchy, and National Geographic covered the findings, noting that while the research results were preliminary and have not been peer-reviewed, they constitute yet another challenge to the rosy picture presented by the federal government.

In a related development, scientists from the Woods Hole Oceanographic Institution announced Thursday that they had discovered a 22-mile-long, 1.2-mile-wide, and 650-foot-high plume of microscopic oil compounds floating in the vicinity of the Macondo well, which was finally capped in mid-July. [Clarification, 8/20: It is important to note that the Woods Hole team made its measurements from June 19 to 28, before BP capped the well. And while the plume contained relatively high levels of toxins and didn’t appear to be biodegrading quickly, according to a news article by Science’s Richard Kerr, scientists both involved and uninvolved in the study have emphasized that the plume was not as massive as ones seen earlier.]

The health and safety of Gulf seafood was the other big topic of discussion at Thursday’s hearing on Capitol Hill. The upshot of the conversation wasn’t entirely clear at press time, but in Twitter coverage of the event, Markey berated the FDA for not monitoring seafood in oiled areas; Florida State University’s MacDonald said, “Survival of the Gulf seafood industry requires the survival of seafood;” and Dean Blanchard, the president of a Gulf seafood company, said his product undergoes an abundance of safety testing. More will undoubtedly roll in on Friday and over the weekend.

Getting all the information that the public needs won’t be easy for journalists, however. On Tuesday, The Daily Beast had a great scoop about fishermen in the Vessels of Opportunity program who were told to keep quiet after recently finding more tar balls and oil in the Gulf (one of the fishermen created a YouTube video showing his crew dipping a clean rag into the Gulf three-quarters of a mile off the Mississippi coast and pulling it out a couple of minutes later soaked in oil).

In a similar vein, Greenwire ran a troubling story on Thursday about how both federal authorities and BP have been hiring expert scientists to assess the oil spill’s impacts on the Gulf, but making them sign non-disclosure agreements in the process.

It’s a frustrating situation, to be sure, but reporters must do their best to continue chipping away at the uncertainty and obfuscation.


Curtis Brainard writes on science and environment reporting. Follow him on Twitter @cbrainard.
