I’m happy to announce the release of paprica v0.4.0. This release adds a number of new features to our pipeline for evaluating microbial community and metabolic structure. These include:

NCBI taxonomy information for each point of placement on the reference tree, including internal nodes.

Inclusion of the domain Eukarya. This was a bit tricky and requires some further explanation.

The distribution of metabolic pathways, predicted during the creation of the paprica Eukarya database, across transcriptomes in the MMETSP.

Eukaryotic genomes are a totally different beast than their archaeal and bacterial counterparts. First and foremost, they are massive. Because of this, there aren’t very many completed eukaryotic genomes out there, particularly for single-celled eukaryotes. While a single investigator can now sequence, assemble, and annotate a bacterial or archaeal genome in very little time, eukaryotic genomes still require major efforts by consortia and lots of $$.

One way to get around this scale problem is to focus on eukaryotic transcriptomes instead of genomes. Because much of the eukaryotic genome is noncoding, this greatly reduces sequencing volume. Since there is no such thing as a contiguous transcriptome, this approach also implies that no assembly (beyond open reading frames) will be attempted. The Moore Foundation-funded Marine Microbial Eukaryotic Transcriptome Sequencing Project (MMETSP) was an initial effort to use this approach to address the problem of unknown eukaryotic genetic diversity. The MMETSP sequenced transcriptomes from several hundred different strains. The taxonomic breadth of the strains sequenced is pretty good, even if (predictably) the taxonomic resolution is not. Thus, as for the archaea, the phylogenetic tree and metabolic inferences should be treated with caution. For eukaryotes there are the additional caveats that 1) not all genes coded in a genome will be represented in the transcriptome, 2) the database contains only strains from the marine environment, and 3) eukaryotic 18S trees are kind of messy. Considerable effort went into making a decent tree, but you’ve been warned.

Because the underlying data are in a different format, not all genome parameters are calculated for the eukaryotes. 18S gene copy number is not determined (and thus community and metabolic structure are not normalized), and the phi parameter, GC content, and other such parameters are also not calculated. However, eukaryotic community structure is evaluated and metabolic structure inferred in the same way as for the domains bacteria and archaea:
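As a sketch, a run against the new database looks like the bacterial and archaeal cases with the domain argument swapped; the wrapper-script name and arguments below follow the v0.4-era conventions and may differ in later releases (the query name `test` is a placeholder for your own read file):

```shell
# Hypothetical paprica invocations; check the README for your installed version.
./paprica-run.sh test bacteria   # community + metabolic structure, as before
./paprica-run.sh test archaea
./paprica-run.sh test eukarya    # new in v0.4: placement on the 18S reference tree
```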

Lamont graduate student Kira Olsen and Meredith Nettles report that glacial earthquakes in Greenland, a measure of ice loss from the leading edges of glaciers, increased in frequency by a factor of four over the period 1992-2013.

Suzana Camargo comments on a report that during times of frequent Atlantic hurricanes, climate conditions tend to weaken storms that approach the U.S. east coast, whereas during times of less frequent tropical storms, major hurricanes approaching the U.S. are likely to intensify before making landfall.

Scientists are expected to announce that 2016 was the hottest year on Earth since record-keeping began in 1880 — news that will test national, state and economic leadership on climate change. “The climate system gives not a hoot about politicians in Washington denying the reality of human-driven climate change — but it does respond to decisions on energy, fuels and the environment those politicians make,” Lamont's Richard Seager said.

Greenland's Petermann Ice Shelf has lost huge ice islands since 2010. The question is no longer whether it is changing — it’s how fast it could give up still more ice to the seas. Chris Mooney talks with scientists, including Lamont's Marco Tedesco, about what they're seeing.

The Lamont-operated R/V Marcus G. Langseth is in Chile with teams of scientists studying the region's offshore seismicity. El Mercurio wrote about the work as a magnitude 7.7 earthquake struck off the Chilean coast. The article is in Spanish.

Researchers including Lamont's Paul Richards say a 2010 event previously thought to be a small nuclear test in North Korea was actually just a small earthquake – a finding that could have implications for monitoring the regime's nuclear tests.

When you rub your hands together to warm them, the friction creates heat. The same thing happens during earthquakes, only on a much larger scale: When a fault slips, the temperature can spike by hundreds of degrees, high enough to alter organic compounds in the rocks and leave a signature. A team of scientists at Columbia University’s Lamont-Doherty Earth Observatory has been developing methods to use those organic signatures to reconstruct past earthquakes and explore where those earthquakes started and stopped and how they moved through the fault zone. The information could eventually help scientists better understand what controls earthquakes.

Lamont geophysicist Heather Savage and geochemist Pratigya Polissar began developing the methods about eight years ago, building on techniques used by the oil industry. Their unique pairing of two fields – rock mechanics and organic geochemistry – made possible innovations that are changing how we look at earthquakes.

The process starts in the field, along a fault where scientists either chip off or drill samples from inside the fault zone. When sediments in a fault zone are heated by the friction of an earthquake, that short but powerful burst of heat alters the chemical composition of organic material inside the rock. (The same process over long periods of time creates oil and gas.) Scientists can examine the organic compounds in those samples and compare the ratio of stable molecules to unstable molecules to measure their thermal maturity and determine how hot each sample became.

“If even a tiny structure within a fault has had an earthquake, we can actually see the difference between how hot that piece of the fault got versus everything outside of it,” Savage said. “What we want to figure out is where the earthquakes in this big fault zone were actually happening. Do they all happen to one side? Are they distributed throughout? Are they all clustered on the weakest material within the fault zone?”

“What this does is give us a picture, almost like a heat map, of the fault itself, and the hottest places are where the earthquakes happened,” Savage said.

Heather Savage’s team sampled all along the Muddy Mountain thrust in Nevada and made some surprising discoveries. Photo: Heather Savage

When temperatures are high enough, rock can melt, creating glass-like pseudotachylytes. Geologists have used these melted rock remnants for several years, but they are rarely found.

Savage, Polissar, and their team are looking closer, to the molecular level, where they can measure the thermal maturity of common organic compounds to determine how hot the sample became. They often test for methylphenanthrenes, organic molecules that are fairly common in faults within sedimentary rocks between 1 and 5 kilometers below ground. In deeper faults, some 10-14 kilometers down, the scientists can look for diamondoids, which are among the most thermally stable organic compounds.
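One widely used thermal-maturity proxy built on methylphenanthrenes is the methylphenanthrene index (MPI-1) of Radke and Welte, a ratio of the more thermally stable methyl isomers to phenanthrene and the less stable isomers. The sketch below illustrates the idea only; the peak areas are made-up values, not data from the Lamont study, and MPI-1 is offered as a representative proxy rather than the team's exact method.

```python
# Illustrative thermal-maturity calculation: MPI-1 of Radke & Welte (1983).
# Higher MPI-1 means the organic matter experienced more heating.

def mpi1(P, MP1, MP2, MP3, MP9):
    """Methylphenanthrene index from chromatogram peak areas.

    P   : phenanthrene
    MPn : n-methylphenanthrene isomers
    """
    return 1.5 * (MP2 + MP3) / (P + MP1 + MP9)

# Hypothetical peak areas for a sample inside a slip surface vs. wall rock.
inside = mpi1(P=80.0, MP1=40.0, MP2=70.0, MP3=55.0, MP9=35.0)
outside = mpi1(P=120.0, MP1=60.0, MP2=45.0, MP3=30.0, MP9=50.0)

print(f"MPI-1 inside slip zone: {inside:.2f}")
print(f"MPI-1 in wall rock:     {outside:.2f}")
# An elevated MPI-1 inside the fault relative to the surrounding rock is the
# kind of contrast that flags a short, intense co-seismic heating event.
```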

To put their molecular data into context, the scientists also need to understand how rocks in the fault react to heat and pressure. In Lamont’s Rock and Ice Mechanics Lab, Savage’s team can test rock samples under a wide range of high pressures and temperatures. From their experiments, they can develop models that show how much shear stress and displacement are required to generate specific levels of heat in specific types of rock, and then how that heat will decay through diffusion.

Using these models, the scientists can then look at the geochemical analysis of their samples, determine the temperatures the compounds were exposed to in the past, and estimate the friction from the earthquake and how far the fault slipped.

For example, when the team tested samples from the Pasagshak Point megathrust on Alaska’s Kodiak Island, they measured the ratio of thermally stable diamondoids to thermally unstable alkanes and determined that the temperature during a past earthquake would have risen between 840°C and 1170°C above the normal temperature of the surrounding rock. From that temperature rise, they were able to estimate that the earthquake’s frictional energy would have been 105-227 megajoules per square meter, likely a magnitude 7 or 8 earthquake. Using their experimental friction measurements, they could then estimate that the fault must have slipped 1-8 meters.
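The chain of inference above can be checked with back-of-envelope physics: if the heat stays in a thin slip zone, frictional energy per unit fault area is E = ρ·c·w·ΔT, and slip is d = E/τ for a mean frictional shear stress τ. The density, heat capacity, slip-zone thickness, and stress below are generic assumed values, not numbers from the study.

```python
# Adiabatic slip-zone heating sketch: E = rho * c * w * dT, slip d = E / tau.
# All parameter values are assumptions for illustration.

RHO = 2700.0   # rock density, kg/m^3 (assumed)
C = 1000.0     # specific heat capacity, J/(kg K) (assumed)

def energy_per_area(dT, width):
    """Frictional energy per unit fault area (J/m^2) to raise a slip zone
    of thickness `width` (m) by `dT` (K), ignoring heat diffusion."""
    return RHO * C * width * dT

# A slip zone a few centimeters thick heated by ~840-1170 K lands in the
# reported 105-227 MJ/m^2 range:
E_low = energy_per_area(dT=840.0, width=0.046)    # ~1.0e8 J/m^2
E_high = energy_per_area(dT=1170.0, width=0.072)  # ~2.3e8 J/m^2

# Dividing by an assumed mean shear stress gives a slip estimate:
tau = 50e6  # Pa, order-of-magnitude frictional stress at seismogenic depth
print(f"E: {E_low/1e6:.0f}-{E_high/1e6:.0f} MJ/m^2, "
      f"slip: {E_low/tau:.1f}-{E_high/tau:.1f} m")
```

With these assumed values the implied slip falls inside the 1-8 m range quoted above; in practice the team constrains τ with laboratory friction experiments rather than assuming it.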

At the American Geophysical Union Fall Meeting today in San Francisco, Genevieve Coffey, a graduate student in Savage’s team at Lamont, presented early results from their highest-density testing yet, involving samples taken in transects along the Muddy Mountain thrust in Nevada. One surprise was that the places where one might expect to see high temperatures because of the local structures in the rock were not necessarily the locations where they found them, Coffey said. “Structural variability along a fault does not necessarily indicate that slip has occurred along that section,” she said.

Savage’s team is working on similar experiments at the San Andreas fault and the Japan trench, where the Tōhoku earthquake began, and they are working with colleagues on techniques to date the earthquakes.

“The important step for us is to determine how each of those compounds reacts to time and temperature,” Savage said. “That’s going to tell us about the physics of the earthquakes in that fault, which in the long run could lead to a better understanding of earthquake hazards.”

This map of large earthquakes outlines some of the most active tectonic plate boundaries. Slow-slip earthquakes create an ideal lab for investigating fault behavior along the shallow portion of subduction zones. Map: USGS

Off the coast of New Zealand, there is an area where earthquakes can happen in slow-motion as two tectonic plates grind past one another. The Pacific plate is moving under New Zealand at about 5 centimeters per year there, pulling down the northern end of the island as it moves. Every 14 months or so, the interface slowly slips, releasing the stress, and the land comes back up.

Unlike typical earthquakes that rupture over seconds, these slow-slip events take more than a week, creating an ideal lab for studying fault behavior along the shallow portion of a subduction zone.

In 2015, Spahr Webb, the Jerome M. Paros Lamont Research Professor of Observational Physics at Lamont-Doherty Earth Observatory, and an international team of colleagues became the first to capture these slow-slip earthquakes in progress using instruments deployed under the sea. The data they collected from the New Zealand site, published this year by lead author Laura Wallace of the University of Texas, will help scientists better understand earthquake risks, particularly at trenches, the seismically active interfaces between tectonic plates where one plate dives under another. Members of the team are discussing their work this week at the American Geophysical Union (AGU) Fall Meeting.

“We don’t yet understand the stickiness of the interface between the two plates, and that is partly what determines how big an earthquake you can have,” Webb said. “In particular, we care about the stickiness near the trench, because when you have a lot of motion near a trench, you can generate big tsunamis.”

Previously, scientists thought that the soft sediments piled up near trenches were usually not strong enough to support an earthquake and that they would dampen the slip, Webb said. “We’ve recently seen a lot of big tsunamis where there has been large slip right close to the trench,” he said.

One reason the 2011 Tōhoku earthquake in Japan was so devastating was that part of the interface very close to the trench moved a large distance, around 50 meters, pushing the water with it, Webb said. While the main part of the Tōhoku earthquake involved uplift of only a few meters, the part near the trench doubled the size of the tsunami, leading to waves almost 40 meters high at some points along the coast.

To be able to anticipate tsunami-producing earthquakes and more accurately assess regional risks, scientists are studying why some areas of trenches have these slow-slip events, why others continuously creep, and why still others lock up and build strain that eventually erupts as a tsunami-generating earthquake.

Webb has his sights next on the Aleutian Trench, just off Kodiak Island, Alaska. It is one of the most seismically active parts of the world. A large tsunami-generating earthquake there could wreak havoc not only in Alaska but along the west coast of North America and as far away as Hawaii and Japan, as the Good Friday earthquake did in 1964.

Lamont scientists, including Donna Shillington and Geoffrey Abers, who are also presenting their work this week at AGU, have spent years studying the structure of the Aleutian Trench and what happens as the Pacific plate dives beneath the North American plate. Webb and a large group of collaborators now want to find out where sections of the trench are sliding and where sections are locking to help understand what determines where it locks. Finding slow-slip earthquakes could help reveal some of those secrets.

To study the New Zealand slow-slip event, Webb and his colleagues installed an array of 24 absolute pressure gauges and 15 ocean-bottom seismometers directly above the Hikurangi Trough, where two plates converge. Absolute pressure gauges deployed on the seafloor continuously record changes in the pressure of the water above. If the seafloor rises, pressure decreases; if the seafloor moves downward, pressure increases due to the increasing water depth. When the slow-slip event began, the instruments recorded how the seafloor moved.
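The pressure-to-uplift conversion described above is hydrostatic: a seafloor that rises sheds water load, so Δp = −ρ·g·Δh. The sketch below shows the arithmetic with illustrative numbers, not measurements from the Hikurangi deployment.

```python
# Hydrostatic conversion for an absolute pressure gauge on the seafloor.
# Values are assumed for illustration.

RHO_W = 1030.0  # seawater density, kg/m^3 (assumed)
G = 9.81        # gravitational acceleration, m/s^2

def uplift_from_pressure(dP_pa):
    """Vertical seafloor motion (m, positive up) implied by a pressure
    change dP (Pa); uplift reduces the water column, so pressure drops."""
    return -dP_pa / (RHO_W * G)

# A hypothetical ~200 Pa pressure drop corresponds to about 2 cm of uplift,
# roughly the scale of deformation slow-slip events can produce.
dP = -202.0  # Pa, hypothetical observed change during a slow-slip event
print(f"inferred uplift: {uplift_from_pressure(dP) * 100:.1f} cm")
```

Centimeter-scale signals like this are why the gauges must record continuously and why tidal and oceanographic pressure variations have to be modeled and removed before the tectonic signal emerges.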

The scientists found that parts of the Hikurangi interface slipped and others didn’t during the slow-slip event. “It may be that much of the interface slips in these events but you have a few places that are locked, and those finally break and create earthquakes and tsunamis that cause damage,” Webb said.

Most of the instruments used in the New Zealand study were built at Lamont in the OBS (ocean-bottom seismometer) lab started by Webb.

In Alaska, Webb and his collaborators have proposed an experiment that would again use a large number of Lamont-built ocean-bottom seismometers and pressure gauges, this time to collect data near Kodiak Island. Alaska is a special challenge for seafloor measurements. The ocean is quite shallow south of Alaska before deepening near the Aleutian Trench, and seismic instruments on the seafloor can be moved by strong currents or damaged by bottom trawling. Webb and the team in the OBS lab at Lamont developed a solution: they built heavy metal shields that sink to the seafloor with the seismometers to protect them.

Once data from the instruments are collected, they will be made publicly available so seismologists across the country can begin to analyze the records in search of clues to the area’s earthquake behavior.

“If we start seeing precursors based on the off-shore data, then maybe we’ll also get some predictive ability,” Webb said. “The hope is if you have better off-shore measurements, you’ll start to understand things better, and maybe there is some sign of motion happening before the earthquake that will provide some warning.”

The Arctic is warming twice as fast as the rest of the planet, and scientists are seeing the effects across ice and ecosystems. The average annual air temperature over Arctic land is now 3.5°C (6.5°F) warmer than it was in 1900, Greenland is experiencing longer melting seasons, and this year’s spring snow cover extent set a record low in the North American Arctic, according to the National Oceanic and Atmospheric Administration’s newly released 2016 Arctic Report Card.

Understanding these changes is becoming increasingly important for economic planning and national security in many countries, as well as for the daily lives of people whose communities are directly affected.

Marco Tedesco, a glaciologist at Columbia University’s Lamont-Doherty Earth Observatory and the lead author of the Arctic Report Card chapter on Greenland, described some of the changes he has been seeing in Greenland during a news conference today for the report’s release during the American Geophysical Union Fall Meeting in San Francisco.

The Greenland Ice Sheet, which holds enough water to raise global sea level by an estimated 6 meters (20 feet), has been losing mass since at least 2002, when satellite measurements of the ice sheet began. Its melting season has also been starting earlier and lasting longer. In 2016, the start of Greenland’s annual summer surface melt was the second earliest in the 37 years that satellites have been observing the ice sheet, Tedesco said.

“The melting season is getting longer, invading spring and fall, and it’s preconditioning the next melting season,” Tedesco said.

In Northeast Greenland, the melting season lasted 30 to 40 days longer this year than the 1979-2016 average, and it was 15 to 20 days longer along the west coast, he said. That longer melting period can precondition future melting, creating what Tedesco has referred to as “melting cannibalism.” Melting changes the properties of snow and ice when it refreezes, creating larger grains that “darken” the ice surface, allowing it to absorb more radiation when exposed and exacerbating future melting. (Tedesco discusses some of the feedback mechanisms driving melting in Greenland in the video above.)

Warming air is also creating problems in other parts of the Arctic by thawing normally frozen permafrost. That can create structural instabilities for roads and buildings, and it can release carbon dioxide and methane, powerful greenhouse gases. “We estimate the tundra is releasing net carbon to the atmosphere,” Tedesco said.

Permafrost areas hold large stores of organic carbon, equivalent to about twice the amount currently in the atmosphere, according to the Report Card. In a study published two weeks ago, Lamont post-doctoral scientist Francesco Muschitiello and colleagues showed what warming has done to permafrost carbon stores in the past. They presented the first direct physical evidence of a massive release of organic carbon from permafrost during a warming spike at the end of the last glacial period, documenting how Siberian soil once locked in permafrost was carried into the Arctic Ocean at a rate about seven times higher than in recent times.

Greenland’s ice can “darken” in ways we can see and ways we can’t. Photo: Marco Tedesco

The Arctic Report Card also describes how Arctic sea ice has been getting younger and thinner, leaving patches of dark open water that can absorb more solar radiation and warm the water. This summer, Arctic sea ice fell to its second-lowest minimum extent since satellite record keeping began in 1979, and in November it registered a record low for the month, according to the Report Card.

In the Native Village of Kotzebue, just north of the Arctic Circle in Alaska, the predominantly Iñupiaq community relies on the seasonal sea ice and the marine mammals that live on the ice pack, and its members have seen changes in the ice in recent years, including the sea ice forming later and breaking up earlier in the year. The community is now partnering with Lamont oceanographer Christopher Zappa on a research project that will put drones to work studying the changing sea ice and ocean ecosystem. Zappa’s team, with support from The Gordon and Betty Moore Foundation, will be using airborne remote sensing technology to study the spring melt process, the ice’s stability, ocean circulation in the area, and marine biological processes.

Lamont scientists are currently studying several areas of the Arctic to increase our understanding of the changes that are underway now and those ahead.

In Alaska’s interior, Lamont ecologist Natalie Boelman and plant physiologist Kevin Griffin have been working near the tree line, where boreal forest ends and tundra begins. They are part of a team that is combining satellite observations with fieldwork to better understand the effects warming will have on the Arctic region’s vegetation and trees, including risks from insect infestations and wildfire.

Joerg Schaefer, Nicolas Young, and William D’Andrea have been analyzing geochemical changes in Greenland’s bedrock and evidence in sediment from glacier-fed lakes to understand how the region’s ice cover responded to warming in the past. Schaefer recently found evidence in bedrock drilled from two miles below the summit of the Greenland Ice Sheet that suggests the ice sheet might not be as stable as believed.

Tedesco and Lamont geophysicists Indrani Das and Robin Bell also led a workshop earlier this year with scientists from around the world to discuss ways to improve understanding of the fundamental processes controlling the surface mass balance of the Greenland Ice Sheet. Tedesco will be presenting details from the workshop at the American Geophysical Union Fall Meeting later this week.

“Rarely have we seen the Arctic show a clearer, stronger or more pronounced signal of persistent warming and its cascading effects on the environment than this year,” said Jeremy Mathis, director of NOAA’s Arctic Research Program. “While the science is becoming clearer, we need to improve and extend sustained observations of the Arctic that can inform sound decisions on environmental health and food security, as well as emerging opportunities for commerce.”

Scientists issue their 2016 Arctic Report Card finding that the Arctic as a whole is warming at least twice as fast as the rest of the planet, and it is getting progressively worse. The cause of the warming is in part due to feedback loops, as Lamont's Marco Tedesco explains.

Most research databases are narrowly focused. They might contain only seismic data from earthquakes, for example, or chemical data from volcanic rocks. The Interdisciplinary Earth Data Alliance (IEDA) set out to create a different kind of research experience, and the result is fueling groundbreaking multi-disciplinary discoveries worldwide.

Created and managed by scientists at Lamont-Doherty Earth Observatory, IEDA brings together diverse datasets from across geochemistry and marine geoscience into one system. Importantly, it provides the tools that allow scientists from a wide range of fields to easily search for and explore relationships among many different kinds of data.

“Through IEDA, scientists can find the natural samples, the composition, the geochemistry of the samples. If you need to know the structure of the crust underneath those samples, you can get to the seismic data. You can check if there are experimental results for chemical composition from close to these rocks that can tell you where they come from. Are there any dated rocks? Where is geochronology in this particular area? The data starts to be networked, and it comes together in IEDA,” said Kerstin Lehnert, the Lamont scientist who leads IEDA.

Before IEDA, these kinds of data were largely inaccessible, often stored on scientists’ local computers, in their lab notebooks, or fragmented throughout the scientific journals. By bringing the data together in an easily searchable format, IEDA has created a way for researchers to quickly access thousands of values for analysis and comparison. Two scientists were recently able to document a link between deep Earth geochemistry and a rise in oxygen in Earth’s atmosphere by downloading 70,000 samples of continental igneous rock geochemistry from IEDA. Finding all the data would have taken years before IEDA was created.

“Integrating different kinds of observations and observations made from many different regions in order to gain a global perspective is a powerful way to gain new insights into science problems,” said IEDA Associate Director Suzanne Carbotte, a marine geophysicist and Bruce C. Heezen Lamont Research Professor.

Geochemists use IEDA data in their research and contribute to its growing databases. Photo: Lamont-Doherty Earth Observatory.

Transforming 21st Century Science

IEDA combines EarthChem, the world’s largest geochemistry database, with the Marine Geoscience Data System, which serves data for studies of seafloor and deeper crust and mantle processes.

EarthChem started as the petrology database PetDB at Lamont in 1996, when relational databases were just beginning to be developed. Today it includes several partner databases and some 400,000 samples and 20 million analytical values from across geochemistry, along with the tools to mine the collections. The EarthChem Portal also connects with other large databases, including Germany’s GEOROC, a database in Japan, and the U.S. Geological Survey’s national geochemical database, allowing IEDA users to search across all these major databases at once.

The Marine Geoscience Data System traces its origins to the early 1990s, when Lamont oceanographer Bill Ryan launched a first-of-its-kind web-accessible database of seafloor bathymetry data. Building upon this early resource, the Marine Geoscience Data System serves a wide range of marine geoscience data collected by research ships and other platforms, including data back to 1954. It includes global bathymetry data, seafloor imagery, seismic data that provide cross-sectional views beneath the seafloor, as well as other multidisciplinary data from a series of national research programs.

The IEDA system is also uniquely equipped to incorporate smaller, niche data sets, which it then makes open and accessible through interactive, map-based interfaces and other tools.

One focus is compiling what are known as “long-tail” data: the smaller, specialized data sets that individual scientists produce and analyze in their own labs.

“A lot of what people do with different data sets in their labs can be very innovative and unique and new,” said Vicki Ferrini, an oceanographer who works with IEDA’s marine geophysics data. “It all gets out through scientific publications, interpretations of it get out, but actually making the data that supports those publications accessible and reusable and into something that can be built upon is what we’re really aspiring to do.”

The Marine Geoscience Data System’s high-resolution images provide detailed views of sea mounts like these and other sections of the seafloor. About 8 percent of the seafloor has been mapped to 100-meter resolution like this. Source: GeoMapApp

Understanding the Data Needs of Scientists

Part of IEDA’s success stems from its team’s close connections to science. Lehnert, Carbotte, and many of the IEDA team members are scientists who are intimately familiar with scientific workflows and how scientists search for and analyze data. They know what scientists need and how to customize solutions and incorporate different types of results.

In addition to the repositories and analysis tools, IEDA has identification systems that link published papers to their original data and samples. Openness of data is critical to scientists’ ability to test theories and reproduce results, and data management plans are now required by the National Science Foundation, which supports IEDA. IEDA also makes data and samples available for reuse so scientists don’t have to collect the same kinds of data from the same location again, saving time and money.

“IEDA builds upon Lamont’s rich legacy of acquiring diverse multidisciplinary data to address science questions that dates back to the earliest days of Doc Ewing and the globally ranging expeditions of Lamont ships,” Carbotte said. Maurice “Doc” Ewing, Lamont’s founding director, ordered all ocean expeditions to routinely collect diverse sets of geoscience and oceanographic data. When new scientific questions arose, data and samples were often there for analysis.

“We’re making the data available, and people can take the data for whatever they need,” Lehnert said.