
These maps show the changes in air temperatures over land as measured using thermometers (left side) and as calculated by the 20th Century Reanalysis project (right side). While more than 80 percent of the observed variation is captured by the computer model, the results show interesting differences in some regions, such as the midwestern United States, Argentina and eastern Brazil. The differences may be due to previously unrecognized issues with the pressure observations, variations in land use and land cover over time, issues with temperature measurements or some of the conditions set in the computational model.

In making the case for global climate warming, researchers rely on long-term measurements of air temperature taken around the world, all at a height of two meters (about six and a half feet). However, some climate researchers and others point to those same measurements as an inconsistent record, and therefore as unreliable indicators of actual temperature variability and long-term trends.

But in a paper published recently in Geophysical Research Letters, a team of scientists led by Gil Compo of the University of Colorado and the National Oceanic and Atmospheric Administration’s (NOAA) Earth System Research Laboratory in Boulder, Colorado, reports that land surface air temperatures estimated from other historical observations, including barometric pressure, but without directly using the land temperature measurements themselves, nevertheless largely match the existing temperature measurements spanning the years 1901 to 2010.

“This is really the essence of science,” Compo said. “There is knowledge ‘A’ from one source and you think, ‘Can I get to that same knowledge from a completely different source?’ Since we had already produced the dataset, we looked at just how close our temperatures estimated using barometers were to the temperatures using thermometers.”

The team carried out their calculations on the Hopper supercomputer at NERSC as part of the ongoing 20th Century Reanalysis project, which was awarded 8 million hours at the center.

“We use a completely different approach to investigate global land warming over the 20th century,” Compo wrote in the abstract of the paper featured on the cover of the June 28 issue of the journal. “We have ignored all air temperature observations and instead inferred them from observations of barometric pressure, sea surface temperature, and sea-ice concentration using a physically based data assimilation system called the 20th Century Reanalysis. This independent data set reproduces both annual variations and centennial trends in the temperature data sets, demonstrating the robustness of previous conclusions regarding global warming.”

When looking at the land surface air temperatures, critics of the idea that human activity is a factor in global warming note that the land surrounding the measuring stations has often changed significantly since the early 20th century. Agriculture has replaced forests and prairies, farmland has been developed and small towns have grown into cities. Compo and his team also note that stations have moved, and both instruments and techniques for measuring temperature have changed.

Combined, these factors can undermine confidence in estimates of past, present and future climate change. In fact, Compo said, at least five research papers assessing the validity of the land surface air temperatures were published in the last two years, and the lengthy author lists for those papers indicate this is an active topic for discussion. “The perception is that there is something wrong with the instruments and that’s why there appears to be warming,” he said.

The 20th Century Reanalysis Project led by Compo is an ambitious program that uses barometric pressure and other climate-related data from the past 150 years to reconstruct the world’s climate from its weather using computer models. By culling information from newspapers, scientific observations, ships’ logs and other sources, the international team and their colleagues in the Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative and other efforts have assembled a comprehensive set of observations to examine how weather and climate have changed. By plugging this data into climate prediction models, Compo and his team have successfully “predicted” how the earth’s weather and climate have varied since the late 1800s. The goal is to demonstrate the accuracy of these models for predicting future warming patterns.

To check the accuracy of the land surface air temperatures, Compo and his team realized that they could get most of the answers from just two key factors affecting climate over land: monthly sea surface temperatures and monthly sea ice distribution. This yielded seasonal, annual and decadal patterns. By adding in barometric pressure data, they were able to reconstruct even the day-to-day and month-to-month variations.
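The figure caption above notes that more than 80 percent of the observed variation is captured by the reconstruction. A minimal sketch of how such a "fraction of variance explained" comparison can be computed is below; the data here are entirely synthetic stand-ins (the actual reanalysis uses an ensemble data assimilation system, not this simple noise model).

```python
import numpy as np

# Hypothetical illustration: quantifying how much of the observed
# year-to-year temperature variation a reconstruction captures.
# Both series below are synthetic; they merely mimic the shape of
# a 1901-2010 annual land temperature anomaly record.

rng = np.random.default_rng(42)
years = np.arange(1901, 2011)

# Synthetic "observed" anomalies: a slow warming trend plus
# year-to-year variability (degrees C).
observed = 0.008 * (years - years[0]) + rng.normal(0.0, 0.15, years.size)

# Synthetic "reconstructed" series: tracks the observed one closely,
# with smaller independent errors standing in for reconstruction noise.
reconstructed = observed + rng.normal(0.0, 0.05, years.size)

# Fraction of observed variance explained by the reconstruction:
# 1 minus the ratio of residual variance to observed variance.
residual = observed - reconstructed
explained = 1.0 - residual.var() / observed.var()
print(f"fraction of variance explained: {explained:.2f}")
```

With the small reconstruction errors assumed here, the explained fraction comes out well above the 80 percent threshold mentioned in the caption; larger residual noise would pull it down accordingly.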

Although other researchers have used the team’s dataset representing climate conditions from 1979 to the present to look at temperature, Compo said this research is unique in that it uses the reanalysis data stretching back to the early 20th century. The results, he said, are clear.

“If, for some reason, you didn’t believe global warming was happening, this confirms that global warming really has been occurring since the early 20th century,” Compo said.

In addition to Compo, the authors of the paper are Prashant D. Sardeshmukh of the University of Colorado and NOAA Earth System Research Laboratory, Boulder, Colo.; Jeffrey S. Whitaker of the NOAA Earth System Research Laboratory, Boulder, Colo.; Philip Brohan of the Met Office Hadley Centre, UK; Philip D. Jones of the Climatic Research Unit, University of East Anglia, Norwich, UK; and Chesley McColl of the University of Colorado and NOAA Earth System Research Laboratory, Boulder, Colo.

The Lawrence Berkeley National Laboratory (Berkeley Lab) Computing Sciences organization provides the computing and networking resources and expertise critical to advancing the Department of Energy's research missions: developing new energy sources, improving energy efficiency, developing new materials and increasing our understanding of ourselves, our world and our universe.

ESnet, the Energy Sciences Network, provides the high-bandwidth, reliable connections that link scientists at 40 DOE research sites to each other and to experimental facilities and supercomputing centers around the country. The National Energy Research Scientific Computing Center (NERSC) powers the discoveries of 6,000 scientists at national laboratories and universities, including those at Berkeley Lab's Computational Research Division (CRD). CRD conducts research and development in mathematical modeling and simulation, algorithm design, data storage, management and analysis, computer system architecture and high-performance software implementation. NERSC and ESnet are DOE Office of Science User Facilities.

Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the DOE’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.