Steve Schneider juggled the dilemma of conveying the urgency of climate change while explaining how we can be confident about it when the evidence is sometimes uncertain. Credit: L.A. Cicero, Stanford University News.

This is part three of this profile. You can also read parts one and two.

“Each of us has to decide what the right balance is between being effective and being honest.” Those are words – seemingly advising his colleagues to lie to shape public opinion – credited to climate scientist Stephen Schneider in a 1989 Discover magazine article. Because of that quote, Steve was told, a number of US congressional hearings chose not to invite him as an expert witness.

In a heavy irony, the quote came from an interview in which Steve was expressing frustration at having been misrepresented in the media. He later argued that he was trying to explain how to succeed when faced with the “double ethical bind” of being effective and honest in conveying both uncertainty and urgency. Even in the original article he added, “I hope that means being both” – something often overlooked in scandalised reactions to his words. But perhaps the trouble this caused Steve in fact reflects the strangeness of the idea he dedicated much of his career to getting across. How can scientific projections with an element of uncertainty lead to the conclusion that we must act on climate change urgently?

We’re instinctively uncomfortable with uncertainty, and so Steve wanted clearer ways to get it across in the UN Intergovernmental Panel on Climate Change (IPCC)’s second climate change assessment report. At a meeting in 1994 he pushed for their estimates to come as probability distributions, showing the odds for each value in a range. This could be done, he argued, for everything from the damage likely to be caused by global warming to numbers central to natural processes, like climate sensitivity.

With no-one backing Steve, ambiguity crept into the report. Did everyone think that saying they had “high confidence” in a statement meant the same thing? After the second report, returning to his recently-appointed post at Stanford University in California, he was determined to hammer out any doubt. Together with the IPCC’s Richard Moss, Steve found 100 examples of inconsistent use of such terms. Armed with that shocking finding, they persuaded the IPCC’s Working Group I, which discusses the physics of climate change, to define clear scales.

The result would be a double strategy, with verbal scales both for the probability of a forecast and for the reliability of the underlying science. For probabilities, low confidence meant a less than 1-in-3 chance, medium between 1-in-3 and 2-in-3, and high 2-in-3 and above. Very high confidence meant at least a 19-in-20 chance, and very low confidence 1-in-20 or less. There were four grades for the quality of the science, ranging from ‘speculative’ to ‘well established’. Reading through thousands of draft pages to ensure consistency in the run-up to the IPCC’s third report, published in 2001, Richard and Steve became known as the ‘uncertainty police’.
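As a rough illustrative sketch – not an official IPCC implementation, and with the function name and the exact treatment of the extreme bands assumed from the figures quoted above – the verbal probability scale can be written as a simple mapping from odds to labels:

```python
def confidence_label(p):
    """Map a probability p (0 to 1) to a verbal confidence label,
    using the thresholds described in the text above."""
    if p >= 0.95:                       # at least a 19-in-20 chance
        return "very high confidence"
    if p >= 2 / 3:                      # 2-in-3 and above
        return "high confidence"
    if p >= 1 / 3:                      # between 1-in-3 and 2-in-3
        return "medium confidence"
    if p > 0.05:                        # less than 1-in-3
        return "low confidence"
    return "very low confidence"        # 1-in-20 or less
```

Spelling the scale out like this makes the point of the exercise visible: two scientists using the same label are now committed to the same numerical odds.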

Pennsylvania State University’s Michael Mann thinks he has found the reason behind key outstanding disagreements between the historical temperature record based on tree rings and climate models for the same period. Credit: Pennsylvania State University

The sudden chills violent volcano eruptions cast over the world centuries ago effectively erased themselves from the historical climate record produced by examining tree-rings. So suggests a team led by Michael Mann from Pennsylvania State University, who famously used 1,000 years of tree-ring measurements in the “hockey stick” graph showing how unusual today’s temperatures are. Michael warns the skipped years could affect scientists’ estimates of how much the world warms in response to greenhouse gases in the atmosphere, known as its climate sensitivity. But other than the volcano years, the scientist notes that tree-ring data is a remarkably accurate match with the climate models they used for comparison. “Interestingly, the effect has little influence on long-term trends, including conclusions about how previous temperatures compare to modern ones,” he told me. “Instead, it appears only to have implications for how strong past short-term cooling events were.”

A tree’s age can usually be told from the rings that form across its trunk representing each year’s growth. How thick each ring is shows how much the tree grew in the year in question, which is influenced by the temperatures that tree experienced. That means examining the thickness of rings in old trees can provide a way to tell temperatures back through history. Many challenges have already been overcome in turning this simple-sounding idea into a history of the world’s temperature, but Michael was still troubled by one particular detail.
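The simple-sounding idea boils down to a calibration exercise. The numbers below are invented purely for illustration, and the real statistical treatment is far more involved, but the core step is fitting a relationship between ring width and measured temperature over the years where both exist, then applying it backwards to rings from years with no thermometer record:

```python
import numpy as np

# Hypothetical ring widths (mm) and measured temperatures (°C) for an
# overlap period where both records exist -- illustrative numbers only.
ring_width = np.array([1.1, 1.3, 0.9, 1.5, 1.2, 1.4])
temperature = np.array([8.2, 8.9, 7.8, 9.4, 8.5, 9.1])

# Calibrate: fit a straight line relating ring width to temperature.
slope, intercept = np.polyfit(ring_width, temperature, 1)

# Reconstruct: apply the fitted line to older rings, from before
# thermometer measurements began.
old_rings = np.array([0.8, 1.0, 1.6])
reconstructed = slope * old_rings + intercept
```

In this toy data wider rings go with warmer years, so the fitted slope is positive and thicker old rings translate into warmer reconstructed temperatures – which is exactly the assumption a volcanic cold snap can undermine if trees barely grow at all that year.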

Sunrise over the Atlantic Ocean; 300 metres below, heat may have been stored during a pause in warming in the 2000s, new models indicate. Credit: NCAR/Carlye Calvin

“The fact is that we can’t account for the lack of warming at the moment and it is a travesty that we can’t.” When that admission of climate science’s limitations was made public, it was seized upon as an important failing by the discipline’s critics. But actually the author of the “Climategate” email, leaked from the University of East Anglia and printed worldwide in November 2009, was identifying a problem that needed to be fixed. Kevin Trenberth from the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, was really complaining that scientists couldn’t track energy flows linked to short-term variations in climate well enough. That shortcoming meant they were stumped by a particularly relevant question at the time: why temperatures hadn’t risen enough in response to the overall amount of energy flowing into the planet since 2003. Now, in a letter published in the research journal Nature Climate Change last week, he and his colleagues have come a step nearer to revealing the reason why.

Before 2003, Trenberth had found, satellite measurements of the balance of energy flowing in and out at the top of Earth’s atmosphere matched the amount of heat that could be measured in the land, atmosphere and sea. After that, though the satellites showed greenhouse gases continued to trap more heat from the sun’s energy, only about half that heat could be measured. Consequently, though the 2000s were Earth’s warmest decade in over a century of records, the rapid temperature rises that brought this about ground to a standstill from 2002-2009. During that period average global temperatures decreased, although this trend fails the key scientific “statistical significance” test, and 2010 was again the warmest year on record.
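The “statistical significance” test mentioned above can be sketched with a toy calculation. The anomaly values here are invented purely to show the mechanics: fit a trend line, estimate the uncertainty in its slope from the scatter of the points, and compare the ratio of the two against the critical value of a t-distribution:

```python
import numpy as np

# Made-up annual temperature anomalies (°C) for 2002-2009, drifting
# slightly downwards -- illustrative numbers, not the real record.
years = np.arange(2002, 2010)
anoms = np.array([0.46, 0.47, 0.45, 0.48, 0.43, 0.41, 0.44, 0.45])

n = len(years)
x = years - years.mean()

# Least-squares slope of the trend line (°C per year).
slope = (x * (anoms - anoms.mean())).sum() / (x**2).sum()

# Standard error of the slope, from the residual scatter.
resid = anoms - (anoms.mean() + slope * x)
se = np.sqrt((resid**2).sum() / (n - 2) / (x**2).sum())

# With 8 points (6 degrees of freedom), |t| must exceed roughly 2.45
# for the trend to be significant at the conventional 5% level.
t_stat = slope / se
significant = abs(t_stat) > 2.45
```

With these toy numbers the slope is indeed negative, but the t-ratio falls well short of the critical value – the same shape of result the text describes: a decrease that fails the significance test.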

This prompted Trenberth and his fellow researcher John Fasullo to ask in the top journal Science “Where has this energy gone?” in April 2010. Yet Trenberth and Fasullo did have an idea what the answer might be, their colleague Gerald Meehl underlined. “Following Kevin and John’s work on this topic using observations, it seemed like heat had to be going into the deep ocean when the surface temperature trend was flat for a decade or so,” he told Simple Climate. “But we don’t have adequate observations of deep ocean temperatures to quantify that. So we decided to look at a climate model to see if it could answer that question.”

Do you trust the measurements your bathroom scales provide? If not, you share a classic dilemma faced by scientists, which is whether or not their data gives an accurate picture of what they are trying to study. This is an especially sore point in the debate between climate scientists and their critics who differ, for example, over whether it’s OK to exclude temperature measurements from certain weather stations. One accusation is that scientists are just choosing the figures that support their arguments, a practice referred to as “cherry picking”.

Such disputes made me especially interested to see that Max Planck Institute for Developmental Biology scientist George Wang and his colleagues had specifically chosen a set of temperature measurements for their research. Together with Michael Dillon at the University of Wyoming and Raymond Huey at the University of Washington, Wang looked at how temperatures since 1960 would have affected the metabolism of ectotherms – better known as “cold-blooded creatures”. Surprisingly, they found that tropical species would be worse affected than those living in cold areas, despite temperatures changing more slowly where they live. Could choice of weather stations have influenced this research improperly? Not according to Wang.

“Scientists will exclude data for many reasons, but fundamentally we do it because blindly leaving data in an analysis can bias results,” he told Simple Climate. “In general, scientists in every field have to use judgement based on experience to detect and remove outliers. It is something scientists take very seriously, and it is an integral part of analysing data.”
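One simple, mechanical version of the kind of outlier screening Wang describes – a hypothetical sketch, not the method his team actually used – flags points that sit too many median absolute deviations (MAD) from the median, a rule that is harder for a single wild value to fool than a mean-based one:

```python
import statistics

def remove_outliers(values, k=5.0):
    """Keep only values within k median absolute deviations of the
    median -- one simple, mechanical screening rule among many."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        # All points essentially identical: nothing to flag.
        return list(values)
    return [v for v in values if abs(v - med) <= k * mad]
```

Applied to five hypothetical station readings – `[14.2, 14.5, 13.9, 14.1, 99.0]` – the rule keeps the tight cluster and drops the physically implausible 99 °C value, which is the kind of judgement call the quote says has to be made explicitly rather than “blindly leaving data in”.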

Temperatures have risen in the debate between the US political parties over climate change in recent years – and the country's weathercasters' opinions of global warming now appear to be more influenced by that debate than by their professional qualifications. Credit: Weather Central

Many US TV weathercasters responded to last November’s “Climategate” scandal more on the basis of political beliefs than meteorology, scientists have claimed. That’s important, Edward Maibach of George Mason University in Fairfax, Virginia, notes, because of their comparatively high profile and large audiences. “TV meteorologists are potentially an important source of informal climate change education in that most American adults watch local TV news and consider TV weather reporters a trusted source of global warming information,” Maibach and colleagues write. “At least temporarily, Climategate has likely impeded efforts to encourage some weathercasters to embrace the role of climate change educator.”

“Climategate” refers to the publication of hacked emails from the Climatic Research Unit (CRU) in East Anglia – which is jointly responsible for one of just three global temperature records. Among these were statements that suggested climate researchers may have inappropriately tampered with and illegally avoided sharing data, and tried to suppress other scientists’ work. Since then a series of enquiries found that the conclusions of their research are not in doubt, although they did fail to display the proper degree of openness.

Stanford scientists project that from 2030 to 2039, most areas of Arizona, Utah, Colorado and New Mexico could endure at least seven seasons as intense as the hottest season ever recorded between 1951 and 1999. Credit: 'jhadow' via Flickr/Creative Commons

As New York reached record temperatures this week, Stanford researchers claimed that such events could become increasingly common over the next 30 years. “Using a large suite of climate model experiments, we see a clear emergence of much more intense, hot conditions in the U.S. within the next three decades,” said Stanford’s Noah Diffenbaugh.

Together with co-author Moetasim Ashfaq, Diffenbaugh uses these results to suggest that the global warming threshold often seen as dangerous is too simple. That 2°C limit mainly comes from political negotiations, the scientists note in a Geophysical Research Letters paper published online ahead of printing last week. “The intensification of hot extremes reported here suggests that constraining global warming to 2°C above pre-industrial conditions may not be sufficient to avoid dangerous climate change,” they write.

Global temperatures haven’t risen as much as the amount of energy the planet has absorbed since 2003 would suggest, but the rises may just have been delayed. That’s according to Kevin Trenberth and John Fasullo of the US National Center for Atmospheric Research (NCAR).

In a paper published in Science yesterday they compare satellite data measuring the amount of the sun’s energy entering the atmosphere against the amount returning to space. Since 2000, the Earth has absorbed around 0.9 watts per square metre more at the top of the atmosphere than it has released. “It is this imbalance that produces global warming,” Trenberth and Fasullo write.
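A back-of-envelope sketch shows what 0.9 watts per square metre means in absolute terms when spread over the whole of Earth’s surface for a year (the radius is a standard approximate value, and treating the planet as a sphere is itself an approximation):

```python
import math

EARTH_RADIUS_M = 6.371e6        # mean Earth radius, metres (approx.)
imbalance_w_per_m2 = 0.9        # figure quoted from the paper

# Surface area of a sphere: 4 * pi * r^2
surface_area_m2 = 4 * math.pi * EARTH_RADIUS_M**2

seconds_per_year = 365.25 * 24 * 3600

# Total extra energy retained per year, in joules.
joules_per_year = imbalance_w_per_m2 * surface_area_m2 * seconds_per_year
```

That works out at roughly 1.4 × 10²² joules a year – a small number per square metre, but an enormous one for the planet as a whole, which is why tracking where it all goes matters so much.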