
In my post “A Big Number Gets Tweaked” I focused on ‘climate sensitivity’, aka the global mean surface temperature response to a doubling of CO2. It is an important number, and a basic understanding of what it means is an essential part of what I would call ‘climate change literacy’.

Going back to the Intergovernmental Panel on Climate Change (IPCC)’s Assessment Report 4 (AR4), published in 2007, a definition of climate sensitivity can be found on page 12 of the Summary for Policymakers here.

The equilibrium climate sensitivity is a measure of the climate system response to sustained radiative forcing. It is not a projection but is defined as the global average surface warming following a doubling of carbon dioxide concentrations. It is likely to be in the range 2°C to 4.5°C with a best estimate of about 3°C, and is very unlikely to be less than 1.5°C. Values substantially higher than 4.5°C cannot be excluded, but agreement of models with observations is not as good for those values. Water vapour changes represent the largest feedback affecting climate sensitivity and are now better understood than in the TAR. Cloud feedbacks remain the largest source of uncertainty.

The chart below gives a sense of the different sensitivity estimates that provided the background to the IPCC’s final number:

This definition of climate sensitivity dates back to a landmark 1979 paper by Jule Charney et al (here). In fact, to avoid confusion, we could call it Charney sensitivity. Now what Charney sensitivity isn’t (surprisingly) is the real-world sensitivity of surface temperatures to a doubling of CO2. This is because Charney sensitivity was a blending of the results of two climate models that held a number of variables constant. Of course, the Charney sensitivity in its modern version is now backed up by a multitude of far more sophisticated models, but interestingly the sensitivity number that came out of the 30-year-old Charney report has held up pretty well. Nonetheless, the Charney sensitivity has a somewhat narrow definition. The excellent climate-scientist-run blog RealClimate (www.realclimate.org) explains this in more detail here:

The standard definition of climate sensitivity comes from the Charney Report in 1979, where the response was defined as that of an atmospheric model with fixed boundary conditions (ice sheets, vegetation, atmospheric composition) but variable ocean temperatures, to 2xCO2. This has become a standard model metric (because it is relatively easy to calculate). It is not however the same thing as what would really happen to the climate with 2xCO2, because of course, those ‘fixed’ factors would not stay fixed.

A wider definition is usually termed the Earth System sensitivity that allows all the fixed boundary conditions in the Charney definition to vary. As such, ice sheets, vegetation changes and atmospheric composition can provide feedbacks to temperature and thus cause a greater temperature response over the longer term. The Earth System sensitivity is in theory closer to the real world as it tells us at what temperature the system will ultimately get back to equilibrium.

The most influential calculation of Earth System sensitivity has been that made by NASA’s Jim Hansen, since it forms the scientific foundation for the 350.org climate change campaigning organisation. As the name suggests, 350.org urges humanity to strive toward a target of 350 parts per million (ppm) of CO2. The rationale for the target can be found here and rests heavily on a paper by Jim Hansen and his coauthors entitled “Target atmospheric CO2: Where should humanity aim?“.

In the abstract of the Hansen article, we immediately see a differentiation between a sensitivity that includes only fast feedback processes (a Charney sensitivity) and an equilibrium sensitivity that includes slower feedbacks (an Earth System sensitivity):

Paleoclimate data show that climate sensitivity is ~3°C for doubled CO2, including only fast feedback processes. Equilibrium sensitivity, including slower surface albedo feedbacks, is ~6°C for doubled CO2 for the range of climate states between glacial conditions and ice-free Antarctica. Decreasing CO2 was the main cause of a cooling trend that began 50 million years ago, the planet being nearly ice-free until CO2 fell to 450 ± 100 ppm; barring prompt policy changes, that critical level will be passed, in the opposite direction, within decades.
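The “within decades” claim at the end of the abstract is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, using the 385 ppm and 450 ppm figures quoted above plus one assumption of mine (a constant growth rate of roughly 2 ppm per year, about the rate observed at Mauna Loa in the 2000s):

```python
# Back-of-envelope check of the "within decades" claim in the Hansen abstract.
# The 385 ppm start and 450 ppm threshold come from the abstract; the constant
# ~2 ppm/year growth rate is my assumption, not the paper's.
start_ppm = 385.0
target_ppm = 450.0
growth_ppm_per_year = 2.0  # assumed constant growth rate

years_to_target = (target_ppm - start_ppm) / growth_ppm_per_year
print(f"~{years_to_target:.0f} years to reach {target_ppm:.0f} ppm")
```

With these assumptions the threshold is reached in roughly three decades, consistent with the abstract’s wording.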

The paper then goes on to make a pretty forceful policy recommendation:

If humanity wishes to preserve a planet similar to that on which civilization developed and to which life on Earth is adapted, paleoclimate evidence and ongoing climate change suggest that CO2 will need to be reduced from its current 385 ppm to at most 350 ppm, but likely less than that.

Note that the article does contain a number of caveats over climate variability, climate models and other uncertainties. Further, as is the usual process in science, it has received various critiques, many suggesting that a figure of 6 degrees Celsius is too high for long-term sensitivity. What is not in dispute, however, is that an Earth System sensitivity with long-term feedbacks will, almost by definition, be higher than a Charney sensitivity with only short-term feedbacks.

Despite this fact, we see numerous media reports getting tangled up between the two types of sensitivity following the publication of the new Schmittner et al paper I talked about in a previous post. This from the Houston Chronicle:

To me, the real effect of this paper will be to really impair the credibility of the more extreme environmentalists who have been saying the planet faces certain doom from climate change.

I am thinking about such efforts as Bill McKibben’s 350 campaign, in which he asserts that 350 ppm is the most important number in the world. Such environmentalists assert that the planet will warm as much as 6 Celsius degrees with a doubling of atmospheric carbon dioxide levels.

That’s a big number and doubtless would have catastrophic consequences for the planet. This is not in dispute. But scientists are now telling us this is not going to happen.

Well, ‘no’ actually. Since we are comparing apples and pears (an Earth System sensitivity against a Charney sensitivity), scientists are not now telling us that catastrophic outcomes are not going to happen.

Getting back to the topic of risk, we can now see how a better understanding of the different sensitivity concepts allows ordinary people to get a better idea of the climate risk they and their families face.

To reiterate, we are going from CO2, to temperature (via sensitivity), to impacts. To get a good idea of overall risk we need a sense of how carbon emissions are trending; then we need a feeling for how sensitive temperature is to CO2; and lastly an understanding of how much the earth changes (and the impact of those changes on us) once the world warms.
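The middle step of that chain, CO2 to temperature, can be sketched with the standard logarithmic rule of thumb: warming scales with the number of CO2 doublings above a pre-industrial baseline. In the sketch below, the 280 ppm baseline, the function name and the choice of 450 ppm and 560 ppm as illustrative levels are my assumptions; the 3°C and 6°C sensitivities are the Charney and Hansen figures discussed above.

```python
import math

# Pre-industrial CO2 concentration in ppm (an assumed baseline, ~280 ppm).
PREINDUSTRIAL_PPM = 280.0

def equilibrium_warming(co2_ppm: float, sensitivity_per_doubling: float) -> float:
    """Warming in deg C implied by a sensitivity expressed per CO2 doubling."""
    doublings = math.log2(co2_ppm / PREINDUSTRIAL_PPM)
    return sensitivity_per_doubling * doublings

for ppm in (450.0, 560.0):
    charney = equilibrium_warming(ppm, 3.0)       # AR4 best-estimate Charney sensitivity
    earth_system = equilibrium_warming(ppm, 6.0)  # Hansen's Earth System estimate
    print(f"{ppm:.0f} ppm: Charney ~{charney:.1f} C, Earth System ~{earth_system:.1f} C")
```

At 560 ppm (one doubling) the two sensitivities give their headline 3°C and 6°C by construction; the sketch also makes clear how much warming is already implied at the 450 ppm level from the Hansen abstract.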

The Charney sensitivity is very useful since it gives a floor to the kind of temperature changes we will experience. If the best estimate of this sensitivity number is found in the future to be smaller than the current consensus of 3 degrees, then that, other things being equal, is a positive thing. However, we are not in a position, yet, to reduce the consensus based on the Schmittner paper.

The Hansen 6 degrees Celsius number is probably a little too high, but if we get anywhere close to it, we are still in the badlands of catastrophic climate change. Nonetheless, the time horizon for the full warming stretches generations into the future; thus, it is probably not the risk metric you would use if your concern goes out only as far as your grandchildren. But I think Jim Hansen receives a lot of undeserved ridicule in certain parts of the blogosphere and American press for his championing of a number that implies the yet unborn have rights too.

Putting this question of human ethics to one side, those alive today are really interested in a Charney sensitivity plus alpha from a climate risk perspective. The components that make up that ‘plus alpha’ are a topic for another post.