On sensitivity: Part I

January 3rd, 2013 by gavin

And then there are the recent papers examining the transient constraint. The most thorough is Aldrin et al (2012). The transient constraint has been looked at before of course, but efforts have been severely hampered by the uncertainty associated with historical forcings – particularly aerosols, though other terms are also important (see here for an older discussion of this). Aldrin et al produce a number of (explicitly Bayesian) estimates: their ‘main’ one with a range of 1.2ºC to 3.5ºC (mean 2.0ºC), which assumes exactly zero indirect aerosol effects, and a possibly more realistic sensitivity test, including a small aerosol indirect effect, of 1.2-4.8ºC (mean 2.5ºC). They also demonstrate that there are important dependencies on the ocean heat uptake estimates as well as on the aerosol forcings. One nice addition was an application of their methodology to three CMIP3 GCM results, showing that their estimates (3.1, 3.6 and 3.3ºC) were reasonably close to the true model sensitivities of 2.7, 3.4 and 4.1ºC.

In each of these cases however, there are important caveats. First, the quality of the data is important: whether it is the LGM temperature estimates, recent aerosol forcing trends, or mid-tropospheric humidity – underestimates in the uncertainty of these data will definitely bias the CS estimate. Second, there are important conceptual issues to address – is the sensitivity to a negative forcing (at the LGM) the same as the sensitivity to positive forcings? (Not likely). Is the effective sensitivity visible over the last 100 years the same as the equilibrium sensitivity? (No). Is effective sensitivity a better constraint for the TCR? (Maybe). Some of the papers referenced above explicitly try to account for these questions (and the forward model Bayesian approach is well suited for this). However, since a number of these estimates use simplified climate models as their input (for obvious reasons), there remain questions about whether any specific model’s scope is adequate.

Ideally, one would want to do a study across all these constraints with models that were capable of running all the important experiments – the LGM, historical period, 1% increasing CO2 (to get the TCR), and 2xCO2 (for the model ECS) – and build a multiply constrained estimate taking into account internal variability, forcing uncertainties, and model scope. This will be possible with data from CMIP5, and so we can certainly look forward to more papers on this topic in the near future.
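As a quick illustration of what a multiply constrained estimate means in practice (a toy sketch with made-up numbers, not values from any of the papers discussed above), each line of evidence contributes a likelihood over candidate sensitivities and the posterior is their normalised product:

```python
import numpy as np

# Toy multiply-constrained Bayesian estimate: each constraint contributes a
# likelihood over candidate sensitivities; the posterior is their product.
# All numbers are illustrative, not taken from any published study.
ecs = np.linspace(0.5, 8.0, 2000)  # candidate ECS values (deg C per 2xCO2)
dx = ecs[1] - ecs[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

lgm_like = gaussian(ecs, 2.8, 1.0)        # e.g. an LGM-based constraint
transient_like = gaussian(ecs, 2.5, 1.2)  # e.g. a 20th-century constraint
prior = np.ones_like(ecs)                 # flat prior over the grid

posterior = prior * lgm_like * transient_like
posterior /= posterior.sum() * dx         # normalise to a density

mean = (ecs * posterior).sum() * dx
print(f"posterior mean ECS: {mean:.2f} C")
```

The combined posterior is tighter than either input constraint alone, which is the point of bringing multiple experiments to bear; the real studies do this with forward models and careful treatment of forcing uncertainty rather than Gaussians on a grid.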

In the meantime, the ‘meta-uncertainty’ across the methods remains stubbornly high with support for both relatively low numbers around 2ºC and higher ones around 4ºC, so that is likely to remain the consensus range. It is worth adding though, that temperature trends over the next few decades are more likely to be correlated to the TCR, rather than the equilibrium sensitivity, so if one is interested in the near-term implications of this debate, the constraints on TCR are going to be more important.

104 comments on this post.

Guido van der Werf:

January 3rd, 2013 at 12:03 PM

Great overview, thanks. I think it is important to stress that with the current growth of fossil fuel emissions we are above the highest IPCC emissions scenario (RCP 8.5), at least for fossil fuel combustion. If this persists in the future we will be in the 3 degree range in 2100 even with the lowest CS estimates.

January 3rd, 2013 at 12:40 PM

Gavin Schmidt, could you maybe have a look at a catastrophic “paper” by the economist Alan Carlin that got lost in the scientific literature? One of his reasons to claim that “the risk of catastrophic anthropogenic global warming appears to be so low that it is not currently worth doing anything to try to control it” is that he uses a very low value for the climate sensitivity based on non-reviewed “studies”, while ignoring the peer-reviewed work.

[Response: As pointed out by Hank in the comment below, we’ve already wasted enough of our neurons on Carlin. See here. –eric]

January 3rd, 2013 at 1:01 PM

Very illuminating, thank you. I agree with Guido, and would add that it would be helpful to stress how critical constraining ECS is. It’s not necessarily obvious to the uninitiated what a huge effect this ~2ºC uncertainty in ECS estimates has on scenarios that attempt to predict the magnitude and timing of climate change impacts (e.g. the AR5 RCPs). Also I found some possible minor typos:

“strongly dependent [on?] still unresolved issues”

“has been looked at before of course, but [efforts?] have been severely hampered”

“are more likely to follow to be correlated to the TCR” [remove “to follow”?]

It also might be helpful to spell out Aerosol Indirect Effect.

[Response: thanks! – gavin]

January 3rd, 2013 at 1:09 PM

Chris G:

January 3rd, 2013 at 2:26 PM

I think what really matters are the changes we can expect in the world in terms of livability, including the ability to grow adequate food. Using the temperature change since the last glacial maximum as a ruler: if that change is near 6 K, then 2 K warmer than pre-industrial means a certain level of effects; if it is only 4 K, then the same 2 K of warming means a larger ecological change.

The 2 K limit generally accepted as a dangerous limit would mean something different if the change in temperature from the last GM is 6 K (approximately 1/3 change again) versus 4 K (approximately 1/2 change again). In other words, if climate sensitivity is toward the low end, 2 K is more dangerous than we currently give it credit for, and arguments for low risk because of low sensitivity are less valid because that means that more ecological changes occur for a given temperature change than currently thought.
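The ruler arithmetic in the comment above can be made explicit (illustrative numbers only):

```python
# Express a fixed 2 K warming as a fraction of two candidate
# glacial-interglacial temperature ranges, per the comment's argument.
warming = 2.0  # K above pre-industrial

for glacial_range in (6.0, 4.0):  # candidate LGM-to-preindustrial ranges, K
    fraction = warming / glacial_range
    print(f"range {glacial_range:.0f} K -> 2 K is {fraction:.0%} of a deglaciation")
```

A smaller glacial-interglacial range (i.e. lower sensitivity) makes the same 2 K a proportionally larger fraction of a full deglaciation.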

David B. Benson:

January 3rd, 2013 at 3:11 PM

Quite helpful Gavin. Well done.

Tom Scharf:

January 3rd, 2013 at 4:30 PM

A useful post.

Correct me if I am wrong, but this appears to be walking back the CS numbers a bit. ~3C seems to be heading towards ~2.5C. I am encouraged, as I have been a somewhat vocal critic that, while models have been overestimating the temps fairly consistently, this somehow wasn’t translating into lower CS estimates or constraining the upper range. Many forcings were being twiddled to account for observations (namely aerosols), but the main CO2 forcing seemed to be the third rail.

[Response: Huh? Forcing is not the same as sensitivity. For reference, GISS-ModelE had a sensitivity of 2.7ºC, and GISS-E2 has sensitivity of 2.4ºC to 2.8ºC depending on version. All very mainstream. – gavin]

January 3rd, 2013 at 4:33 PM

Thanks for this crucial science on sensitivity. A crucial subject for understanding. This leads to questions about micro-sensitivity. People are thinking about their individual impact to global warming. Generalities of carbon footprint try to package the message but fail. We might want to know the impact of specific actions.

Certainly operating a carbon fueled car has real consequences, although for any one vehicle they are very slight. A single cylinder emission is the lowest unit of micro-sensitivity. One person in a car might have a few million per day – or a pound of CO2 per mile traveled. It’s like pissing in a trout stream, one person may not do more harm than scare away the fish, but with millions along the shores all streaming away all day, pretty soon it is the yellow river of death.

Somewhere we need to measure cognitive sensitivity to human impacts of global warming. Perhaps the visual display would be like a car’s tachometer – it would measure ineffectual use of CO2. The micro-sensitivity meter would indicate how effectively carbon fuel is used to deploy clean energy. It would sit right on the dashboard.

SecularAnimist:

January 3rd, 2013 at 5:20 PM

Tom Scharf wrote: “models have been over estimating the temps fairly consistently”

That’s simply not true.

“In this post we will evaluate this contrarian claim by comparing the global surface temperature projections from each of the first four IPCC reports to the subsequent observed temperature changes. We will see what the peer-reviewed scientific literature has to say on the subject, and show that not only have the IPCC surface temperature projections been remarkably accurate, but they have also performed much better than predictions made by climate contrarians.”

wili:

January 3rd, 2013 at 5:45 PM

This was just posted at SkSc:

“Time-varying climate sensitivity from regional feedbacks

Abstract:”The sensitivity of global climate with respect to forcing is generally described in terms of the global climate feedback—the global radiative response per degree of global annual mean surface temperature change. While the global climate feedback is often assumed to be constant, its value—diagnosed from global climate models—shows substantial time-variation under transient warming. Here we propose that a reformulation of the global climate feedback in terms of its contributions from regional climate feedbacks provides a clear physical insight into this behavior. Using (i) a state-of-the-art global climate model and (ii) a low-order energy balance model, we show that the global climate feedback is fundamentally linked to the geographic pattern of regional climate feedbacks and the geographic pattern of surface warming at any given time. Time-variation of the global climate feedback arises naturally when the pattern of surface warming evolves, actuating regional feedbacks of different strengths. This result has substantial implications for our ability to constrain future climate changes from observations of past and present climate states. The regional climate feedbacks formulation reveals fundamental biases in a widely-used method for diagnosing climate sensitivity, feedbacks and radiative forcing—the regression of the global top-of-atmosphere radiation flux on global surface temperature. Further, it suggests a clear mechanism for the ‘efficacies’ of both ocean heat uptake and radiative forcing.”

Abstract: “Understanding how global temperature changes with increasing atmospheric greenhouse gas concentrations, or climate sensitivity, is of central importance to climate change research. Climate models provide sensitivity estimates that may not fully incorporate slow, long-term feedbacks such as those involving ice sheets and vegetation. Geological studies, on the other hand, can provide estimates that integrate long- and short-term climate feedbacks to radiative forcing. Because high latitudes are thought to be most sensitive to greenhouse gas forcing owing to, for example, ice-albedo feedbacks, we focus on the tropical Pacific Ocean to derive a minimum value for long-term climate sensitivity. Using Mg/Ca paleothermometry from the planktonic foraminifera Globigerinoides ruber from the past 500 k.y. at Ocean Drilling Program (ODP) Site 871 in the western Pacific warm pool, we estimate the tropical Pacific climate sensitivity parameter (λ) to be 0.94–1.06 °C (W m−2)−1, higher than that predicted by model simulations of the Last Glacial Maximum or by models of doubled greenhouse gas concentration forcing. This result suggests that models may not yet adequately represent the long-term feedbacks related to ocean circulation, vegetation and associated dust, or the cryosphere, and/or may underestimate the effects of tropical clouds or other short-term feedback processes.”

That had been my impression, but perhaps this is the result of selective reading on my part.

[Response: I would need to check, but I think this is a constraint on the Earth System Sensitivity – not the same thing (see the first figure). – gavin]
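For readers wanting to relate the quoted λ of 0.94–1.06 °C (W m−2)−1 to the per-doubling numbers discussed in the post, the conversion uses the canonical ~3.7 W/m2 forcing for 2xCO2. Note (per the inline response) that since this λ integrates slow feedbacks, the result is an Earth-system-like number, not directly comparable to model ECS:

```python
# Convert a sensitivity parameter lambda (deg C per W/m2) into an equivalent
# warming per CO2 doubling, using the standard ~3.7 W/m2 forcing for 2xCO2.
F_2XCO2 = 3.7  # W/m2 per CO2 doubling

for lam in (0.94, 1.06):  # C per (W/m2), the range quoted in the abstract
    print(f"lambda = {lam} -> {lam * F_2XCO2:.1f} C per doubling")
```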

January 3rd, 2013 at 8:35 PM

I’m increasingly thinking that what we really need is an estimate of the sensitivity of the system to an injection of carbon dioxide including the feedback from the carbon cycle etc. I suppose that is the Earth System Sensitivity in this terminology. Using sensitivities where carbon dioxide concentrations is an exogenous variable could underestimate the cost of emissions impacts.

January 4th, 2013 at 4:38 AM

Surely the models described are all lagging behind the real world. The CMIP5 models seem to predict an Arctic free of summer sea ice in a few decades but the real world trend is for this to happen in the next few summers.

So why should policy makers care what these models predict as climate sensitivity? I suppose it is an interesting scientific problem but we should bear in mind that most or all of them are on the optimistic side.

[Response: James — thanks. One of my papers disappeared this way too. Mildly annoying! I added a link in the post which we’ll remove once AGU sorts things out. –eric]

Alex Harvey:

January 4th, 2013 at 6:40 AM

Thanks for this interesting post.

January 4th, 2013 at 11:34 AM

If “sensitivity” is the response to a given injection of CO_2, how can we measure this directly when the CO_2 level is constantly increasing?

[Response: That isn’t the point. Sensitivity is a measure of the system, and many things are strongly coupled to it – including what happens in a transient situation (although the relationship is not as strong as one might think). The quest for a constraint on sensitivity is not based on the assumption that we will get to 2xCO2 and stay there forever, but really just as a shorthand to characterise the system. Thus for many questions – such as the climate in 2050, the uncertainties in the ECS are secondary. – gavin]

Ric Merritt:

January 4th, 2013 at 1:14 PM

Geoff Beacon #13: You may indeed be able to cite cases where models are “lagging behind the real world”. Arctic sea ice measurements below a past prediction do constitute such a case. But comparing different *future* predictions of “an Arctic free of summer sea ice” cannot, logically, be cited today as a discrepancy between a past prediction and the real world, as measured. Please don’t confuse these 2 situations, which are quite different. If you want to bet on an ice-free Arctic, by some appropriate definition, by some date in a couple years, you can probably find a place to do it, but that’s a different thing from pointing out how a past prediction missed something in the real world.

I’d expect to see the Arctic essentially free of ice during September within three years.

What’s your bet?

Neven’s Sea Ice Blog has some pieces that will help:

January 4th, 2013 at 5:03 PM

> Ideally, one would want to do a study across all
> these constraints with models that were capable of
> running all the important experiments – the LGM,
> historical period, 1% increasing CO2 (to get the TCR),
> and 2xCO2 (for the model ECS) – and build a multiply
> constrained estimate taking into account internal
> variability, forcing uncertainties, and model scope.
> This will be possible with data from CMIP5 ….

How soon? Is there any coordination among those doing this, before papers get to publication, so you know what’s being done by which group, and all the scientists are aware of each other’s work so they, taken as a group, can nail down as many loose ends as possible?

Jim Larsen:

January 4th, 2013 at 5:10 PM

Volume has a more immediate signal than extent. In other words, measuring extent masks the problem. Since we can now talk in either term, it is a disservice for the IPCC to speak in terms of extent. I suggest the whole sea ice section be re-written with a volume-centric view. I’m betting all those “models more or less worked for extent up to 2011” would turn into “models were way off on volume through 2012”.

January 4th, 2013 at 5:11 PM

Oops, I see that’s been answered:

http://www.metoffice.gov.uk/research/news/cmip5
“… (CMIP5) is an internationally coordinated activity to perform climate model simulations for a common set of experiments across all the world’s major climate modelling centres….
…. and deliver the results to a publicly available database. The CMIP5 modelling exercise involved many more experiments and many more model-years of simulation than previous CMIP projects, and has been referred to as “the moon-shot of climate modelling” by Gerry Meehl, a senior member of the international steering committee, WGCM…..”

Lennart van der Linde:

January 4th, 2013 at 5:16 PM

Do I understand correctly that this paper suggests a current CS of about 4 degrees C and earth system sensitivity of about 5 degrees, and seems to rule out CS-values lower than 3 degrees?

They also speak about sea level sensitivity as being higher than current ice sheet models show. It seems about 500 ppm CO2 could eventually mean an ice free planet, much lower than the circa 1000 ppm that ice sheet models seem to estimate.

Any thoughts on this approach and these conclusions?

January 4th, 2013 at 7:53 PM

Splendid word, I’d guess a typo, in the Hansen conclusion:

“16×CO2 is conceivable, but of course governments would not be so foolhearty….”

January 5th, 2013 at 4:24 AM

Gavin

This is not the subject, but it seems that in AR5 (sorry, it is the leaked version) the mean total aerosol forcing is about 30% smaller than the same forcing in AR4 (-0.9 W/m2 against -1.3 W/m2).
On this link, http://data.giss.nasa.gov/modelforce/RadF.txt , NASA-GISS provides a total aerosol forcing, in 2011, of -1.84 W/m2.
I think that, while it is easy to reconcile a 3°C sensitivity with -1.84 W/m2, it seems impossible with -0.9 W/m2 (the new IPCC mean forcing); maybe a 2°C sensitivity works better.
So, is there another aerosol effect (different from the adjustment) accounted for by the models, or other things?

[Response: That file is the result of an inverse calculation in Hansen et al, 2011. You need to read that for the rationale. The forcings in our CMIP5 runs are smaller. – gavin]
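The reasoning in the comment above can be sketched with a zero-dimensional energy balance: holding the observed warming and ocean heat uptake fixed, a weaker (less negative) aerosol forcing leaves more net forcing to explain the same warming, implying a lower effective sensitivity. All values below are round illustrative numbers, not the actual AR4/AR5 or GISS estimates:

```python
# Zero-dimensional effective-sensitivity estimate under two assumed aerosol
# forcings. Illustrative numbers only.
F_2XCO2 = 3.7       # W/m2 per CO2 doubling
F_GHG = 3.0         # W/m2, greenhouse gas forcing (illustrative)
DT = 0.8            # C, observed warming (illustrative)
OCEAN_UPTAKE = 0.5  # W/m2, heat flux into the ocean (illustrative)

for F_aerosol in (-1.5, -0.9):  # stronger vs weaker aerosol forcing
    net = F_GHG + F_aerosol - OCEAN_UPTAKE  # W/m2 realised as surface warming
    sens = F_2XCO2 * DT / net               # C per doubling, effective
    print(f"aerosol {F_aerosol} W/m2 -> implied sensitivity {sens:.1f} C")
```

With these made-up inputs the stronger aerosol case implies roughly 3C per doubling and the weaker case roughly 2C, which is the shape of the commenter's concern; the real answer depends on the uncertainties in every term.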

Paul Williams:

January 5th, 2013 at 7:56 AM

On the studies of sensitivity based on the last glacial maximum, what reduction in solar forcing is used based on the increased albedo of the ice sheets, snow and desert? It doesn’t appear to be outlined in the papers.

Jack Wolf:

January 5th, 2013 at 9:31 AM

This is off topic, but I was wondering about the Alaska earthquake this morning and its impact on the methane hydrates along the continental shelf. Info on this would be helpful.

Dan H.:

January 5th, 2013 at 12:07 PM

Geoff,
My bet would be the opposite. Historically, a new low sea ice extent (area) is set every five years, with small recoveries in-between. My bet would be that 2012 was an overshoot, and that the next three years will show higher extents and areas. The next lower sea ice will occur sometime thereafter.

Lennart van der Linde:

January 5th, 2013 at 5:35 PM

Looking again at Hansen’s submitted paper leaves me guessing his earth system sensitivity in the current state is a little more than 5 degrees C, maybe more like 6-8 degrees. Any other interpretations?

Jim Larsen:

January 5th, 2013 at 7:06 PM

26 Paul W asked, “On the studies of sensitivity based on the last glacial maximum, what reduction in solar forcing is used based on the increased albedo of the ice sheets, snow and desert? It doesn’t appear to be outlined in the papers.”

Yes, the obvious questions that make the most sense are often missing. What’s the total watts/m2 of the initial orbital push from LGM to HCO (totally silent on this), and what’s the total increase in temperature (4-6C?)?

Combine the two and you’ve got a total system sensitivity for conditions during an ice age. I’ve heard that sensitivity for current conditions is probably higher, but regardless, isn’t that the first thing one would want answered about climate sensitivity?

1. What was the initial push historically?
2. What was the final result (pre-industrial temps)?
3. What is the current push?

RC often touches on the last two, but the answer to the all-important first question is rarely (if ever – I don’t ever remember seeing an answer) mentioned even though it seems to be the best way to derive some sort of prediction about the future that doesn’t rely on not-ready-for-prime-time systems.

Has anybody ever heard of an estimate of the initial orbital forcing from LGM to HCO?

January 5th, 2013 at 8:20 PM

#28–Dan H wrote:

My bet would be the opposite. Historically, a new low sea ice extent (area) is set every five years, with small recoveries in-between. My bet would be that 2012 was an overshoot, and that the next three years will show higher extents and areas. The next lower sea ice will occur sometime thereafter.

Maybe. But didn’t we have a conversation here on RC, not so long ago, about the virtues and vices of extrapolation?

I’m looking at the winter temps from 80 N this year (continuing toasty, relatively), and thinking about ENSO–neutral is now favored through spring–and remembering a) that the weather last year was rather unremarkable for melt and b) we’re still at the height of the solar cycle, more or less.

Throw in a quick consult with some chicken entrails, and I’ve concluded that I wouldn’t bet on Dan’s extrapolation.

Lennart van der Linde:

January 6th, 2013 at 4:20 AM

Jim Larsen #30,
I think Jim Hansen mentions the initial orbital forcing for glaciation-deglaciation to be less than 1 W/m2 averaged over the planet, maybe just a few tenths of a W/m2. The resulting slow GHG and albedo feedbacks are about 3 W/m2 each, in his calculation.

So what happens if the initial GHG forcing now is about 4 W/m2? Would that mean slow feedbacks would total tens of W/m2? Or less? It seems Hansen thinks less, about 4 W/m2 as well, but I don’t really understand why. Does it have to do with the initial orbital forcing being much stronger or effective locally, at the poles?

It seems to me Hansen is really still struggling to understand this himself, and as a consequence his papers are not fully clear yet. Or maybe I just don’t understand clearly enough myself.
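The bookkeeping described in the comment above can be laid out explicitly. Here the small orbital trigger is excluded and the slow glacial GHG and ice-albedo changes (~3 W/m2 each, per the comment) are treated as forcings driving the full glacial cooling; the cooling value is an illustrative midpoint, not Hansen's published number:

```python
# LGM fast-feedback sensitivity from round glacial forcing/cooling numbers.
F_GHG_LGM = 3.0     # W/m2, glacial greenhouse gas change (per the comment)
F_ALBEDO_LGM = 3.0  # W/m2, glacial ice/vegetation albedo change (per the comment)
DT_LGM = 4.5        # C, glacial-interglacial cooling (illustrative midpoint)
F_2XCO2 = 3.7       # W/m2 per CO2 doubling

lam = DT_LGM / (F_GHG_LGM + F_ALBEDO_LGM)  # C per (W/m2)
print(f"lambda = {lam:.2f} C/(W/m2) -> {lam * F_2XCO2:.1f} C per doubling")
```

This is the standard way the LGM is used to constrain the fast-feedback sensitivity; the Earth system sensitivity is higher precisely because the GHG and albedo terms become feedbacks rather than forcings on longer timescales.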

You say- “Does it have to do with the initial orbital forcing being much stronger or effective locally, at the poles?”

The northern hemisphere has much more land than the southern hemisphere and they are therefore affected differentially by orbital forcing.

Steve

January 7th, 2013 at 5:23 AM

Dan H #28: I wouldn’t be so sure that 2012 is an outlier. Look at the second animation here. The last few years show ice thickness consistently below previous levels. The pattern is oscillation with a downward trend but around 2007, the previous record year for minimum extent, there’s a big drop, then another one in 2010. With increasingly less multi-season ice, rebuilding previous sea ice extent gets harder and harder. I’m sure some people also thought 2007 was an outlier.

Of course you could be right that there’s some oscillation before it dips again, but I wouldn’t bet on it. There was nothing special about 2012 conditions to have caused a big dip (e.g. SOI index didn’t show any big El Niño events over the year).

Dan H.:

January 7th, 2013 at 7:23 AM

Kevin,
Fair enough. However, looking at the decrease in sea ice minimum over the past decade or so, both the 2007 and 2012 minima crashed through the previous lows in a typical overshoot pattern. Recently, a new low has been set every five years (2002, 2007, & 2012), with modest recoveries in-between. Last year looks remarkably similar to 2007.

Paul Williams:

January 7th, 2013 at 8:50 AM

Doesn’t -3.5 W/m2 from the ice age albedo forcing seem like an awfully low figure?

The Arctic sea ice melting out above 75N would have almost no impact at all if that is the forcing change from glaciers down to Chicago and sea ice down to 45N (at lower latitudes, where albedo has much more impact).

Paul S:

January 7th, 2013 at 12:26 PM

Paul Williams,

I can’t tell where you got the figure but -3.5W/m2 is about right for current understanding of “boundary condition” land albedo change between pre-industrial and LGM. In LGM simulations land albedo changes are prescribed (at least in regards to ice sheets and altered topography due to sea level; there are also feedback land albedo changes), so are a forcing, whereas sea ice is determined interactively by the model climate, so is a feedback in this framework.

Ric Merritt:

January 7th, 2013 at 1:28 PM

Geoff Beacon #19: To answer your question, if you mean you expect the NSIDC to announce a September arctic sea ice minimum below, say 1M sq km, by 2015, I would bet against, but not a huge amount, because of uncertainty.

This is not due to any denialist illness, or any reluctance to put my money where my mouth is. Over the last several years, when irritated by trolls on DotEarth, Joe Romm’s site, or the like, I have repeatedly offered to bet more than my current middle-class salary, indexed to the S&P 500 at time of settling up, on the course of global temperatures over decades. Strangely enough, I never got a serious bite.

But my previous point, which you kinda ignored, was that future expectations, which are of course what folks make bets about, are fundamentally different from pointing out a difference between carefully recorded past expectations and carefully recorded (probably recent) past measurements. I think the conversation is clearer if we keep that straight.

#36–“Last year, looks remarkably similar to 2007.”

Only in terms of the magnitude of the extent drop. But if there’s one thing I’ve learned about watching sea ice melt, it is that it ‘loves’ to confound.

January 7th, 2013 at 2:36 PM

#37–Maybe, but IIRC, I saw an estimate of 0.7 W/m2 for an ice-free Arctic summer. So, maybe not–though the 0.7 estimate was probably somewhat of a ‘spherical cow in a vacuum’ deal.

David Lea:

January 7th, 2013 at 3:28 PM

@James Annan (#14). Thanks for posting your paper. I think there is a disconnect between the modeling and paleodata community that is affecting your estimates. The data community (geochemical proxies) would argue that we’ve solidly established the 2.5-3 deg cooling level for the deep tropics during the LGM. The MARGO data is dominated by older foram transfer function estimates, which even its most ardent practitioners would agree do not record tropical changes accurately. This is an important point that is affecting a number of recent estimates of sensitivity using MARGO data.

[Response: David, thanks for dropping by. I take it you mean that the Margo data is resulting in underestimates of climate sensitivity? –eric]

David Lea:

January 7th, 2013 at 4:11 PM

@Eric: Yes, that’s the implication. If you look at Fig. 2 in Hargreaves et al, the observational band for LGM tropical cooling they use, based on MARGO, is -1.1 to -2.5 deg C, equating to a sensitivity of about 2.5 deg. Using an estimate of the mean tropical cooling based on geochemical proxies of 2.5-3 deg would yield a sensitivity closer to 3.5 deg (but perhaps Julia will comment).
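Taking the scaling in the comment above literally (crude midpoint arithmetic, not the papers' actual inference): if the MARGO cooling band maps onto a sensitivity near 2.5 deg, a linear rescaling to the geochemical cooling band lands in the same ballpark as the ~3.5 deg quoted:

```python
# Linear rescaling of an LGM-tropical-cooling-based sensitivity estimate.
# Midpoints of the two cooling bands quoted in the comment; illustrative only.
margo_cooling_mid = (1.1 + 2.5) / 2    # C, MARGO-based tropical cooling
margo_sensitivity = 2.5                # C per doubling implied by that band
geochem_cooling_mid = (2.5 + 3.0) / 2  # C, geochemical-proxy tropical cooling

scaled = margo_sensitivity * geochem_cooling_mid / margo_cooling_mid
print(f"scaled sensitivity: ~{scaled:.1f} C per doubling")
```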

David B. Benson:

January 7th, 2013 at 5:21 PM

Paul Williams @37 — The ice sheets become dirtier over time.

January 7th, 2013 at 5:32 PM

Do the ‘older foram transfer function estimates’ make different calculations but using the same original material? Or is this new field data? How did the geochemists come by the ‘geochemical proxies of 2.5-3’ now favored?

David Lea:

January 7th, 2013 at 6:32 PM

@Hank Roberts #45. I believe that the transfer function estimates used in MARGO are based on the traditional method used in CLIMAP, rather than newer approaches. And I also believe it is largely the same data set used in CLIMAP. As for the geochemical data, it is based on Mg/Ca in foraminifera, alkenone unsaturation in sediments and some sparse data from other techniques such as Ca isotopes, clumped isotopes and TEX86. The -2.5 to -3.0 deg cooling value is my subjective estimate based on knowledge of the data and various published compilations. Although LGM oxygen isotope changes cannot be used to independently assess cooling, they provide a useful additional constraint that is difficult to reconcile with a cooling much less than 3 deg.

This helps (I’d heard some of the terms, I’d have to look up all of ’em again as most of what I know is decades out of date)

Please go on at as much length as you have patience for.

January 7th, 2013 at 11:04 PM

> affecting a number of recent estimates
> of sensitivity using MARGO data.

Time to invite all the authors whose work is affected to a barbeque?

How hard is it to revise a paper if the author (or reviewer, or editor) decides this change should be made? Simple, or complicated?

Bill Woolverton:

January 8th, 2013 at 2:08 AM

Dan H #28:
Not sure where you get the idea that a record low extent is set every five years. The previous record to 2007 was in 2005. As I recall the 2007 record resulted from very favourable weather conditions, so it would have been unlikely for another record to be set for several years (and it wasn’t). I think we can say that the 2007 melt made it more likely that another record smashing melt season would occur eventually given the right conditions, and that we can say the same about 2012.