The certainty of uncertainty

A paper on climate sensitivity published today in Science will no doubt see a great deal of press in the next few weeks. In “Why is climate sensitivity so unpredictable?”, Gerard Roe and Marcia Baker explore the origin of the range of climate sensitivities typically cited in the literature. In particular, they seek to explain the characteristic shape of the distribution of estimated climate sensitivities. This distribution includes a long tail towards values much higher than the commonly cited 2-4.5 degrees C warming for a doubling of CO2.

In essence, what Roe and Baker show is that this characteristic shape arises from the non-linear relationship between the strength of climate feedbacks (f) and the resulting temperature response (deltaT), which is proportional to 1/(1-f). They show that this places a strong constraint on our ability to determine a specific “true” value of climate sensitivity, S. These results could well be taken to suggest that climate sensitivity is so uncertain as to be effectively unknowable. This would be quite wrong.
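The shape Roe and Baker describe is easy to reproduce numerically: feed a symmetric (Gaussian) uncertainty in the feedback strength f through 1/(1-f) and a long upper tail in S appears. The numbers below (the reference sensitivity S0, and the mean and spread of f) are illustrative assumptions, not values from the paper; a minimal sketch in Python:

```python
import numpy as np

rng = np.random.default_rng(42)

S0 = 1.2  # no-feedback (reference) sensitivity for doubled CO2, deg C

# Hypothetical Gaussian uncertainty in total feedback strength f; the
# mean and spread are illustrative, not Roe and Baker's fitted values.
f = rng.normal(0.65, 0.13, 1_000_000)
f = f[f < 1.0]            # drop unphysical runaway cases (f >= 1)

S = S0 / (1.0 - f)        # sensitivity, deg C per CO2 doubling

print(f"median S     = {np.median(S):.2f} C")
print(f"P(S > 4.5 C) = {(S > 4.5).mean():.3f}")
print(f"P(S > 8 C)   = {(S > 8.0).mean():.3f}")
```

Even though f is sampled symmetrically, the median lands in the canonical range while the probability of sensitivities well above 4.5 C stays distinctly non-zero: the skew comes from the algebra of 1/(1-f), not from any asymmetry in the inputs.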

The IPCC Summary For Policymakers shows the graph below for a business-as-usual carbon emissions scenario, comparing temperatures in the 1980s with temperatures in the 2020s (orange) and 2090s (red). The latter period is roughly when CO2 will have doubled under this scenario. The resulting global temperature changes cluster between 2 and 5 degrees C, but with a non-zero probability of a small negative temperature change and long tail suggesting somewhat higher probabilities of a very high temperature change (up to 8 degrees is shown).

We have very strong evidence for the middle range of climate sensitivities cited by the IPCC. But what Roe and Baker emphasize is that ruling out very high sensitivities is very difficult, because even relatively small feedbacks, if they are highly uncertain, can have a very large impact on our ability to determine S.

Paleoclimate data do provide a means to constrain the tail on the distribution and perhaps to show that the likelihood of large values of S is lower than Roe and Baker’s calculations suggest. In particular, Annan and Hargreaves (2006) used a Bayesian statistical approach that combines information from both 20th century observations and from last glacial maximum data to produce an estimate of climate sensitivity that is much better constrained than by either set of observations alone (see our post on this, here). Their result is a mean value of deltaT close to 3ºC, and a high probability that the sensitivity is less than 4.5ºC, for a doubling of CO2 above pre-industrial levels. Thus, we emphasize that Roe and Baker’s results do not really tell us that, for example, 11°C of global warming in the next century is any likelier than we have suggested previously.

On the other hand, there is a counterpoint to such a comforting result. Roe and Baker note that the extreme warmth of the Eocene — something that has stymied climate modelers — could in principle be explained by not-very-dramatic changes in the strengths of the feedbacks, again because small changes in f can produce dramatic change in S. The boundary conditions for Eocene climate remain too poorly known to include in a formal calculation of climate sensitivity, but at the very least the extreme climate of this time suggests that we cannot readily cut the tail off the probability distribution of S.

It would be wrong to think that climate scientists have been ignorant of the non-linear nature of feedbacks on climate sensitivity. Several papers dating back a couple of decades show essentially the same result (for example, Hansen et al., 1984; Schlesinger, 1988; see below for full citations). But Roe and Baker’s paper is probably the most succinct and accessible treatment of the subject to date, and is a timely reminder of some very basic points that are not always appreciated. For example, it is often assumed that the tail on the distribution of climate sensitivity is due to the large uncertainty in some feedbacks, particularly clouds. Roe and Baker make it very clear that this is not the case. (The tail in S results from the probability distribution of the feedback strengths, and unless those uncertainties are distributed very, very differently than the Gaussian distribution assumed by Roe and Baker, the tail will remain). Furthermore, they point out that “uncertainty” in the feedbacks need not mean “lack of knowledge” but may also reflect the complexity of the feedback processes themselves. That is to say, because the strengths of the feedbacks are themselves variable, the true climate sensitivity (not just our ability to know what it is) is inherently uncertain.

What will get the most discussion in the popular press, of course, are the policy implications of Roe and Baker’s paper. Myles Allen and David Frame take a stab at this in their Perspective.* Their chief point is that it is probably a bad idea to assign a specific threshold value for CO2 concentrations in the atmosphere, above which “dangerous interference in the climate system” may result. For example, 450 ppm is an oft-cited threshold since this keeps deltaT below 2°C using standard climate sensitivities. But the skewed nature of the distribution of possible sensitivities means that it is much more likely that 450 ppm will give us more than 4.5°C of global warming than less than 2°C.
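Where the pairing of 450 ppm with 2°C comes from can be checked against the standard logarithmic dependence of forcing on CO2 concentration. This is a back-of-envelope sketch, not the models' calculation; the 280 ppm pre-industrial baseline is the usual assumption:

```python
import math

C0 = 280.0  # assumed pre-industrial CO2, ppm

def equilibrium_warming(c_ppm, sensitivity):
    """Warming for a given CO2 level under the logarithmic forcing law."""
    return sensitivity * math.log(c_ppm / C0) / math.log(2.0)

# 450 ppm with the canonical 3 C-per-doubling sensitivity: just above 2 C
print(f"{equilibrium_warming(450, 3.0):.2f} C")
# the same concentration at the top of the standard IPCC range
print(f"{equilibrium_warming(450, 4.5):.2f} C")
```

With the canonical sensitivity the answer sits right at the 2°C threshold; with a sensitivity from the high tail of the distribution, the same 450 ppm delivers considerably more warming.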

Allen and Frame suggest that the way to address this is through an adaptive climate change policy, in which there are movable CO2 concentration targets that can be revised downwards if future observations suggest that the climate sensitivity is indeed greater than the middle IPCC range. We agree that adaptive policies are needed. There is no point in continuing to pursue a 450 ppm stabilization goal if temperatures have already exceeded the expected 2 deg C; more reductions would be called for. Similarly, if temperature rises more slowly than expected, that would buy time. However, in our view, Allen and Frame’s discussion turns the precautionary principle on its head by implying that downward revision can always be done later, after more data are in. But a good adaptive strategy depends on nimble action and forward thinking — both of which are typically in short supply. If reactions to worse-than-expected climate change are delayed, an overshoot of any temperature target becomes very likely, and corrective action very expensive. Thus conservative strategies would seem in order, which probably implies initial targets much lower than 450 ppm, still subject to further revision.

The bottom line is that climate sensitivity is uncertain, but we can pretty much rule out low values that would imply there is nothing to worry about. The possibility of high values will be much harder to rule out. This is something policy makers should recognize and confront.

251 Responses to “The certainty of uncertainty”

I myself was taken aback by the Allen and Frame discussion to the effect that we can lowball the estimate in confronting climate change and always revise it upwards if needed. Particularly since 3 C is so well supported by Annan and Hargreaves (2006).

One thing I am curious about within this context: Hansen has been using a value of 3 C at least since 1993. However, this is what he calls the short-run Charney climate sensitivity; he argues that in the long run, given feedbacks from elements of the climate system which are treated in the short run as boundary conditions, the long-term climate sensitivity is more like 6 C.

I quote:

Hansen et al. (1993) calculated the ice age forcing due to surface albedo change to be 3.5 +/- Wm^-2. The total surface and atmospheric forcings led Hansen et al. (1993) to infer an equilibrium global climate sensitivity of 3 +/- 1C for doubled CO2 forcing, equivalent to 3/4 +/- 1/4 C (Wm^-2)^-1. This empirical climate sensitivity corresponds to the Charney (1979) definition of climate sensitivity, in which ‘fast feedback’ processes are allowed to operate, but long-lived atmospheric gases, ice sheet area, land area and vegetation cover are fixed forcings.

pg 1929

—

Climate sensitivity with surface properties free to change (but with GHG specified as a forcing, a choice relevant to the twenty-first century) is defined in figure 1, which reveals Antarctic temperature increase of 3 C (Wm^-2)^-1. Global temperature change is about half that in Antarctica, so this equilibrium global climate sensitivity is 1.5 C (Wm^-2)^-1, double the fast-feedback (Charney) sensitivity.

Additionally it would seem that so long as one is dealing with only the short-run, in principle at least, climate sensitivity should be well-defined as one is dealing principally with the feedback from water vapor and sea-ice. One does not have to worry about instabilities associated with ice sheets, feedback from the carbon cycle (even though this would seem to already be coming into play), or instabilities associated with ocean circulation.

In Annan and Hargreaves’ work, much of the analysis dealt with fast feedbacks, but they also mention comparisons between the most recent ice age and today, which would suggest that long-term feedbacks were being included. In particular, ice sheets appear to be involved.

Is this consistent with Jim Hansen’s analysis? Likewise, is it possible that Roe and Baker are in some way blurring the distinction between short-term and long-term? At a deeper level, what do you see as being the relationship between the three analyses?

[Response: Thanks for the thoughtful comment. My response probably won’t go into the detail that you want, I’m afraid, though others may want to chime in. But I can answer some of your questions. Roe and Baker, I would say, are not really making this distinction at all. As you correctly point out, the longer-term effects of ice sheets, etc. can amplify things further, but this could be said to be included in the uncertainty in f in Roe and Baker. (Though ice sheets are an example of a very non-Gaussian distribution in f). Annan and Hargreaves’ analysis effectively looks at the short term only, since they treat the ice albedo as a forcing, not a feedback. –eric]

Unfortunately, in the policy realm there is often a trade-off between efficiency and flexibility. Firms want a fair degree of price certainty in long-term investment decisions, and the allocation of carbon-based assets (like permits) makes it somewhat problematic to reduce the number of permits in circulation in response to reductions in scientific uncertainty.

In general, it’s much easier to loosen a climate policy than tighten it.

“(…) initial targets of much lower than 450 ppm (…)” Er, the current level of 381 ppm is only 15% lower than that, and is it reasonable to call it “much lower”? Doesn’t seem like it…

Re #1: Timothy, if I’m reading Hansen correctly that extra 3C is more or less a one-time pulse associated with ice sheet melt; IOW sensitivity returns to 3C after they’re gone. What’s not clear from my admittedly inexpert reading of the paper is whether that transient sensitivity goes away (with the ice sheets) with that initial 3C. I’d love an answer to that.

I fail to see how ‘uncertainty’ supports the skeptics’ position. We have done a great deal to eliminate low-end climate sensitivity, and it seems we can attribute ~3 C per 2x CO2 with high confidence (anything lower than 2 now seems very unlikely). In the political arena, things like natural variability and uncertainty are introduced quite often, but in science the logic is reversed: if the past is more variable than we think, or the higher-end projections are more uncertain, there is more worry for the future. It appears that real-world observations, and not the wishful thinking of “uncertainty”, show that if anything, problems are arising faster than anticipated. With a highly variable past, and with the uncertainties in feedbacks and tipping points, you can hope for an exact cancellation of human effects, but you can also fear a great amplification. The CO2 physics is easy, the water vapor is pretty easy, and we still have more to understand about aerosols, clouds and ocean circulation, but you can’t just say this means there is no problem: it doesn’t follow.

Really, such efforts to hide behind what we don’t know (implying we know nothing) demonstrate the intellectual bankruptcy of those who say ‘do nothing.’ This paper from RC’s own Ray Pierrehumbert shows some of the things at risk. A more variable, hence higher-feedback, world would indicate bigger future changes, and this is nothing trivial, so bashing climate models or some things that still need to be worked out won’t make the AR4 WG2 report on impacts go away. – Chris

Present CO2 levels are about 384 ppmv, and the present rate of increase is about 2 ppmv/yr; if the present rate remains constant we’ll hit 450 in just 33 years — right around 2040. But as comment #2 points out, emissions are growing and sinks are weakening.

So, is setting targets to be *less* than 450 at all realistic? Does all of this add to the urgency of limiting carbon emissions?
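The timetable above is easy to reproduce, and to stress-test against the point that emissions are growing. The 0.02 ppm/yr-per-year acceleration below is purely an illustrative assumption:

```python
def years_to_target(c_now, target, rate, accel=0.0):
    """Count years until CO2 reaches `target` ppm, stepping annually."""
    c, years = c_now, 0
    while c < target:
        c += rate
        rate += accel       # optional growth in the annual increment
        years += 1
    return years

print(years_to_target(384, 450, 2.0))        # constant 2 ppm/yr: 33 years
print(years_to_target(384, 450, 2.0, 0.02))  # sooner if the rate creeps up
```

Even a very small assumed acceleration shaves several years off the constant-rate estimate, which is the commenter's point about urgency.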

I am an advocate for setting a much lower goal for CO2 concentrations: 315 ppmv.

This was the value in 1950, when the direct atmospheric measurement sequence began. Also, at that value, the relative forcing would be about 1/3 of the current value (leaving out methane, NOx, aerosols, black carbon, etc.)
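That "about 1/3" figure can be checked with the widely used simplified logarithmic fit for CO2 forcing. This is a back-of-envelope sketch that, as the comment itself notes, leaves out methane, NOx, aerosols, black carbon, etc.:

```python
import math

def co2_forcing(c_ppm, c0=280.0):
    """Simplified logarithmic fit for CO2 radiative forcing, W/m^2."""
    return 5.35 * math.log(c_ppm / c0)

f_now = co2_forcing(384)   # roughly the present-day concentration
f_315 = co2_forcing(315)   # the 1950 value proposed as a target
print(f"{f_315:.2f} vs {f_now:.2f} W/m^2, ratio {f_315 / f_now:.2f}")
```

The ratio comes out nearer 0.37 than exactly one-third, consistent with the comment's rough estimate.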

Re #1: Timothy, if I’m reading Hansen correctly that extra 3C is more or less a one-time pulse associated with ice sheet melt; IOW sensitivity returns to 3C after they’re gone. What’s not clear from my admittedly inexpert reading of the paper is whether that transient sensitivity goes away (with the ice sheets) with that initial 3C. I’d love an answer to that.

I would agree that the sensitivity boost from ice sheets will be a one-time affair, but as we are losing ice sheets, the temperature will rise in accordance with the 6 C long-run figure; so if one doubling takes us up to 6 C and there are no further forcings, we will remain at 6 C until the CO2 begins to drop.

6 C is something that I would like to see us avoid at any point, whether it be by the turn of the century or 500 yrs hence.

And I think this issue of climate sensitivity doesn’t consider where those extra GHGs come from and “nature’s sensitivity” to the warming & its many effects. It might be that most come from our direct emissions, but they could also come from nature not uptaking as much as it has been (due to the heat, CO2 overload, and/or all the many effects from global warming and increasing CO2 in the atmosphere, and ocean acidification, and subsidiary effects (from measures that emit GHGs), such as pollution and acid rain harming forests, plants, and soils)….

And/or it could come from nature emitting GHGs as a response to the warming and its many effects (right now I’m thinking wildfires, but there’s also methane from melting permafrost & ocean hydrates).

So there is this other uncertainty to consider — not only climate sensitivity, but nature’s sensitivity and response to climate change, and how much extra GHGs (and thus warming) that might entail.

The climate models did not predict last summer’s ice retreat or anything like it in the near future. The ice melt has been attributed to possible “anomalies” in the atmospheric circulation patterns or changes in the ocean currents. If we plot Arctic Sea Ice anomalies in terms of standard deviations from a baseline, then the combination of the summers of 2005 and 2007 starts to look like a real trend – that the models missed.

Therefore, it is likely that feedback mechanisms are missing from the GCMs. Thus, I am skeptical of climate sensitivity numbers developed from the current generation of GCMs. If I were making policy decisions that affected billions of people for generations to come, I would apply a generous safety factor, of at least 2 and maybe 10. If I wore a bow tie (the low-risk option), I might use a safety factor of 20. That implies putting the brakes on greenhouse gas emissions now! That means putting the brakes on HARD!

Consider the rapid increase in the number of moulins on Greenland over the last few years. Consider the rain events across broad swaths of Greenland last summer. Those rain events transferred heat from the ocean to the ice. The moulins allow that heat to be transferred rapidly into the depths of the ice. (This is a heat transfer that is not in the GCMs.) My experience is that when ice gets rained on, it falls apart and slides down the hill.

Do not worry about the economic impacts of putting the brakes on greenhouse gas emissions. Soon, (sooner than Al Gore dreams), we will have a problematic episode of sea level rise, and all the costs of reducing greenhouse gas emissions will seem trivial. However, by then, we will be at higher levels of emissions, there will be more panic, and the costs of abruptly reducing greenhouse gas emissions will be much higher.

#14 Yes but precisely, this agreement is rather empirical, a kind of prior assumption that Roe and Baker try to avoid in their paper. After all, there’s also something like a general agreement from paleoclimate that 2xCO2 will not provoke a ∆T of 15°C. But if I look at fig. 1 in the paper, it’s an analytical eventuality if f -> 1. So, I don’t clearly understand why the low end of the ∆T range is not 1°C (f = 0, so just the ∆T due to 2xCO2 without feedback) or even less (if f is negative for reasons unknown in our present understanding of climate). But I’m going to read the paper more carefully; the answer is probably there.

Actually, from the point of view of risk management, one multiplies probability by the cost of the scenario. Since cost probably increases quite super-linearly with temperature, it is possible that the tails of the distribution could dominate risk even though they are quite improbable. In this case, the appropriate course would be to spend considerable effort to nail down the feedbacks while concentrating mitigation efforts on the more probable outcomes. A flexible system such as cap and trade would likely be essential if we found that the worst-case feedback values were more probable than originally anticipated. A very interesting risk scenario.
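A toy numerical version of that risk-management argument, with wholly invented probabilities and a quadratic damage function (both are assumptions, chosen only to exhibit the mechanism):

```python
import numpy as np

# Toy outcome distribution: warming levels and made-up probabilities.
dT   = np.array([1, 2, 3, 4, 6, 8, 10], dtype=float)        # deg C
p    = np.array([0.05, 0.25, 0.35, 0.20, 0.10, 0.04, 0.01])
cost = dT ** 2        # assumed super-linear (quadratic) damage function

risk = p * cost       # contribution of each outcome to expected cost
tail_share = risk[dT >= 6].sum() / risk.sum()
print(f"probability mass at dT >= 6 C: {p[dT >= 6].sum():.0%}")
print(f"share of expected cost there:  {tail_share:.0%}")
```

Here roughly 15% of the probability mass carries about half of the expected cost, which is why tail outcomes can dominate a risk calculation even when they are individually unlikely.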

Would it be correct to say, then, that for infinitesimal changes, in the absence of sharp thresholds, the climate sensitivity tends to be predictable because the climate itself remains similar, but that as changes get larger, the sensitivity may grow or shrink? And that the climate must change significantly before significant shrinkage in sensitivity would be expected (if it were to shrink at all)? That would explain why it is easier to bracket the low end than the upper end, where more change leads to more change in sensitivity, which amplifies change even more.

RealClimate’s conclusion seems a good summary for policy-makers of Roe and Baker’s work: “climate sensitivity is uncertain, but we can pretty much rule out low values that would imply there is nothing to worry about. The possibility of high values will be much harder to rule out.”

It is a pity that Roe and Baker did not say something as clear as that in their article because it leaves their work open to abuse by people trying to undermine the momentum for major policy changes.

Allen and Frame’s suggestion that we should “resist the temptation to fix a concentration target early on” because “Once fixed, it may be politically impossible to reduce it” seems hard to reconcile with their faith in “our descendants [having] the sense to adapt their policies to the emerging climate change signal …”. They seem to think our descendants will act sensibly but we will not.

Re #16, Roe and Baker don’t address why climatologists take f to be positive. But when all factors are accounted for, the net result is positive, i.e. in the same direction as the change. Examples include ice-albedo, water vapor, and carbon-cycle feedbacks.

#15, Aaron is correct about sensitivity doubts in the present tense.
Current 2007 Northern Hemisphere temperature anomalies were running very warm until the great ice melt. I won’t be surprised to see a return of very strong positive anomalies starting with this October’s NH monthly result, as the polar ice returns to its nearly full, but much thinner, extent. Energy transfers between sea, air and the cryosphere may be ill-defined by a simple sensitivity of the lower troposphere alone. I would agree with the assessment given here that sensitivity will lie at the higher rather than lower figure, because it is already quite strong, but only if it shows up near the ground.
Sensitivity variances should be the norm, not the exception.

Will the anomaly keep growing? I suspect that there is a fair chance that the anomaly will mostly continue to grow, perhaps until the Arctic is almost sea ice free. If so, the Arctic might be almost sea ice free by next summer, or the summer after that.

Gerard Roe was interviewed on BBC News today and I wasn’t impressed – what he said was accurate, but imagining myself into the mind of a “sceptic-in-the-street”, I would certainly have got the impression from the interview that the science was far too uncertain to justify taking any action at all to reduce emissions. Another case, IMO, of a scientist not being able to imagine how an ordinary layman is likely to interpret what they say, and how to communicate with the public in language that they will not misinterpret.

Ref 20 Phil writes “Will the anomaly keep growing? I suspect that there is a fair chance that the anomaly will mostly continue to grow, perhaps until the Arctic is almost sea ice free. If so, the Arctic might be almost sea ice free by next summer, or the summer after that.”
According to NOAA/NSIDC the amounts of arctic sea ice in different months were as follows:-
March 2006 14.4 million sq kms
March 2007 14.7 million sq kms
September 2006 5.9 million sq kms
September 2007 4.3 million sq kms

Anyone like to put their name on what future values will be? I will try and keep the values and names and post them at an appropriate time in the future. My two guesses are as follows. March 2008 14.2 million sq kms. September 2008 4.8 million sq kms.
It should be remembered that in 2005 the accumulated cyclone energy (ACE) value in the North Atlantic was around 250. In 2006 it was around 70, and this year to date is around 60. Will history repeat itself?

A dramatic decline in the ability of the Earth to soak up man-made emissions of carbon dioxide, and a corresponding acceleration in the rate of increase of greenhouse gas in the atmosphere, have been detected for the first time by scientists.

This two-pronged feedback must be very difficult to model accurately, and the oft-stated view of RC that deltaT is 3 C does not tell us when this is going to happen.

I know that it is a left-wing piece and that people referenced in the piece may have been misquoted, but it lends itself to the notion that the present warming is being brought forward by decades.

The worst thing about AGW is the rate of change, which is putting unprecedented stress on the Earth’s systems to deal with human CO2 emissions. Surely there are feedbacks everywhere, and the non-linear effects are likely to be greater if the rate of change is?

Re #20. I recently had an email exchange with Bill Chapman at Cryosphere Today about the likely change in the anomaly over the winter. Without posting his email on this public site, I’ll summarize. Basically he confirmed my amateur guess: that the arctic winter is invariably long enough and cold enough to freeze essentially all of the surface water of the arctic ocean. The onset of this freezing has been delayed and slowed by the unusual warmth of the water, which is why the anomaly has increased since the start of winter. But it is coming. So he’s expecting the anomaly to rapidly become less negative, within the next month or so.

So the ocean is going to skin over with ice this winter, as it always does. This is an inevitable consequence of the axial tilt: the only way the ocean could stay unfrozen through February would be for it to start the winter very warm indeed. Thus the winter maximum area is thought to be much less sensitive than the summer minimum area.

However, unless this winter is unusually cold, the ice will be very thin (as it will have had less time to form). So if we have normal melt season weather next year, there will be a large anomaly again. If the melt season is like 2005 or 2007, all bets are off.

I hope I’ve summarized Chapman correctly. In short, expect the anomaly to head back towards 1 million square kilometres very soon, but only for a season.

“• albedo decreases as ice melts (ice is perhaps 80% reflective, while ocean albedo can be as low as 3.5%)
• increased water vapor in a warmer climate
• warmer oceans absorb less carbon dioxide
• warmer soils release carbon dioxide and methane
• plants in a hotter climate are darker

Negative feedbacks may include shifts in clouds. ”

I don’t entirely agree with you.

For the oceans, there is the possibility of “surface” (100 to 200 m thick) ocean waters cooling more strongly than forecast.
Think about wind-driven upwelling, and please look at the current Southern Hemisphere SST trend.
For the clouds we don’t know the exact amount of future low and high clouds.
There is also a fertilization effect of CO2.

I’m not a denialist, but I’m very worried by the certitude of some people here and by the absence of response from contributors when there is an inconvenient question.

A quick weigh-in on a minor point that I, a sceptic, wish to clarify. I believe the mathematical uncertainties discussed here neither enhance nor detract from valid scepticism. Uncertainties and probabilities are simple facts of science (they can be consciously manipulated, but I don’t see that here). I will admit that some sceptics might jump on the changing uncertainties, but I can’t help that. I am a little chagrined that some here desperately want to take only the worst case (just as some sceptics want only the “best” case) and make that gospel, or even worse, to the point of “overreaction” to be on the “safe” side, as two or three here have implied, and to the point of not disclosing the true picture, which only gives sceptics fodder, as Hudson (12) and Dave (21) imply. I don’t think the uncertainties per se, or even the changing uncertainties, make or break the science of AGW in any way, though they might have an effect on what to do, à la Ray’s #17.

“The climate models did not predict last summer’s ice retreat or anything like it in the near future. The ice melt has been attributed to possible “anomalies” in the atmospheric circulation patterns or changes in the ocean currents. If we plot Arctic Sea Ice anomalies in terms of standard deviations from a baseline, then the combination of the summers of 2005 and 2007 starts to look like a real trend – that the models missed. … Do not worry about the economic impacts of putting the brakes on greenhouse gas emissions. Soon, (sooner than Al Gore dreams), we will have a problematic episode of sea level rise. …”

I guess I don’t understand how a climate model could reflect a linear expectation for centuries and also contain a trigger for a nonlinear collapse within the timeframe of the organizer on Al Gore’s Blackberry.

“The researchers are examining what caused the rapid decrease in the perennial sea ice. Data from the National Centers for Environmental Prediction, Boulder, Colo., suggest that winds pushed perennial ice from the East to the West Arctic Ocean (primarily located above North America) and significantly moved ice out of the Fram Strait, an area located between Greenland and Spitsbergen, Norway. This movement of ice out of the Arctic is a different mechanism for ice shrinkage than the melting of Arctic sea ice, but it produces the same results – a reduction in the amount of perennial Arctic sea ice.”

“Nghiem cautioned the recent Arctic changes are not well understood and many questions remain. “It’s vital that we continue to closely monitor this region, using both satellite and surface-based data,” he said.”

Who knew? Could someone please point me to the literature on AGW causing more wind?
Thanks.

That’s a known reversing pattern — positive ACRI phase characterized by cyclonic ocean circulation and a warmer and wetter climate.

Like El Nino/La Nina and much else in the climate system, these go back and forth.

What happens when the average temperature matches what used to be the intermittent extreme temperature?

In the polar regions — where warming happens fastest — what happens more often? It’s warmer and wetter. It was raining near the North Pole when the Polarstern icebreaker got closest to the Pole during the summer. Does the wind pattern change along with the temperature? Let’s see.

You ask who knew? The climate scientists.
Who had no clue? The people trying to pretend nobody knew.

Further to Aaron’s post in #15, if the current generation of GCMs do not properly include ice sheet dynamics and interactions with the oceans etc., are not the PDFs and their moments compromised, and if so to what extent? Given such incompleteness (effectively a Gibbs-type phenomenon), what level of robustness or convergence can be ascribed to the parameter which people refer to as climate sensitivity? Furthermore, what physical meaning can be given to such a parameter, given the incomplete spanning set for the overall system?

For the oceans, there is the possibility of “surface” (100 to 200 m thick) ocean waters cooling more strongly than forecast.

Think about wind-driven upwelling, and please look at the current Southern Hemisphere SST trend.

For the clouds we don’t know the exact amount of future low and high clouds.

There is also a fertilization effect of CO2.

Agreed: there are a few negative feedbacks. And it is certainly worthwhile to point them out. But I would be careful, too, as what is a negative feedback may also be feeding into a positive feedback and as a result of its indirect effects may on the whole be more positive than negative.

For example, as the tropics become warmer, there is more poleward circulation in both the ocean and atmosphere. Negative feedback? Well, this cools the tropics and cuts into the potential for a super greenhouse effect where the rate of downwelling longwave increases relative to surface temperature more rapidly than upwelling longwave.

But it also means that more ice is going to melt, and with the albedo effect that is a negative feedback feeding into a positive feedback. However, once the ice is melted it will no longer be able to feed into the albedo effect. So it is quite possible that the net effect is currently positive but will later become negative.

Alright, how about winds resulting in the upwelling of deep water and the downwelling of surface water? Ensures that things won’t warm topside as quickly, therefore it is a negative feedback. However, in the Antarctic Ocean this is bringing up organic material which releases both carbon dioxide and methane.

It is precisely this upwelling which has resulted in the recent diminished capacity of the Antarctic Ocean to absorb as much of our carbon emissions. So at least in the case of the Antarctic Ocean (the main door to the biggest sink for carbon dioxide our climate system has) it would appear that this may very well be a net positive feedback – at least for the time being.

CO2 fertilization?

Well, that seems to have worked for a while, but as temperatures rise due to higher CO2 concentrations plants become subject to both heat and drought stress, and so we have that sink working less well than it has in the past — which is a feedback. Besides, I am not really sure that we would have ever considered CO2 fertilization to be a negative feedback in as much as it would have simply meant that less CO2 was building up and therefore couldn’t act as a climate forcing.

I believe the way that they would have handled it (although I could very well be wrong) is the assumption that so much of the carbon which we emit expressed as a percent will be taken up by that sink – prior to any climate forcing/feedback analysis. However, when that sink begins to become less effective, the percent falls, and the diminished capacity to absorb our carbon emissions would be counted as a feedback.
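The bookkeeping described above can be sketched in a couple of lines. Every number here is made up purely for illustration; the real uptake fractions and emission totals are not taken from any source:

```python
# Toy airborne-fraction bookkeeping; all values are illustrative.
emissions = 10.0        # GtC/yr of hypothetical human emissions
uptake_strong = 0.55    # share removed by sinks, assumed fixed a priori
uptake_weak   = 0.50    # the same sink after a modest weakening

airborne_strong = emissions * (1 - uptake_strong)
airborne_weak   = emissions * (1 - uptake_weak)

# The extra CO2 left in the air when the sink weakens acts as a feedback.
print(f"{airborne_strong:.1f} vs {airborne_weak:.1f} GtC/yr airborne")
```

A five-percentage-point drop in the assumed uptake fraction leaves measurably more CO2 in the atmosphere each year, which is the sense in which a weakening sink gets counted as a feedback.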

#31 Not winds so much as advection of much warmer air, not only at the surface but also in the upper air, exactly where AGW affects the atmosphere.

#30 The non-linear element is found in the transference of energy among the three main bodies at play: water, air and ice (with snow). It is predictable if the models include all three.

#26 Lynn, it is a big problem, or failure, when the science literature is mangled according to points of view driven by special interests. I think a paper such as Roe and Baker’s is designed for other scientists to mull over the merits of using sensitivity as a benchmark. It would be convenient for contrarian politicians to interpret sensitivity estimates when designing policy: since it is so uncertain, there would be no policy….
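For what it’s worth, the long upper tail the post describes can be reproduced in a few lines: since deltaT is proportional to 1/(1-f), a symmetric uncertainty in the feedback factor f maps onto a skewed distribution of sensitivity. Here is a minimal sketch in Python; the mean and spread chosen for f are made-up illustrative values, not Roe and Baker’s numbers.

```python
import random

# Symmetric (Gaussian) uncertainty in the feedback factor f becomes
# a skewed, long-tailed distribution of sensitivity S = S0 / (1 - f).
# S0, f_mean and f_sd below are illustrative assumptions only.

random.seed(0)
S0 = 1.2                  # no-feedback sensitivity (deg C), assumed
f_mean, f_sd = 0.65, 0.13

samples = []
for _ in range(100_000):
    f = random.gauss(f_mean, f_sd)
    if f < 1.0:           # f >= 1 would be a runaway; exclude it
        samples.append(S0 / (1.0 - f))

samples.sort()
median = samples[len(samples) // 2]
mean = sum(samples) / len(samples)

# The mean sits well above the median: the hallmark of a long upper tail.
print(median, mean)
```

Notice that nothing asymmetric was fed in; the skew comes entirely from the 1/(1-f) amplification near f = 1, which is the paper’s central point.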

Hank,
could you please point me to where you got the information about a positive ACRI phase. When I went to http://www.arctic.noaa.gov/essay_bond.html
there was no information as to ACRI phases. Perhaps you meant the AO is in a positive phase, of which the site says: “The Arctic Oscillation (AO) appears to be the cause for much of the recent changes that have occurred in the Arctic. Its effects are not restricted just to the Arctic; it also represents an important source of variability for the Northern Hemisphere as a whole. The AO has been described as ‘a seesaw pattern in which atmospheric pressure at polar and middle latitudes fluctuates between positive and negative phases. The negative phase brings higher-than-normal pressure over the polar region and lower-than-normal pressure at about 45 degrees north latitude. The positive phase brings the opposite conditions, steering ocean storms farther north and bringing wetter weather to Alaska, Scotland and Scandinavia and drier conditions to areas such as California, Spain and the Middle East.’” Of course, hasn’t the AO been in a positive phase since around 1980?

Hank states, “In the polar regions — where warming happens fastest — what happens more often? It’s warmer and wetter.”
This sentence is not very clear. Although I will agree that where it is warmer, warming happens more often, I am not sure you can definitively say “polar regions”, since readings from Antarctica and the Arctic seem to be heading in opposite directions.

Hank states, “Does the wind pattern change along with the temperature? Let’s see.”
Careful, that sounds a lot like weather, pardon me, noise. Does the “let’s see” indicate that nobody knows and we will just have to wait and see? Or was that just an incomplete thought? Of course, I expect that there is some literature out there that makes the connection between wind and AGW, and after all that is all I asked for in the first place.
Thanks again.

Thank you Wayne #35, however, from reading the link I provided in #31, I think it is pretty clear that they are talking about the actual force of the wind pushing the ice to warmer climes, and not the winds being warmer.

Re 33
These models are science tools. If we want to use them for engineering or planning, or policy development, we should apply safety factors. Planners, engineers, and policy makers are going to have to build new tools that are informed by the science, but are not THE science. Engineers need tools that meet the needs of engineers. Planners need tools that meet the needs of planners. Why should a tool that scientists built for themselves be suitable for planners, engineers, and policy makers? That was not part of the design basis. Nowhere in funding documents does it say, “Build a tool that does everything for everybody!”

Planners, engineers and policy makers need to be familiar with the science, but they need to do their own jobs and let the scientists do what scientists do.

And, note that planners and engineers use many safety factors that do not have physical meaning. They can do that because it meets their needs. It is how engineered systems are planned and designed. However, it is not how science is done.

Wayne #38
nice catch. I gave the wrong URL; try http://www.jpl.nasa.gov/news/news.cfm?release=2007-112
for the updated article. For the record the ice is gone and no one should dispute that. My only reason for bringing this article up is to find AGW literature regarding its effects on wind.

[Response: Try this, and subsequent papers, Miller et al, 2006 for instance. It’s not certain, but there are indications that one should expect a more positive phase AO. With respect to the paper talked about in the release you link, read the full paper: http://www.agu.org/pubs/crossref/2007…/2007GL031138.shtml – you will note that the dynamic impact of the wind is not the exclusive cause of this year’s anomaly. As indeed you would expect, especially since the winds were even more favorable for ice export in the early 90’s. It wasn’t as warm back then…. – gavin]

Further to my post #39 (which was actually referring to #29 and not #21 as stated, sorry):

As an example of what I meant, compare Ken Caldeira’s op-ed with the comment he posted on Realclimate. I much preferred his comment here to his op-ed, but his comment here didn’t disclose any less than his op-ed did – quite the reverse, in fact. The point is that his comment here is far less prone to being misinterpreted and misrepresented than his op-ed is, and that was also my concern about the interview with Gerard Roe.

Good post and comments, and it’s very timely to see this topic raised again. The long upper tail is deeply worrying, particularly as there seems to be no let-up in the rate of increase in the concentration of CO2. To my mind, there is little prospect of CO2 peaking at less than 600 ppmv, which is well over the ‘targets’, and the targets themselves may already be above what is reasonably liveable with.

Regarding the possibility of an ice-free (summer) Arctic, I’d still be interested to see some modelling of the submarine ice melt, which could be considerable but spatially confused, setting up all sorts of changes in the deeper ocean circulation. How these changes might feed into the Atlantic (mainly), I don’t know, but I don’t doubt we’ll find out soon enough. One possibility is that the deep melt is brought to the surface somehow, in which case the surface waters actually get colder again, leading to much greater summer ice rather than less. No doubt the ‘skeptics’ would be pleased by such an ‘anomaly’, using it as further evidence that the climate models are wrong. Another possibility might be a slowing of deep circulation (not sure how much there is, mind), in which case the opposite occurs, and the surface waters heat up even faster, leading to yet more rapid surface melt, smaller winter ice volumes and so on.

Re #41: On a conceptual level, I think a lot of people imagine that as the atmosphere warms that somehow the climate system will respond in place. As demonstrated in the abstract pasted below, we are pushing on the climate system and it is indeed moving. That distance works out to about 200 miles for each hemisphere, BTW, so this is not a small change. As the tropics expand, the poleward portions of the climate system come under pressure. Are we playing dominos blindfolded?

Recent widening of the tropical belt: Evidence from tropopause observations

Radiosonde measurements and reanalysis data are used to examine long-term changes in tropopause behavior in the subtropics. Tropopause heights in the subtropics exhibit a bimodal distribution, with maxima in occurrence frequency above 15 km (characteristic of the tropical tropopause) and below 13 km (typical of the extratropical tropopause). Both the radiosonde and reanalysis data show that the frequency of occurrence of high tropopause days in the subtropics of both hemispheres has systematically increased during the past few decades, so that tropical characteristics occur more frequently in recent years. This behavior is consistent with a widening of the tropical belt, and the data indicate an expansion of about 5–8° latitude during 1979–2005.
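As a quick sanity check on the “about 200 miles for each hemisphere” figure in #42: splitting the abstract’s 5–8° of total widening between the two hemispheres, at roughly 69 miles per degree of latitude, does land in that neighborhood.

```python
# Back-of-envelope check of comment #42: split the 5-8 degree total
# widening of the tropical belt between hemispheres, convert to miles.
MILES_PER_DEG_LAT = 69.0  # one degree of latitude is about 69 miles

for total_deg in (5.0, 8.0):
    per_hemisphere_miles = (total_deg / 2.0) * MILES_PER_DEG_LAT
    print(per_hemisphere_miles)  # 172.5 then 276.0
```

So the range is roughly 170–280 miles per hemisphere, consistent with the "about 200 miles" quoted above.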