Good news for the earth’s climate system?

How much additional carbon dioxide will be released to, or removed from, the atmosphere by the oceans and the biosphere in response to global warming over the next century? That is an important question, and David Frank and his Swiss coworkers at WSL have just published an interesting new approach to answering it. They empirically estimate the distribution of gamma, the temperature-induced carbon dioxide feedback to the climate system, given the current state of knowledge of reconstructed temperature, and carbon dioxide concentration, over the last millennium. It is a macro-scale approach to constraining this parameter; it does not attempt to refine our knowledge of carbon dioxide flux pathways, rates or mechanisms. Regardless of general approach or specific results, I like studies like this. They bring together results from actually or potentially disparate data inputs and methods, which can be hard to keep track of, into a systematic framework. By organizing, they help to clarify, and for that there is much to be said.

Gamma has units in ppmv per ºC. It is thus the inverse of climate sensitivity, where CO2 is the forcing and T is the response. Carbon dioxide can, of course, act as both a forcing and a (relatively slow) feedback; slow at least when compared to faster feedbacks like water vapor and cloud changes. Estimates of the traditional climate sensitivity, e.g. Charney et al. (1979), are thus not affected by the study. Estimates of more broadly defined sensitivities that include slower feedbacks (e.g. Lunt et al. (2010), Pagani et al. (2010)) could be, however.
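In symbols, as a definitional sketch (the notation here is mine, not the paper's):

```latex
\gamma = \frac{\Delta \mathrm{CO_2}}{\Delta T}\;\;[\mathrm{ppmv\ per\ {}^{\circ}C}],
\qquad
S = \frac{\Delta T}{\Delta \mathrm{CO_2}}\;\;[\mathrm{{}^{\circ}C\ per\ ppmv}]
```

The units of gamma are the reciprocal of those of the sensitivity S, but the two quantities describe opposite causal directions (T driving CO2 versus CO2 driving T), so gamma is not numerically equal to 1/S.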

Existing estimates of gamma come primarily from analyses of coupled climate-carbon cycle (C4) models (analyzed in Friedlingstein et al., 2006), and a small number of empirical studies. The latter are based on a limited set of assumptions regarding historic temperatures and appropriate methods, while the models display a wide range of sensitivities depending on assumptions inherent to each. Values of gamma are typically positive in these studies (i.e. increased T => increased CO2).

To estimate gamma, the authors use an experimental (“ensemble”) calibration approach, by analyzing the time courses of reconstructed Northern Hemisphere T estimates, and ice core CO2 levels, from 1050 to 1800 AD. This period represents a time when both high resolution T and CO2 estimates exist, and in which the confounding effects of other possible causes of CO2 fluxes are minimized, especially the massive anthropogenic input since 1800. That input could completely swamp the temperature signal; the authors’ choice is thus designed to maximize the likelihood of detecting the T signal on CO2. The T estimates are taken from the recalibration of nine proxy-based studies from the last decade, and the CO2 from three Antarctic ice cores. Northern Hemisphere T estimates are used because their proxy sample sizes (largely dendro-based) are far higher than in the Southern Hemisphere. However, the results are considered globally applicable, due to the very strong correlation between hemispheric and global T values in the instrumental record (their Figure S3, r = 0.96, HadCRUT basis), and also of ice core and global mean atmospheric CO2.

The authors systematically varied both the proxy T data sources and methodological variables that influence gamma, and then examined the distribution of the nearly 230,000 resulting values. The varying data sources include the nine T reconstructions (Fig 1), while the varying methods include things like the statistical smoothing method, and the time intervals used both to calibrate the proxy T record against the instrumental record, and to estimate gamma.

Some other variables were fixed, most notably the calibration method relating the proxy and instrumental temperatures (via equalization of the mean and variance for each, over the chosen calibration interval). The authors note that this approach is not only among the mathematically simplest, but also among the best at retaining the full variance (Lee et al., 2008), and hence the amplitude, of the historic T record. This is important, given the inherent uncertainty in obtaining a T signal, even with the above-mentioned considerations regarding the analysis period chosen. They chose the time lag, ranging up to +/- 80 years, which maximized the correlation between T and CO2. This was to account for the inherent uncertainty in the time scale, and even the direction of causation, of the various physical processes involved. They also estimated the results that would be produced from the ten C4 models analyzed by Friedlingstein et al. (2006), over the same range of temperatures (but shorter time periods).
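To make the ensemble idea concrete, here is a minimal sketch in Python using synthetic stand-in data. Everything numeric below (the series, the true slope, the lag, the smoothing windows) is made up for illustration; the real study permutes nine recalibrated reconstructions, three ice cores, and many more methodological choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for one recalibrated NH temperature reconstruction
# and one ice-core CO2 series, annual resolution, 1050-1800 AD (751 years).
n = 751
T = np.cumsum(rng.normal(0.0, 0.02, n))                    # red-noise "temperature" (deg C)
CO2 = 280 + 8.0 * np.roll(T, 30) + rng.normal(0, 0.3, n)   # CO2 lagging T, slope ~8 ppm/C

def align(T, CO2, lag):
    """Pair T(t) with CO2(t + lag); lag > 0 means CO2 lags temperature."""
    if lag > 0:
        return T[:-lag], CO2[lag:]
    if lag < 0:
        return T[-lag:], CO2[:lag]
    return T, CO2

def gamma_at_best_lag(T, CO2, max_lag=80):
    """Regression slope of CO2 on T (ppm per deg C) at the lag in
    [-max_lag, +max_lag] that maximizes |correlation| -- the lag rule above."""
    best_lag = max(range(-max_lag, max_lag + 1),
                   key=lambda L: abs(np.corrcoef(*align(T, CO2, L))[0, 1]))
    t, c = align(T, CO2, best_lag)
    return float(np.polyfit(t, c, 1)[0]), best_lag

# A toy "ensemble": vary one methodological choice (a smoothing window).
# The real study varies reconstructions, calibration intervals, smoothing
# methods, and target periods into ~230,000 gamma estimates.
gammas = []
for w in (1, 20, 40):                                      # hypothetical windows (yr)
    k = np.ones(w) / w
    g, lag = gamma_at_best_lag(np.convolve(T, k, "valid"),
                               np.convolve(CO2, k, "valid"))
    gammas.append(g)

print([round(g, 1) for g in gammas])
```

Each combination of data source and method yields one gamma value; the full factorial set of such values forms the distribution summarized in Figure 2.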

So what did they find?

In the highlighted result of the work, the authors estimate the mean and median of gamma to be 10.2 and 7.7 ppm/ºC respectively, but, as indicated by the difference between the two, with a long tail to the right (Fig. 2). The previous empirical estimates, by contrast, come in much higher, at about 40 ppm/ºC. The choice of the proxy reconstruction used, and the target time period analyzed, had the largest effects on the estimates. The estimates from the ten C4 models were higher on average; it is about twice as likely that the empirical estimates fall in the model estimates’ lower quartile as in the upper. Still, six of the ten models evaluated produced results very close to the empirical estimates, and the models’ range of estimates does not exclude those from the empirical methods.

Figure 2. Distribution of gamma. Red values are from 1050-1550, blue from 1550-1800.

Are these results cause for optimism regarding the future? Well, the problem with knowing the future, to flip the famous Niels Bohr quote, is that it involves prediction.

The question is hard to answer. Empirically oriented studies are inherently limited in applicability to the range of conditions they evaluate. As most of the source reconstructions used in the study show, there is no time period between 1050 and 1800, including medieval times, which equals the global temperature state we are now in; most of it is not even close. We are in a no-analogue state with respect to mechanistic, global-scale understanding of the inter-relationship of the carbon cycle and temperature, at least for the last two or three million years. And no-analogue states are generally not a very comfortable place to be, either scientifically or societally.

Still, based on these low estimates of gamma, the authors suggest that surprises over the next century may be unlikely. The estimates are supported by the fact that more than half of the C4-based (model) results were quite close (within a couple of ppm) to the median values obtained from the empirical analysis, although the authors clearly state that the shorter time periods that the models were originally run over make apples-to-apples comparisons with the empirical results tenuous. Still, this result may be evidence that the carbon cycle components of these models have, individually or collectively, captured the essential physics and biology needed to make them useful for predictions into the multi-decadal future. Also, some pre-1800, temperature-independent CO2 fluxes could have contributed to the observed CO2 variation in the ice cores, which would tend to exaggerate the empirically-estimated values. The authors did attempt to control for the effects of land use change, but noted that modeled land use estimates going back 1000 years are inherently uncertain. Choosing the time lag that maximizes the T to CO2 correlation could also bias the estimates high.

On the other hand, arguments could also be made that the estimates are low. Figure 2 shows that the authors also performed their empirical analyses within two sub-intervals (1050-1550, and 1550-1800). Not only did the mean and variance differ significantly between the two (mean/s.d. of 4.3/3.5 versus 16.1/12.5 respectively), but the R squared values of the many regressions were generally much higher in the late period than in the early (their Figure S6). Given that the proxy sample size for all temperature reconstructions generally drops fairly drastically over the past millennium, especially before their 1550 dividing line, it seems at least reasonably plausible that the estimates from the later interval are more realistic. The long tail–the possibility of much higher values of gamma–also comes mainly from the later time interval, so values of gamma from say 20 to 60 ppm/ºC (e.g. Cox and Jones, 2008) certainly cannot be excluded.

But this wrangling over likely values may well be somewhat moot, given the real-world situation. Even if mean estimates as high as, say, 20 ppm/ºC are more realistic, this feedback rate still does not compare to the rate of increase in CO2 resulting from fossil fuel burning, which at recent rates would exceed that amount within one to two decades.

I found some other results of this study interesting. One such involved the analysis of time lags. The authors found that in 98.5% of their regressions, CO2 lagged temperature. There will undoubtedly be those who interpret this as evidence that CO2 cannot be a driver of temperature, a common misinterpretation of the ice core record. Rather, these results from the past millennium support the usual interpretation of the ice core record over the later Pleistocene, in which CO2 acts as a feedback to temperature changes initiated by orbital forcings (see e.g. the recent paper by Ganopolski and Roche (2009)).

The study also points up the need, once again, to further constrain the carbon cycle budget. The fact that a pre-1800 time period had to be used to try to detect a signal indicates that this type of analysis is not likely to be sensitive enough to figure out how, or even if, gamma is changing in the future. The only way around that problem is via tighter constraints on the various pools and fluxes of the carbon cycle, especially those related to the terrestrial component. There is much work to be done there.

378 Responses to “Good news for the earth’s climate system?”

Thank you, a very interesting study. Still, it is quite hard to project how e.g. a mixed forest will react to, say, +1.5 ºC and +20 ppm [CO2]. Are the researchers supposed to build 30 meter high greenhouses with winter heating/CO2 supplements to get results that can be applied to the future conditions? Should there be many such installations in many climates? Wouldn’t ordinary greenhouses do?

I wonder if anyone can comment on the following, which makes its way through the blogs (DotEarth most recently). It’s confusing me–and I don’t have time to start a new career as a modeler, but need direction on the credibility of the models, especially with respect to signal-noise of CO2 and implications for risk analysis.

Gerlich and Tscheuschner, “Falsification of the Atmospheric CO2 Greenhouse Effects Within the Frame of Physics”; Atmospheric and Ocean Physics.

1) Kinetics vs equilibrium. Gamma is presumably an equilibrium value but how long would it take for equilibrium – 800 years is often bandied about for CO2 lag vs temperature whereas this study only seems to apply to the last 1000 years ?

[Response: Gamma can be either a transient or equilibrium value, depending on one’s goals and methods, and can be defined over any desired time lag, because the carbon cycle will operate at a wide variety of temporal scales. The study addresses only those that are operating at late Holocene lags < 80 years.]

2) Are there standard values of gamma in the current climate models and how significantly does this study likely affect the overall range of temperature rises predicted ?

You state that “Gamma has units in ppmv per ºC. It is thus the inverse of climate sensitivity”. It would seem that you simply mean that the DIMENSIONS of gamma are the inverse of those of climate sensitivity. I think that some readers were confused into thinking that climate sensitivity is literally 1/gamma.

[Response: Thanks for clarifying. I meant that the units are the inverse of each other, not the values.]

Science is the rules we “play” by. It cannot “lose”. It does not listen to the likes of Beck or Rush. It doesn’t care about our arts, religion, politics, or, yes, even “our science”. It cannot be spun. We either play by its rules or don’t get to play at all. Right now, I’d say we have a short period of time to decide what it’s going to be.

The most worrying impact of gamma will be on the long-term climate response to human emissions. As an example, suppose we double pre-industrial CO2 in the next few decades, up to 560ppm. Then the “fast” climate feedbacks (Charney sensitivity) will give us about 3 degrees C of warming over the next century.

So far so bad, but we are nowhere near the end-point. First we must add “slow” feedbacks (mainly albedo changes) which according to Hansen et al will double the Charney sensitivity, so we get 6 degrees. But now we have to consider gamma as well. If it is comparable to the value estimated in this paper (or the rough estimates in comments above), then those 6 degrees give an additional 100ppm of CO2, taking us up to 660ppm.

But we are still not done! That extra CO2 means another ~0.7 C of fast feedback warming, and the same again of slow feedback warming, taking the total warming to 7.5 degrees. And the extra 1.5 degrees then means another ~25ppm CO2 from gamma, inducing further warming still.

Fortunately, this is a converging series: it terminates at about 700ppm and 8 degrees C. But this is still enough to take us back to PETM territory (if not worse). And we haven’t even touched on methane feedbacks.
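The converging series above can be checked with a few lines of arithmetic. A minimal sketch, using the commenter's round numbers as assumptions (280 ppm pre-industrial CO2, 6 ºC per doubling once fast and slow feedbacks are combined, and a gamma near the paper's late-period mean of ~16–17 ppm/ºC):

```python
from math import log2

C0 = 280.0      # assumed pre-industrial CO2 (ppm)
sens = 6.0      # deg C per CO2 doubling, fast + slow feedbacks combined (assumed)
gamma = 16.7    # ppm of CO2 released per deg C of warming (assumed)

C = 560.0       # start from a straight doubling
for _ in range(50):               # iterate T -> extra CO2 -> T until it settles
    T = sens * log2(C / C0)       # warming implied by the current CO2 level
    C = 560.0 + gamma * T         # the original doubling plus temperature-driven CO2

print(round(C, 1), round(T, 2))
```

With these assumed numbers the loop settles near 690 ppm and 7.8 ºC, in line with the roughly 700 ppm / 8 ºC endpoint described above; the series converges because each extra increment of CO2 produces a smaller increment of warming than the last.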

And remember, all this starts with “just” a doubling of CO2, which seems rather an optimistic scenario under BAU assumptions. Given the Copenhagen fiasco, Climategate, Glaciergate, stalled legislation in Congress, and record snowfalls on top of already skeptical populations, that is exactly what we’ll be getting.

[Response: It would be extremely hard to get to 560 ppm in the next few decades without some major feedback. But you have explained the feedback concept very well. Jim]

I’m lacking background here, but I’d like to better understand the graph. Density is density of what? Thanks if someone has patience for a beginner.

[Response: Density refers to the proportion of the observations that fall within a given interval, i.e. the density of the observations if you will. Sums to one (over all intervals); referred to as a probability density function. Jim]
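For readers who want to see this concretely, here is a tiny sketch with made-up, right-skewed "gamma" values standing in for the study's ~230,000 estimates, showing that the bar areas of a density histogram sum to one:

```python
import numpy as np

rng = np.random.default_rng(1)
# Fake right-skewed values; median near e^2 ~ 7.4 ppm/C, with a long right tail
gammas = rng.lognormal(mean=2.0, sigma=0.6, size=230_000)

# density=True rescales bar heights so that (height x bin width) sums to 1,
# i.e. the histogram approximates a probability density function
counts, edges = np.histogram(gammas, bins=50, density=True)
widths = np.diff(edges)

print(float((counts * widths).sum()))  # -> 1.0 (up to floating-point error)
```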

Re: #37 – Jellyfish seem to become much more dominant in ecologically damaged marine and estuarine systems. Maybe we’re sending the earth back to its earlier days. Hotter, more deserts, more swamps and seas full of primitive creatures that aren’t very good to eat.

Thanks for a description of this paper. Its findings are nothing like what I’ve read in the popular press, which seems to confuse carbon dioxide’s forcing and feedback roles. News accounts inevitably reported on this paper and the study on stratospheric water vapor levels simultaneously. And then threw in something about the CRU hacks as well. It will all get worked out in the end.

[Response: The media is a very serious problem. Avoid it. Get your info from sites like this, the IPCC, and agency and lab websites, especially NOAA, NASA, USGS, ORNL etc. Jim]

My question: why not study carbon dioxide levels during the Younger Dryas period? We know why temps fell and rose at that time. Aren’t the ice cores a fine enough instrument to determine a CO2 release rate as the globe warmed again?

[Response: The Younger Dryas requires other T proxies. People are working on longer time periods, including back to the Pliocene (>2.6 mya), using other proxies (e.g. Pagani et al. cited in post), and even further back.]

Thanks, Gavin. If you all on RealClimate ever question your role in setting the record straight for people willing to go the extra mile to understand, don’t. It’s a confusing world in the blogs, and the social scientists, journalists, and policymakers among us need the extra nudge here and there to help confirm our suspicions of disingenuous or sloppy work on the part of some.

Gamma has to do with temperature and its impact on the “live” biosphere & its carbon (T–>CO2), I guess mainly how live plants on land and in the sea respond to temperatures by either drawing down CO2 from the atmosphere when they are flourishing (presumably in warmer climates), or releasing CO2 when they are not flourishing or when they are dying. So it doesn’t really have to do with remains of already long dead flora/fauna (methane hydrates, fossil fuels, etc).

And the time frame is basically within our geologically recent past of fairly mild climates fluctuating between warm and cold spells, but not necessarily with climates much warmer than today’s climate. So as the frigid higher latitudes warm up and have longer growing seasons (and more sun, since the sun shines longer up north), the plants will do better, drawing down CO2.

Does this take into consideration that in warmer climates than we now have mid-summer heat may kill plants, stymie their growth, and/or dry them out, creating fuel for wildfires? I’m wondering also about the CO2 effects — too much of a good thing might be bad for plants in some ways, such as creating more acidic oceans and soils, ocean eutrophication from too much plantlife, or ??. Just as CO2–>T has to consider T releasing methane in permafrost and hydrates to get a more complete picture, so too the T–>CO2 has to consider CO2 doing other things besides fertilizing plants.

[Response: Yes, you have the concept of gamma correct. The study’s estimates do not account for changing conditions in the future. This includes anything from permafrost thaw to frozen methane to changes in net ecosystem productivity (NEP). The main negative effects of CO2 are elevated temperature and ocean acidification. Fertilization is a generally positive effect, as it is a negative carbon cycle feedback. Jim]

Jim, That was a pretty clear explanation, however a few commenters are still confused about the difference between Gamma, and the Charney sensitivity. For Charney, consider a thought experiment on an earthlike planet: control CO2 levels, and measure the temperature versus CO2 level. Feedback from T to CO2 is eliminated, as our (imaginary) experimental apparatus controls atmospheric CO2. Now to understand Gamma, we do a similar experiment, but we control the planet’s temperature, and allow CO2 to vary. That second very different experiment gives Gamma. Although they have the same units (I think they are reciprocals) the two concepts represent different experiments, and hence should not be confounded. Clearly a predictive model of how the system responds to a delta in CO2 inventory requires the use of both constants, but they are both best understood by isolating them.

To the issue of potential permafrost releases of CO2: If the equilibrium concentration of CO2 versus temperature is smooth enough that a linear approximation over policy-relevant values makes sense, then I would expect that this study does cover permafrost feedbacks (i.e. I am saying that even for an infinitesimal change of temperature there will be an infinitesimal change in CO2, and that the derivative makes sense). Now this assumption of linearity, and the limitation of the modeled 80-year time lags, are big ones.

I would be concerned that perhaps the current and future biosphere is so dominated by man that historic relationships may no longer be relevant. If the carbon feedbacks come largely from tropical and temperate land areas, I think this concern is valid. If they are dominated by subarctic ecosystems (boreal forest and tundra), or the ocean, perhaps the natural world can still be thought of as being in control.

[Response: Good explanation of the relationships, thanks. Not sure I follow your last P though. Jim]

Anand:
A positive gamma means the two quantities move in the same direction. A negative gamma means they move in opposite directions.

Glen:
No that’s not what they did. They regressed each of the CO2 concentration series on each of the recalibrated, reconstructed temperature series, for a given set of factor levels. The lag was not set at 80 but rather varied from -80 to +80. There was no Monte Carlo simulation involved: it was a full factorial experiment as far as I can tell.

Edward:
Back up a few steps. First, I didn’t write the article, I’m just reporting on it. Second, the authors conclude the article thus: “The convergence of [gamma] computed herein with other more moderate values quantified for interannual to Milankovitch timescales suggests limited timescale dependence and thus reduced possibilities for unwelcome surprises within the next century. However, as this indication is based on pre-industrial CO2 concentrations and temperatures, possible threshold responses, transient behaviour, fertilization effects, and the role and rates of oceanic circulation and uptake and release from peatlands still need reconciliation in observational studies and climate-modelling efforts.” Third, the two time periods they used were simply meant to recognize that values varied with time (although this does sort of conflict with their statement above). And fourth, they are not trying to say anything at all about the anthro input since 1800.

William:
Yes, and many other factors as well. The carbon cycle works over all imaginable space/time scales, just like climate does.

ICE:
Interesting point. Larger T variances will indeed minimize gamma–so not sure how Cox and Jones came up with that estimate. Points up the very reason Frank et al. did their study.

L. David Cooke:
I tend to agree, David. The land cover change topic is very complex though–e.g. there is a sizeable component of recovering forest land globally that partially offsets the buffer capacity loss due to agriculture.

jyyh:
Separate topic, but yes indeed. If we could only do controlled experiments on climate change effects…but mostly we can’t. There’s a take home message there.

Jim, good explanation of the paper. Given all the BS that some have posted, I was beginning to wonder whether I had read the abstract correctly. Thankfully my intelligence and reading comprehension are still intact. I haven’t “gone emeritus” yet.

My basic question is whether or not we have gone into a different regime than the studied time period. I am not sure that the conclusion that we should not expect any nasty surprises in the near future really holds, although the study itself only really seems to have implications about how fast we will get to 2x CO2.

[Response: Thanks, and your question is the critical one, as several have brought up. And they’re only suggesting, based on their analysis, that high feedbacks might be unlikely–not strongly concluding it. Jim]

So, could this be put in other words:”During recession of solar orbital forcing (1050-1800 AD (the solar forcing was at maximum during the Holocene Optimum c.5000BC)), the response of the vegetation (well, ecosystem) was found to be trailing the atmospheric CO2 98,5% of the time.”? As is known the lag was 800 years during the increasing orbital forcing at the end of the last glacial, because that’s how long it took for CO2 to start to decrease after the Tmax. (I know this is not accurate description, just trying to get a handle in more familiar terms…)

Nice touch to end the analysis in 1800 (for 1815 Tambora effects to be excluded). Is the CO2 lag a bit larger during sudden changes than during slow changes? Is this the signal they have calculated experimentally? I might be way off here, really difficult for a mere Biochem. M.Sc., please correct if I’m way off, as this is truly difficult to get.

[Response: No, you’ve mixed up things there. Orbital forcing is not involved and the 98.5% refers to the CO2 lagging temperature. 1800 was chosen so as to avoid the industrial CO2 source period. It had nothing to do with Tambora. Jim]

Pablo, the short answer is that the paper makes false claims backed up by equations that don’t apply.

G&T’s paper, for example, would have blankets not working because they would violate the second law of thermodynamics (by their interpretation). I.e. despite being colder than your body at the top of the blanket, they keep your body warmer, therefore transferring heat from that cooler blanket to the warmer body.

Despite this, blankets continue to keep billions of people warm each night.

With such an infant-school level error, what makes you think that the rest of the paper would be any good?

Brian: “whereas this study only seems to apply to the last 1000 years ? ”

No, the study applies where CO2 is being released at rates in the trillions of kilograms a year.

The 800-year lag applies where changes in the earth’s orbit let more solar energy reach the ground and warm the planet.

Orbital forcing hasn’t changed appreciably over the last 4,000 years, and CO2 release at today’s rate hasn’t occurred from roughly 200 million years ago until about 100 years ago, so the 800-year lag doesn’t apply today.

The cause is different.

Surely if the cause is different, you’d expect the delays to be different, absent any better idea.

On the methane release from tundra/deep water issue, the problem with them (just like with glacier breakoff and melt) is that these are conditionally stable states. As long as the situation doesn’t change, they remain stable.

Predicting their change would be like predicting when an inverted cone will fall over from its balance point. All you *know* is that unless you’re very careful, it will fall over some time. About the only way to make it predictable is to cause the change yourself.

Another mention here of the response of CO2 to temperature change — as with the above paper I’ve only seen the abstract; just curious if they’re related work: http://www.agu.org/pubs/crossref/2009/2009GL040975.shtml
(hat tip to http://delayedoscillator.wordpress.com/)
“… transition can be explained if the response time of CO2 to a global temperature fluctuation has lengthened from 6 months to at least 15 months. A longer response time may reflect saturation of the oceanic carbon sink, but a transient shift in ocean circulation may play a role.”

There are 2 fundamental questions left by this study:
1) Does the time period in question capture the dynamics we expect in the next 200 years?,
and
2) Is the cyclicity captured in the record (primarily cycle lengths of 50 yr for T) long enough to get a strong feedback from the C-cycle?

Longer time cycles are important for policy, by the way, since one of the pertinent questions being asked is to what extent the T-baseline is being offset.

# 62. Hank Roberts – Which set? The ones used by the study. Are you this guy? No. That’s one disadvantage of having a common name like John Phillips. In the interest of full disclosure, I am a mild skeptic though.

A low-order physical-biogeochemical climate model was used to project atmospheric carbon dioxide and global warming for scenarios developed by the Intergovernmental Panel on Climate Change. The North Atlantic thermohaline circulation weakens in all global warming simulations and collapses at high levels of carbon dioxide. Projected changes in the marine carbon cycle have a modest impact on atmospheric carbon dioxide. Compared with the control, atmospheric carbon dioxide increased by 4 percent at year 2100 and 20 percent at year 2500. The reduction in ocean carbon uptake can be mainly explained by sea surface warming. The projected changes of the marine biological cycle compensate the reduction in downward mixing of anthropogenic carbon, except when the North Atlantic thermohaline circulation collapses.

Much of that needs revision now. The entire concept of the “oceanic conveyor belt” and the thermohaline circulation (THC) has been modified – the subsurface current flow is far more complicated than what the textbooks show. As a result, the THC is now the MOC – the meridional overturning circulation – and there’s also a greater understanding of the importance of wind-driven upwelling. The main problem in this area is the lack of comprehensive historical subsurface data for much of the oceans, to test models against.

Simple relations between sea surface temperatures and ocean carbon uptake are now known to be heavily complicated by winds – and the magnitude and even the direction of the feedback effect seems to vary quite a bit from model to model:

Very cool stuff; it’s great to see every flippin’ thing is being looked at in climatology. That’s two papers in a row showing feedback mechanisms that mitigate global warming!

Uh, of course every flippin thing is being looked at. Contrary to the caricatures drawn up by denialists, climate scientists really are trying to understand everything about the climate.

Frank, how do you figure this “feedback mechanism mitigates global warming” ??? Quite the contrary, I’m afraid. This is not any sort of negative feedback, nor does it alter the response of temperature to changing GHG levels.

What this paper attempts to define is *how much more CO2 is released from the natural system in response to increasing temperatures*. IE, we can pump X Gton of CO2 into the atmosphere and raise temperature Y degrees. As a result we’ll get a bonus round of CO2, gratis, from mother nature.

Hank and others were discussing whether we could stave off snowball earth with such low values of gamma.

Wouldn’t low values of gamma be a good thing in this respect? Assuming the CO2 response is symmetric around deltaT regardless of sign, lowering temperatures from Milankovitch cycles would remove CO2; a lower gamma would remove less CO2 and cause less change than would a high gamma.

OK – any idea where that meme came from and why you’re so sure it’s wrong ? What is the best view as to the dynamics of the carbon cycle in response to temperature rises ?

Completely Fed Up

Surely if the cause is different, you’d expect the delays to be different, absent any better idea.

Personally I’d have thought that if a temperature rise caused a rise in CO2 then the rise in CO2 would be the same regardless of the cause of temperature – the oceans, for instance don’t know where the temperature rise is coming from, just that it’s warmer.

Speculating, I guess there could be exceptions such as solar heating of newly exposed previously snow covered areas where radiative heating would be significant, but that’s likely the exception rather than the rule ?

“To the issue of potential permafrost releases of CO2: If the equilibrium concentration of CO2 versus temperature is smooth enough that a linear approximation over policy-relevant values makes sense, then I would expect that this study does cover permafrost feedbacks.”

Hmmm….. Let’s say I make a careful study of the behavior of water, at sea level, over the temperature range from 10C to 90C – now, as long as I am very accurate, I should be able to predict the behavior of water at -10C or 110C with no problems, right?

Wrong. Linearity is often local in the real world, and assuming otherwise doesn’t make much sense.

Trying to hide all this variability within a single variable, gamma, seems odd as well. Is there a “cryosphere sensitivity” factor? In terms of, say, ice volume lost per degree C increase? Or does that depend primarily on where the ice is, and what state it is in?

Likewise, the Charney sensitivity doesn’t tell you much about how permafrost will respond over time because of the Arctic amplification effect.

Put it this way: should we expect such “gamma factors” to remain constant over space and time? If not, shouldn’t we go and measure gamma in many different locations, from permafrost to tropical wetlands, over long periods of time, rather than going back to a poorly-characterized pre-industrial period and then projecting those estimates forward?

Mike Tabony says, “Science is the rules we “play” by. It cannot “lose”.”

Hear, hear! My only regret is that I won’t be there to see the looks on the faces of the denialists when they find that with all the slander, character assassination and fault-finding, they have accomplished precisely bupkis. The evidence is untouched, towering over them, waiting to come crashing down.

The current outcry in the public and press is nothing but an adolescent tantrum by a species collectively saying “I don’ wanna!” Nature doesn’t care. She just says “Suit yourself. Wise up or die.”

Could someone (anyone) comment on whether my logic/math was correct in comment 30 and so whether the accompanying graph is also correct (see here)? [If not, RC should delete the whole post, to avoid having the graph found and circulated… there’s enough misinformation out there already.]

If it is correct…

It works out that each ppm per degree C of CO2 feedback above (or below) what is expected does two things. The first is to subtract (or add) about one year from (or to) the time it will take, at the current annual rate of CO2 emissions, to reach a committed temperature increase of 2 C. The second is to add (or subtract) about 1/70 C to (or from) the temperature increase to which we’ve already committed by raising CO2 to 387.5 ppm.

So… if the models overestimated gamma by 7 ppm/C, then we have about 7 more years than we thought before we reach 2 C, and we’ve already raised the final global temperature by about 0.1 C less than we think.

Similarly, if the models underestimated gamma by 7 ppm/C, then we have 7 years less than we thought before we hit 2 C, and we’ve already raised the final global temperature by 0.1 C more than we think.
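As a sanity check, here’s the same back-of-envelope arithmetic in Python. All the inputs are my own assumptions (3 C per doubling, ~2 ppm/yr of rise); with those I get roughly 1/90 C per ppm rather than 1/70, and a couple of years per 7 ppm rather than seven, but the exact figures depend on the sensitivity and rate assumed – the order of magnitude, which is the point, is the same.

```python
import math

sensitivity = 3.0        # assumed equilibrium warming per CO2 doubling (C)
co2_now = 387.5          # ppm, the concentration cited in the comment
rise_rate = 2.0          # assumed current CO2 rise, ppm per year

# Marginal committed warming from one additional ppm at today's concentration:
dT_per_ppm = sensitivity / math.log(2) * math.log((co2_now + 1) / co2_now)
print(f"~{dT_per_ppm:.4f} C per extra ppm (about 1/{1 / dT_per_ppm:.0f} C)")

# A 7 ppm/C error in gamma, over ~1 C of warming, is ~7 ppm of CO2, which at
# the assumed rate of rise buys or costs only a few years...
gamma_error = 7.0        # ppm per C
print(f"~{gamma_error / rise_rate:.1f} years of emissions")
# ...and shifts committed warming by well under a tenth of a degree:
print(f"~{gamma_error * dT_per_ppm:.2f} C of committed warming")
```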

My point is… in the scheme of things, these numbers are small, although there’s more margin for error one way than the other (i.e. we’re very close to that 2 C number already). Of course we’re dealing with estimates and averages when the world is actually made of discrete mechanisms and events. I agree with Eli that the increase will likely be realized as step functions (e.g. the relatively sudden/rapid transition of a large chunk of the Amazon or some other forest into a savanna, or transition of grasslands into desert, or a limit/change in ocean CO2 uptake).

Which isn’t to say that the subject isn’t important, or very interesting, or doesn’t hold valuable nuggets to be learned about what might or might not happen. It’s definitely an interesting case of chicken/egg sleuthing.

Completely Fed Up, thanks. Gavin directed me to the dedicated page, also. The point of my second comment is that the blogosphere is a confusing place for people working on social vulnerabilities to environmental change – we who rely on impact scenarios – and so shortcuts to scientific rebuttals of particular memes help. There are many of us out here who are trying to get a better grasp on the physical science in the time we can spare away from social science and policy matters – and we ARE the targets of misinformation. So, a short response to your question, “With such an infant-school level error, what makes you think that the rest of the paper would be any good?”: I am not a physical scientist, and I rely on people like you to characterize that error as “infant-school level”.

re “The authors found that in 98.5% of their regressions, CO2 lagged temperature. There will undoubtedly be those who interpret this as evidence that CO2 cannot be a driver of temperature, a common misinterpretation of the ice core record. Rather, these results from the past millennium support the usual interpretation of the ice core record over the later Pleistocene, in which CO2 acts as a feedback to temperature changes initiated by orbital forcings (see e.g. the recent paper by Ganopolski and Roche (2009)).”

“The authors find that CO2 lags the temperature change, so therefore CO2 did not drive the temperature change”, is not a misinterpretation. You say the same thing in the last sentence … “that CO2 acts as a feedback to temperature changes initiated by orbital forcings.”

What you omit to define, is what you mean by “feedback”. Do you mean that CO2 release in response to increased temperatures produces a further increase of the temperature (a positive temperature feedback)?

I don’t know anyone who disagrees with that among climate realists. All agree that an increase in CO2 will increase temperature in a log-concentration-dependent way. What it will not do is result in a runaway greenhouse overwhelming the “orbital forcings”.

If you’re going to trash climate-realists, then do so honestly and honorably, not putting words of straw in their mouths.

Okay, I’ve extended my semigray model to cover a hemispherical planet with nine latitude zones, each 10 degrees “high.” I set water vapor in each latitude band by the Clausius-Clapeyron equation and adjust so the average matches the present Earth average (about 392 pascals). I also added a term to represent advective heat transfer from one band to another. I can reproduce the gross radiative effects on Earth, but the latitude distribution is horrible, and I don’t know why. Here’s the final printout of the model, which uses a time-marching scheme:

(Boy, I hope that comes out readable on the blog. Cut and paste into Word and use a fixed-width font to see the original alignment, in case it’s unintelligible here.)

My mean global annual temperature is 276 K, which isn’t too horribly far off and could be tweaked by making this and that more realistic. But the latitude bands average from 203 K at the poles to 357 K at the equator, which is ridiculous. If anyone has any suggestions, I could sure use the help. If you want to see the code, it’s relatively short–I wrote it in Just Basic, a great language for quick-and-dirty proof-of-concept programs.
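Without seeing the Just Basic source, here’s a minimal Python sketch of the same kind of banded, time-marching energy balance model, using a Budyko-style transport term D*(Tmean − T) and standard textbook constants (the albedo, OLR coefficients, and insolation shape are my assumptions, not the commenter’s values). A pole-to-equator spread like 203–357 K is exactly what such a model produces when the transport coefficient is too weak; cranking D up to a Budyko-like value collapses the gradient, so I’d look hard at the advection term first.

```python
import math

def run_ebm(D, nbands=9, years=50):
    """Budyko-style zonal EBM for one hemisphere, 10-degree latitude bands.

    Transport between bands is parameterized as D * (Tmean - T); a larger D
    mimics stronger advective heat exchange and flattens the gradient.
    """
    lats = [5 + 10 * i for i in range(nbands)]       # band-center latitudes
    # spherical area weight of each band: sin(top edge) - sin(bottom edge)
    w = [math.sin(math.radians(10 * (i + 1))) - math.sin(math.radians(10 * i))
         for i in range(nbands)]
    S0 = 1361.0 / 4.0                                # mean insolation, W/m^2
    # crude annual-mean insolation shape (2nd-Legendre fit, North 1975)
    s = [1 - 0.482 * 0.5 * (3 * math.sin(math.radians(l)) ** 2 - 1)
         for l in lats]
    alpha = 0.3                                      # fixed planetary albedo
    A, B = 203.3, 2.09                               # OLR = A + B*(T - 273.15)
    C = 2.0e8                                        # heat capacity, J/m^2/K
    dt = 86400.0 * 30                                # one-month time step
    T = [288.0] * nbands
    for _ in range(years * 12):
        Tm = sum(wi * Ti for wi, Ti in zip(w, T)) / sum(w)
        for i in range(nbands):
            net = S0 * s[i] * (1 - alpha) \
                  - (A + B * (T[i] - 273.15)) \
                  + D * (Tm - T[i])
            T[i] += dt * net / C
    return T

for D in (0.0, 3.8):     # no transport vs Budyko's classic ~3.8 W/m^2/K
    T = run_ebm(D)
    print(f"D={D}: equator {T[0]:.0f} K, pole {T[-1]:.0f} K, "
          f"spread {T[0] - T[-1]:.0f} K")
```

With D = 0 the spread is a ridiculous ~80 K; with D = 3.8 it drops to a much more Earth-like ~30 K, without touching the radiation scheme at all.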

Thanks for your thoughts. Though I am not sure if you were responding to what you perceived as a note addressed to you or as an independent observation. (My response No. 48 was to Jim Redden’s post No. 19.)

As to forest land recovery, you may be right regarding developed countries. However, lesser developed countries are doing their best to exchange their resources for economic growth. To this end millions of hectares are being denuded (slash and burn), with little effort to replant original species. In their stead either the topsoil erodes away, they plant rubber trees, bananas, or oil palms, or the land goes to housing and/or other agricultural purposes. I wish your observations held globally; however, based on the latest satellite analysis it appears that net global forest losses are averaging 2-3% per year.

(A note for the uninitiated: We have to keep in mind that while land makes up only about 28% of the Earth’s surface by area, most carbon cycle models suggest an actual land CO2 uptake on the order of 48% of total annual CO2 emissions. On the other hand, while the ocean surface is roughly 71% of the total surface area, the best carbon cycle estimates suggest its share of uptake is roughly 49% of total CO2 emissions. The main difference between the land and ocean carbon sinks is that the ocean offers a long-term, geologic carbon sink whose capacity can be measured in millions of years.)

I’m sorry, I digress; obviously there is more research to be done on the carbon cycle models. (IMHO, other things may need to be re-evaluated as well, like the possibility that cement production may actually be a near-net-zero emission.)

The main point I was trying to make to Mr. Redden was that the opportunity to recover CO2 has been negatively impacted by increasing human populations, as you noted. I suspect increased weathering would quickly effect a recovery were it not for the anthropogenic impact. (We have to keep in mind that the Earth was in the condition man found it due to entropy, and it would likely return to a similar condition quickly, based on the age/physics/chemistry of the Sol system, without man.)

Separately, what I was suggesting by the pumping of biomass underground was a test similar to “FutureGen”. It would be interesting to see if anthropogenic actions could make a statistical difference in the CO2 or surface temperature trend in the opposite direction for a change.

….pretty warm weather up here in the sub-arctic lately. I can almost hear the ice melting. Oh, that is the ice melting and I guess I’ve got a leak in the roof:-) The good news is the furnace has finally stopped running. Oh…never mind, there it goes now.

What is the expected sea level rise by 2400, 2500, 3000? Does anyone do long-term models, or best guesses? Any other items modeled that far out: what kind of global average temperatures by 3000? Could you point me to a readable lay summary? Thanks.

I realize that the year 3000 is outside the relevant policy time period, but as a matter of morbid interest – what sea level is expected by 3000? What global average temperature? Have any long term studies been done?

Nice post! I would like to ask whether CO2 emissions have decreased in the places that had high CO2 emissions? The whole world is watching the progress of these places. With all of these CO2 emission reduction campaigns, is there positive news that needs to be heard?

68 Thomas “a linear approximation over policy relevant values makes sense, then I would expect that this study does cover permafrost feedbacks”
Why would permafrost only partly defrost even at one latitude? Maybe at different latitudes it would do different things, but “the farther poleward you go the greater the warming” leads me to believe that all those permafrost things respond like a tipping point rather than linearly. Maybe you mean that “policy” is short sighted? OK, supposing the permafrost can melt just a little, and supposing 57 Dr Nick Bone is correct in saying that the series converges. That’s good except that 57 Dr Nick Bone says it converges to 8 degrees C of warming. 8 degrees C of warming is beyond the extinction point for Homo Sap. Not good.

70 Jim Bouldin: Thanks for your clarification. It isn’t your thesis; you are just reporting it. So this gamma is a variable, or at least a “variable constant.” Gamma is different now than it was then. And gamma isn’t related to the sensitivity even though their units are inverses. Both of those concepts bother me. Most of the time, constants are constant. Most of the time, inverses have physical meaning, and people in the Physics Department theorize by playing with the math. Since I don’t have a good enough equation for the variable constant, I’m not sure I should do anything at all with it.

[Response: I was simply trying to make the point that carbon dioxide can force, and be forced by, temperature. “Constants” can easily be scale dependent–happens all the time. An interest rate is not the same number if you calculate it over different time intervals. Jim]

On the other hand, it is interesting that somebody was able to find a gamma representing the variation in CO2 with temperature. Still, I’m not ready to assume that there is a nice equation that can assure me that Dr Nick Bone is correct about convergence. I think it is a better idea to go measure the size of the tundra peat bogs, etc.
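For what it’s worth, the standard convergence argument is easy to sketch: treat the T → CO2 → T loop as a geometric series with gain f = gamma × (warming per extra ppm). The numbers below are illustrative assumptions of mine, not Dr Nick Bone’s; the whole question is whether the real-world gain stays below 1 over the range we care about.

```python
import math

gamma = 10.0                  # assumed ppm of CO2 released per C of warming
sensitivity = 3.0             # assumed C of warming per CO2 doubling
co2 = 387.5                   # ppm
dT_per_ppm = sensitivity / (co2 * math.log(2))   # marginal warming per ppm

f = gamma * dT_per_ppm        # loop gain: extra C of warming per C of warming
direct = 2.0                  # some initial CO2-forced warming, C

# Partial sums of direct * (1 + f + f^2 + ...) approach direct / (1 - f),
# but only because f < 1; at f >= 1 the series diverges.
total, term = 0.0, direct
for _ in range(50):
    total += term
    term *= f
closed_form = direct / (1 - f)
print(f"gain f = {f:.3f}; series sum {total:.3f} C, "
      f"closed form {closed_form:.3f} C")
```

With these made-up inputs the gain is ~0.11 and the feedback amplifies 2 C of direct warming to ~2.25 C; a much larger total (like the 8 C mentioned above) requires either a much larger gain or a much larger direct term.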

Yeah, but I still don’t get it. The lag was optimised over that range, so the gammas presented are for the optimums, no? Lag variation is not part of the ensemble.

[Response: Yes, that’s my understanding of it.]

So was optimisation done once for the whole ensemble together, or individually for each member? Second one I think.

[Response: I would think it would have to be the latter, but am not sure]

Do they tell us somewhere what the (mean) optimum lag was?

[Response: That’s the one thing I was looking for more detailed info on and didn’t see. Seems important.]

“There was no Monte Carlo simulation involved: it was a full factorial experiment as far as I can tell.”

Yeah again. “Full factorial experiment” sounds rather grand, but of course that was only possible here because they predefined (sampled!) their key variables to only take certain discrete values (the smoothing spline lengths and calibration intervals). In engineering, the term M-C tends to get applied (rather loosely) to all such ensemble approaches, whether the sampling is strictly random, stratified random, or discrete and exhaustive (as here).

[Response: Does one ever test every single possible value within the range of a given factor? Of course not–it’s impractical to impossible. Monte Carlo refers to the random choice of values from a defined distribution–which is not what they did.]
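The distinction being drawn is easy to show in code. In this sketch the variable names and levels are invented for the demo, not taken from the paper: a full factorial design exhaustively visits every combination of predefined discrete levels, while Monte Carlo draws randomly from distributions over the same factors.

```python
import itertools
import random

spline_lengths = [10, 30, 100, 300]            # discrete predefined levels
calib_windows = [(1050, 1550), (1550, 1800)]
lags = range(-5, 6)

# Full factorial: exhaustive and deterministic; size = product of level counts
factorial = list(itertools.product(spline_lengths, calib_windows, lags))
print(len(factorial), "factorial runs")        # 4 * 2 * 11 = 88

# Monte Carlo: random draws from (here, continuous or discrete) distributions;
# no guarantee any particular combination is ever visited
random.seed(0)
monte_carlo = [(random.uniform(10, 300),
                random.choice(calib_windows),
                random.randint(-5, 5)) for _ in range(88)]
print(len(monte_carlo), "Monte Carlo draws")
```

Same ensemble size, very different sampling logic, which is why the two terms shouldn’t be used interchangeably.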

Just read the Elisabeth Rosenthal NYT A1 article “Skeptics Find Fault With U.N. Climate Panel”. The entire article is based on Pielke and Monckton, and casual readers are going to assume that there’s fire in all that smoke. Argh. I don’t follow Rosenthal closely. Is she always this bad? It’s not just that she treats, oh, Monckton as a serious critic; she doesn’t provide enough context for the reader to know that: 1. The smoke created by Pielke and Monckton is just smoke. Assuming every charge is true doesn’t change anything. 2. The charges are, generally, false. And, 3. Monckton, Pielke, and Co. have serious credibility issues themselves. Doh. The byline could have been Anthony Watts.

Personally I’d have thought that if a temperature rise caused a rise in CO2 then the rise in CO2 would be the same regardless of the cause of temperature – the oceans, for instance, don’t know where the temperature rise is coming from, just that it’s warmer.

Yeah, but the oceans “know” the partial pressure of CO2 in the atmosphere above them.
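And what the surface water “knows” is just Henry’s law: at a fixed partial pressure, colder water dissolves more CO2. A rough sketch, using standard textbook constants (the 0.034 mol/(L·atm) value and the 2400 K van ’t Hoff coefficient are from Sander’s compilation); note this is only the solubility piece, ignoring the carbonate chemistry that dominates real ocean uptake:

```python
import math

def henry_co2(t_kelvin):
    """Henry's law solubility of CO2 in water, mol/(L*atm),
    with a van 't Hoff temperature correction about 298.15 K."""
    k_298 = 0.034                  # mol/(L*atm) at 298.15 K
    return k_298 * math.exp(2400.0 * (1.0 / t_kelvin - 1.0 / 298.15))

p_co2 = 387.5e-6                   # atm, i.e. ~387.5 ppmv at 1 atm total
for t_c in (5.0, 15.0, 25.0):
    c = henry_co2(273.15 + t_c) * p_co2
    print(f"{t_c:4.1f} C: {c * 1e6:5.1f} umol/L of dissolved CO2")
```

At the same atmospheric pCO2, 5 C water holds nearly 80% more dissolved CO2 than 25 C water, which is the basic physics behind a positive temperature-CO2 feedback from the surface ocean.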