Good news for the earth’s climate system?

How much additional carbon dioxide will be released to, or removed from, the atmosphere by the oceans and the biosphere in response to global warming over the next century? That is an important question, and David Frank and his Swiss coworkers at WSL have just published an interesting new approach to answering it. They empirically estimate the distribution of gamma, the temperature-induced carbon dioxide feedback to the climate system, given the current state of knowledge of reconstructed temperature and carbon dioxide concentration over the last millennium. It is a macro-scale approach to constraining this parameter; it does not attempt to refine our knowledge about carbon dioxide flux pathways, rates or mechanisms. Regardless of general approach or specific results, I like studies like this. They bring together results from actually or potentially disparate data inputs and methods, which can be hard to keep track of, into a systematic framework. By organizing, they help to clarify, and for that there is much to be said.

Gamma has units of ppmv per ºC. It is thus the inverse of climate sensitivity, in the sense that the roles are swapped: T is the forcing and CO2 is the response. Carbon dioxide can, of course, act as both a forcing and a (relatively slow) feedback; slow at least when compared to faster feedbacks like water vapor and cloud changes. Estimates of the traditional climate sensitivity, e.g. Charney et al. (1979), are thus not affected by the study. Estimates of more broadly defined sensitivities that include slower feedbacks (e.g. Lunt et al. (2010), Pagani et al. (2010)) could be, however.

Existing estimates of gamma come primarily from analyses of coupled climate-carbon cycle (C4) models (analyzed in Friedlingstein et al., 2006), and a small number of empirical studies. The latter are based on a limited set of assumptions regarding historic temperatures and appropriate methods, while the models display a wide range of sensitivities depending on assumptions inherent to each. Values of gamma are typically positive in these studies (i.e. increased T => increased CO2).

To estimate gamma, the authors use an experimental (“ensemble”) calibration approach, analyzing the time courses of reconstructed Northern Hemisphere T estimates, and ice core CO2 levels, from 1050 to 1800 AD. This period represents a time when both high resolution T and CO2 estimates exist, and in which the confounding effects of other possible causes of CO2 fluxes are minimized, especially the massive anthropogenic input since 1800. That input could completely swamp the temperature signal; the authors’ choice is thus designed to maximize the likelihood of detecting the T signal on CO2. The T estimates are taken from the recalibration of nine proxy-based studies from the last decade, and the CO2 from three Antarctic ice cores. Northern Hemisphere T estimates are used because their proxy sample sizes (largely dendro-based) are far higher than in the Southern Hemisphere. However, the results are considered globally applicable, due to the very strong correlation between hemispheric and global T values in the instrumental record (their Figure S3, r = 0.96, HadCRUT basis), and also between ice core and global mean atmospheric CO2.

The authors systematically varied both the proxy T data sources and the methodological variables that influence gamma, and then examined the distribution of the nearly 230,000 resulting values. The varying data sources include the nine T reconstructions (Fig 1), while the varying methods include things like the statistical smoothing method, and the time intervals used both to calibrate the proxy T record against the instrumental record, and to estimate gamma.
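The ensemble idea is easy to sketch: enumerate every combination of data source and method choice, compute gamma for each, and examine the resulting distribution. Here is a minimal, hypothetical sketch in Python; the parameter names and counts are illustrative stand-ins, not the authors' actual lists, and the real grid is of course far larger (~230,000 combinations).

```python
from itertools import product

# Hypothetical stand-ins for the study's actual choices; names and
# counts here are illustrative, not the authors' exact parameter lists.
reconstructions = [f"recon_{i}" for i in range(1, 10)]   # nine T reconstructions
smoothers = ["spline_20yr", "spline_40yr", "spline_60yr"]  # smoothing choices
calib_intervals = [(1850, 1900), (1850, 1950), (1900, 1970)]  # proxy-instrumental calibration
gamma_intervals = [(1050, 1550), (1550, 1800), (1050, 1800)]  # period for estimating gamma

# Exhaustive grid: one gamma estimate per combination of choices.
combos = list(product(reconstructions, smoothers, calib_intervals, gamma_intervals))
print(len(combos))  # 9 * 3 * 3 * 3 = 243 in this toy grid
```

Each tuple in `combos` would then feed one regression of CO2 on calibrated T, and the spread of the resulting gamma values is the ensemble distribution.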

Some other variables were fixed, most notably the calibration method relating the proxy and instrumental temperatures (via equalization of the mean and variance for each, over the chosen calibration interval). The authors note that this approach is not only among the mathematically simplest, but also among the best at retaining the full variance (Lee et al., 2008), and hence the amplitude, of the historic T record. This is important, given the inherent uncertainty in obtaining a T signal, even with the above-mentioned considerations regarding the analysis period chosen. They chose the time lag, ranging up to +/- 80 years, which maximized the correlation between T and CO2. This was to account for the inherent uncertainty in the time scale, and even the direction of causation, of the various physical processes involved. They also estimated the results that would be produced from 10 C4 models analyzed by Friedlingstein (2006), over the same range of temperatures (but shorter time periods).
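For readers who want the mechanics, here is a minimal sketch (with synthetic data, not the actual reconstructions) of the two choices described above: mean-and-variance matching of a proxy series to an instrumental target, and picking the lag that maximizes the T–CO2 correlation. Everything below is an illustrative assumption, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def calibrate(proxy, instrumental):
    """Rescale a proxy series so its mean and variance match the
    instrumental record (the variance-matching approach noted above)."""
    scaled = (proxy - proxy.mean()) / proxy.std()
    return scaled * instrumental.std() + instrumental.mean()

def best_lag(t_series, co2_series, max_lag=80):
    """Return the lag (years) maximizing |corr| between T and lagged CO2."""
    best_l, best_r = 0, -np.inf
    n = len(t_series)
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            t, c = t_series[:n - lag], co2_series[lag:]
        else:
            t, c = t_series[-lag:], co2_series[:n + lag]
        r = abs(np.corrcoef(t, c)[0, 1])
        if r > best_r:
            best_l, best_r = lag, r
    return best_l

# Synthetic test: build a "CO2" series that lags "temperature" by 30 years.
t = rng.normal(size=500)
co2 = np.concatenate([rng.normal(size=30), t[:-30]]) + rng.normal(scale=0.5, size=500)
print(best_lag(t, co2))  # recovers the built-in 30-year lag
```

In the study itself, the regression slope of lag-optimized CO2 on calibrated T (per choice of interval, smoother, etc.) is what yields each gamma estimate.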

So what did they find?

In the highlighted result of the work, the authors estimate the mean and median of gamma to be 10.2 and 7.7 ppm/ºC respectively, but, as indicated by the difference between the two, with a long tail to the right (Fig. 2). The previous empirical estimates, by contrast, come in much higher, at about 40 ppm/ºC. The choice of the proxy reconstruction used, and the target time period analyzed, had the largest effect on the estimates. The estimates from the ten C4 models were higher on average; the empirical estimates are about twice as likely to fall in the models’ lower quartile as in their upper. Still, six of the ten models evaluated produced results very close to the empirical estimates, and the models’ range of estimates does not exclude those from the empirical methods.

Figure 2. Distribution of gamma. Red values are from 1050-1550, blue from 1550-1800.

Are these results cause for optimism regarding the future? Well, the problem with knowing the future, to flip the famous Niels Bohr quote, is that it involves prediction.

The question is hard to answer. Empirically oriented studies are inherently limited in applicability to the range of conditions they evaluate. As most of the source reconstructions used in the study show, there is no time period between 1050 and 1800, including medieval times, that equals the global temperature state we are now in; most of the period is not even close. We are in a no-analogue state with respect to mechanistic, global-scale understanding of the inter-relationship of the carbon cycle and temperature, at least for the last two or three million years. And no-analogue states are generally not a comfortable place to be, either scientifically or societally.

Still, based on these low estimates of gamma, the authors suggest that surprises over the next century may be unlikely. The estimates are supported by the fact that more than half of the C4-based (model) results were quite close (within a couple of ppm) to the median values obtained from the empirical analysis, although the authors clearly state that the shorter time periods the models were originally run over make apples-to-apples comparisons with the empirical results tenuous. Still, this result may be evidence that the carbon cycle component of these models has, individually or collectively, captured the essential physics and biology needed to make them useful for predictions into the multi-decadal future. Also, some pre-1800, temperature-independent CO2 fluxes could have contributed to the observed CO2 variation in the ice cores, which would tend to exaggerate the empirically-estimated values. The authors did attempt to control for the effects of land use change, but noted that modeled land use estimates going back 1000 years are inherently uncertain. Choosing the time lag that maximizes the T-to-CO2 correlation could also bias the estimates high.

On the other hand, arguments could also be made that the estimates are low. Figure 2 shows that the authors also performed their empirical analyses within two sub-intervals (1050-1550, and 1550-1800). Not only did the mean and variance differ significantly between the two (mean/s.d. of 4.3/3.5 versus 16.1/12.5 respectively), but the R-squared values of the many regressions were generally much higher in the late period than in the early (their Figure S6). Given that the proxy sample size for all temperature reconstructions drops fairly drastically back through the past millennium, especially before the 1550 dividing line, it seems at least reasonably plausible that the estimates from the later interval are more realistic. The long tail, i.e. the possibility of much higher values of gamma, also comes mainly from the later time interval, so values of gamma from say 20 to 60 ppm/ºC (e.g. Cox and Jones, 2008) certainly cannot be excluded.

But this wrangling over likely values may well be somewhat moot, given the real world situation. Even if mean estimates as high as, say, 20 ppm/ºC are more realistic, this feedback rate still does not compare to the rate of increase in CO2 resulting from fossil fuel burning, which at recent rates would exceed that amount within one to two decades.

I found some other results of this study interesting. One such involved the analysis of time lags. The authors found that in 98.5% of their regressions, CO2 lagged temperature. There will undoubtedly be those who interpret this as evidence that CO2 cannot be a driver of temperature, a common misinterpretation of the ice core record. Rather, these results from the past millennium support the usual interpretation of the ice core record over the later Pleistocene, in which CO2 acts as a feedback to temperature changes initiated by orbital forcings (see e.g. the recent paper by Ganopolski and Roche (2009)).

The study also points up the need, once again, to further constrain the carbon cycle budget. The fact that a pre-1800 time period had to be used to try to detect a signal indicates that this type of analysis is not likely to be sensitive enough to figure out how, or even if, gamma is changing in the future. The only way around that problem is via tighter constraints on the various pools and fluxes of the carbon cycle, especially those related to the terrestrial component. There is much work to be done there.

378 Responses to “Good news for the earth’s climate system?”

#148–Edward, you are off-base. It was not me who made substantive statements without experimentation, but G & T. My point was simply that logic is sufficient (in many cases at least) to diagnose a misapplied analogy.

Perhaps your sermon on the philosophy of science could be redirected to them?

As an aside, using natural or base 10 logs doesn’t really matter in the end (which is why I got to 1.41, too). It just changes the constant. Where a natural log gives you RF = 5.35*ln(C2/C1), a base 10 log gives you RF = 9.996*log(C2/C1), but you get exactly the same answer, so no worries there [the statement log(x)=ln(x)*(log(k)/ln(k)) is true for any k… sorry, my dad was a math professor, so I grew up liking that stuff].

[Response: OK, I figured it was along those lines, since you got the right value. The typo on the value of k threw me some.]
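For the record, the base-conversion point is easy to check in a couple of lines. The identity is ln(x) = log10(x)·ln(10), so the simplified forcing expression RF = 5.35·ln(C2/C1) becomes RF = 5.35·ln(10)·log10(C2/C1) ≈ 12.32·log10(C2/C1) in base-10 form (the 9.996 above appears to be the typo the response mentions). A quick sketch with an illustrative CO2 doubling:

```python
import math

C1, C2 = 280.0, 560.0  # a CO2 doubling, in ppm (illustrative values)

# Simplified CO2 radiative forcing expression, natural-log form:
rf_natural = 5.35 * math.log(C2 / C1)

# The same expression in base-10 logs; the coefficient becomes 5.35 * ln(10):
coeff_base10 = 5.35 * math.log(10)            # ~12.32
rf_base10 = coeff_base10 * math.log10(C2 / C1)

print(round(rf_natural, 3), round(rf_base10, 3))  # both ~3.708 W/m^2
```

Whatever base you pick, the answer is identical as long as the coefficient is converted consistently.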

Yes, I agree completely on having to make a lot of (too many) assumptions, in particular whether or not any CO2 feedback has yet been realized, what the time scale is, and what the actual feedback values were in the models…. although what I’m computing (committed temperature) is mostly independent of time scale (we might see that temp in 20 years, or 200 years — the equation only says what we’ll reach, not when).

[Response: Right, but if part of what one is observing in the CO2 record is due to relatively fast feedbacks from T changes, say a 10-year lag, your numerator includes that effect already. E.g. in 2010, at 388 ppm, let’s say that of the difference of 108 ppm from pre-industrial, 10 ppm is due to T-induced feedback. That is, the numerator should be 378 + 10 = 388, therefore a 10 ppm feedback, not a 0 ppm feedback. This will shift all your curves. Granted though, this only holds if the feedback is fast relative to the previous century’s T increase–jim]
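The split described in the response can be put in sketch form. The numbers below are illustrative (the response's example concentrations, the study's mean gamma, plus an assumed 0.8 ºC of industrial-era warming), not measured quantities:

```python
# Illustrative split of the observed CO2 rise into a direct anthropogenic
# part and a temperature-induced feedback part. All values are example
# numbers, not measurements.
preindustrial = 280.0    # ppm
current = 388.0          # ppm (circa 2010)
gamma = 10.2             # ppm per deg C, the study's mean estimate
warming_to_date = 0.8    # deg C, assumed industrial-era warming

feedback_ppm = gamma * warming_to_date          # portion attributable to T feedback
direct_ppm = (current - preindustrial) - feedback_ppm

print(round(feedback_ppm, 1), round(direct_ppm, 1))  # -> 8.2 99.8
```

On these assumptions, roughly 8 of the 108 ppm rise would already be feedback, which is the shift in the "numerator" the response is pointing at.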

The one exception is in determining feedback to date. But I’d be willing to bet (based on the mechanisms that would be involved) that we’ve seen negligible positive CO2 feedbacks to date, or we’ve nullified them with civilized behavior (forest clearing for wood products in place of forest fires), or perhaps there even would have been negative effects that we’ve also nullified the same way (e.g. increased plant growth and corresponding CO2 uptake has been restrained by annual harvesting and foresting of land).

[Response: Yes, I would tend to agree that the feedback CO2 has probably not been great so far. But land use change has not offset it; rather it has added to it, and in fact far outstrips it. One way to estimate gamma would be to use the info on time lags from the study, if we had it, and apply it to the industrial era. In fact that’s about the only way to estimate it, given the uncertainties in the carbon cycle’s fluxes–Jim]

But based on what I’ve read here since I did those computations, my takeaways are:

1) Time scales for CO2 feedback are probably longish. There could even be negative feedbacks in play right now (ocean uptake, increased plant growth) that will reverse and be replaced with longer term positive feedbacks (desertification/savanna-fication, etc.) that only arise when certain regional tipping points are reached (e.g. frequent and powerful droughts in the Amazon or central USA resulting in frequent and massive forest fires).

[Response: Hornet’s nest here. There definitely are strong negative feedbacks in play; the study is only trying to estimate the temperature-induced feedback, not to quantify all feedbacks. There may well be “shortish” lag feedbacks that show up unexpectedly, especially if there are major droughts in forested areas. Your overall point, though, is exactly the critical one: temperature will drive many of these longer term feedbacks. Hence, gamma will likely be higher, at some point, than the values given here–Jim]

2) The models are apparently not far from this study, so maybe I never should have plotted lines for overestimating feedback by 14 ppm/C or 21 ppm/C or 28 ppm/C. If the models are mostly in the 10 ppm/C range, and that’s where this study falls, then it doesn’t make sense to assume an “over estimation” much beyond 10 or 12 ppm/C (which would equate to zero or a possible slightly negative rather than positive feedback). It would be possible to go to any degree in the other direction, however (e.g. up to 30 ppm/C more CO2 feedback than modeled), but as I said already, that’s way too scary to even think about. It means we’re already committed to temperatures over 2C no matter what we do.

Anyway, I’ll take my bottom line result to be mostly accurate, then…

Each ppm/C more feedback than expected puts us one year closer to reaching the point of no return on hitting 2C, and puts the current temperature commitment 1/70 C higher.

Each ppm/C less feedback than expected puts us only one year further from reaching the point of no return on hitting 2C, and puts the current temperature commitment 1/70 C lower.

Which means that while this study is very interesting, the net impact of the unknowns on the climate (compared to anthro CO2) is marginal with respect to short term planning and timing. It only really matters if we’re so foolish as to let things get way out of control.

You are right Jim and I am wrong, but we’re actually not far from semantics. When you repeatedly sample a parameter space to drive a model and accumulate a result distribution, that is statistical simulation. All we’re talking about is the sampling scheme. Exhaustively working through a discrete grid (as here) has the advantage of completely eliminating stochastic sampling bias, but at the expense of possible systematic bias (operator chooses the samples). It’s also very inefficient, as you point out. Am I saying this was a poor choice here? Of course not.

BTW, one cannot apply this approach without “a defined distribution” for each parameter. The authors chose those, although they don’t express it that way. For example, they made each of the temperature reconstructions and each of the CO2 series equally likely (discrete uniform distributions). They made similar choices for the smoothing splines and calibration intervals (continuous uniform distributions – which have defined bounds). Again, poor choices? Of course not. But choices; yes.

This is a powerful technique, but I don’t think one should pretend that it does things it does not. There’s just the faintest whiff of that in the authors’ “…229,761 estimates for gamma” and, with respect, in your “full parametric experiment”. The results distribution necessarily depends on the parameter distributions. The degree of dependence here is not particularly high (as the supplementary material demonstrates), which gives some confidence in the robustness of the result.

[Response: I guess I’m still not sure what it is exactly that you think is being pretended here. Nobody’s pretending that these are iron-clad results wrt the future–jim]
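The grid-versus-sampling distinction in the exchange above is easy to make concrete. Below, a toy "gamma estimator" (an arbitrary made-up function, not anything from the paper) is evaluated once per grid cell, and then again by Monte Carlo draws from the same implicit uniform distributions; both target the same distribution, but the grid does it with no sampling noise:

```python
import itertools
import random

# Toy stand-ins for method choices; treating every entry in each list as
# equally likely is itself a (discrete uniform) distributional choice.
recons = [1.0, 1.2, 0.8]    # reconstruction effect (illustrative)
smooth = [20, 40]           # smoothing window, years (illustrative)
calib = [0.9, 1.1]          # calibration-interval effect (illustrative)

def estimate_gamma(r, s, c):
    # Arbitrary deterministic toy response surface, not the real estimator.
    return 8.0 * r * c + s * 0.01

# Exhaustive grid: every combination exactly once.
grid = [estimate_gamma(r, s, c)
        for r, s, c in itertools.product(recons, smooth, calib)]

# Monte Carlo: random draws from the same uniform distributions.
random.seed(42)
mc = [estimate_gamma(random.choice(recons), random.choice(smooth),
                     random.choice(calib)) for _ in range(10000)]

grid_mean = sum(grid) / len(grid)
mc_mean = sum(mc) / len(mc)
print(len(grid), round(grid_mean, 2), round(mc_mean, 2))
```

The two means agree to within sampling error; the grid's advantage is that its "error" is exactly zero, at the cost of evaluating every cell.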

I think it’s interesting to consider that it’s probably a gross oversimplification (one of many, I know) to label gamma as a linear value. It’s quite probably not (e.g. 1 C may give you 10 ppm, but the next 1 C might give you 20 ppm more if it comes too quickly on the heels of the first 1 C increase).

I’m rather afraid that one of the factors that we can’t easily quantify is that CO2 is very much a part of the cycle of living things, and living things, given time, would adapt and compensate at least to some small degree. But if it happens too fast for ecosystems to adjust (e.g. for plants better adapted to the changed environment to assume a CO2 mitigating role) the effects could be far worse.

I don’t suppose anyone is doing a study to itemize and parametrize the various potential specific, individual CO2 feedbacks in the system. It would be interesting to see a time-line with all of the “coulds” and their dependencies (e.g. if precipitation falls below X and temperature above Y, which might be expected sometime between years A and B, then forest M could add between R ppm and S ppm CO2 to the atmosphere).

[Response: Oh yes. Lots going on there, from all angles. You just don’t hear much about it. I hope I can help change that.–Jim]

No one could ever put that into a single prediction of any sort (although they could all be built into models, and I assume many are), but it would be nice to sort of see what the list is, what’s big and small, what is surprising and unexpected, and what would add up… and, God forbid, check them off as they come to pass.

[Response: Wicked problem for sure. Constraining the carbon cycle budget is critical. We’re playing with fire with our ignorance of it. It’s the BIG wildcard.–Jim]

No, I am just asking for reasonable projections (and ranges of projections) for the next 25, 50, and 100 years. The figure that I first commented had no indication of when that much melting would have occurred.

151 Kevin McKinney: OK. Whoever. The point is that whoever is completely new to science needs to start with experiments. Science is something you DO, not something you READ. So, to whoever is completely new to science: the way you tell who is telling the truth and who is telling lies is by doing an experiment personally. For people with degrees in music or English or whatever else that isn’t science, don’t be embarrassed to buy a chemistry [or other experiment] set from a toy store and start there. That is where the scientists started. Don’t worry about age.

You don’t need to know individually what all the forces on a log rolling downhill are. You can roll the log downhill (or check the results of past log rolling downhill, measuring the grade) and check the velocity of the log through its rolling passage.

That you don’t know the energy loss through the uneven surface of the rough-hewn log doesn’t mean that gravity doesn’t pull down with a force of 9.8 newtons per kilogram. That you measure that force as 10 N +/- 1 N from your log rolling measurements doesn’t mean that it’s outside that range because you didn’t account for a heavy clump of grass.

Yet dittos do exactly this for reconstructions of temperature sensitivity of CO2 changes.

BPL: With the proviso that it’s a moral philosophy directed toward one particular end, finding the truth about nature. Being a scientist does not make you a moral person–witness Nobel prize winner Philip Lenard denouncing relativity as “Jewish Science,” Werner Heisenberg trying to build an A-bomb for Hitler, Bill Shockley on race and IQ, etc.

Re: 145. I think you need to consider uptake of CO2. The way I see the 10 ppm/C line is as an equilibrium state, which we are now well above due to injecting fossil carbon. When above this line there should be a net uptake, and when below a net CO2 emission. The uptake rate can be estimated from knowing we only see about half the CO2 increase that is injected: given an observed rise of 2 ppm/yr, the uptake must also be about 2 ppm/yr. Whether the uptake rate is constant or in some way dependent on the distance from equilibrium is not easy to know, nor whether the 10 ppm/C line still applies in an increased carbon environment, but uptake is a first order term in this type of calculation that can’t be neglected.
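The picture described here can be written as a one-line rate equation, dC/dt = E − k·(C − C_eq) with C_eq = C0 + gamma·T. A minimal sketch with illustrative numbers; the uptake coefficient k is back-calculated from the ~2 ppm/yr uptake noted above, and none of these values are measured constants:

```python
# First-order sketch of the commenter's picture: CO2 relaxes toward an
# equilibrium line C_eq = C0 + gamma*T at a rate proportional to the
# excess above it, while emissions push it up. All values illustrative.
C0 = 280.0        # ppm, pre-industrial equilibrium at T anomaly 0
gamma = 10.0      # ppm per deg C (the "10 ppm/C line")
T = 0.8           # deg C anomaly, held fixed here
E = 4.0           # ppm/yr injected (roughly twice the observed rise)
C = 388.0         # ppm, current concentration

C_eq = C0 + gamma * T        # equilibrium concentration at current T
# Calibrate k from the observation that about 2 ppm/yr is taken up now:
k = 2.0 / (C - C_eq)         # per year

dC_dt = E - k * (C - C_eq)   # net growth rate, ppm/yr
print(round(C_eq, 1), round(dC_dt, 1))  # -> 288.0 2.0
```

Whether uptake really scales linearly with the distance from equilibrium is exactly the open question the comment raises; this only formalizes the assumption.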

I have BOINC running here on an offsite backup server. The box uses a little more electricity as a result, ~430 kWh/year for a cost of about $20. The “Al Gore is Fat” freaks will undoubtedly find it completely unacceptable, but on the other hand as long as the thing is installed on machines that otherwise would be running anyway I believe it’s a net benefit in terms of resources and of course it saves researchers a pile of cash.

BOINC allows you to choose your contribution from a smorgasbord of interesting projects.

I was responding to a post, and I asked a question to clarify the post. If someone suggests that there is possibly a 60m sea level rise, it is not unreasonable to ask that person when the sea level rise will occur.

Septic Matthew: engineers design for the top value in the range. Always. Then the civil and structural engineers I know (out of an excess of caution) multiply by two. Or five. Or as much as they can get away with, really.

And what sort of building only stands for a hundred years? Many engineers have to take far longer timescales into account (which includes once in a century storms and similar considerations).

Septic Matthew (172) & Didactylos (173) — The other two are in Britain and in California. While it is true that sea defenses have to take storm surge and so on into account, it seems the three groups all independently arrived at 1+ m as suitable, with those in California, I think, actually having a somewhat larger figure.

While I suppose engineers in Venice are also laying plans, so far I haven’t seen anything for New Orleans and around to Washington, D.C., where Foggy Bottom may well have to be renamed; SLR is not expected to be uniform, due to mass redistribution and consequent changes in geodesy.

I actually asked the same question about a 60 m rise some 2 years ago here, and the answers I got ranged from 2.5 m per century worst case to 5 meters per century worst case (based on paleoclimate research). Of course, we’re warming the world a lot faster, but still it takes a lot of time to melt ice, especially when winter still comes around each year to cool it down again. From these above rates, there could be a 60 m rise in 1,200 years, or by the year 3200. Or because we’re warming up the world faster, maybe less than this.

I was especially interested because I was writing a screenplay, HYSTERESIS, set in the future of a dying world that had nevertheless managed to invent time travel, and sent a soldier back to kill the fictitious U.S. president (so his better VP could take over and steer the country/world on a path to avert such future destruction). For cinematic effect, I wanted shots of DC in a 60 m rise (you’d see the top of the Capitol building and the Washington Monument), and used that timeline to set my story. BTW (don’t want to spoil the ending) but the hero fails to kill the president (a pacifist woman gets to him), with an interesting outcome.

Only problem now with my screenplay is that hysteresis isn’t the worst that can happen (100,000 years of extreme warming in which most of life on earth dies out, but then comes back slowly to eventually flourish again). According to Hansen the worst, which he says is likely if we burn all fossil fuels, including tar sands and oil shale, is the Venus syndrome, in which ALL life on earth dies out. See: http://www.columbia.edu/~jeh1/2008/AGUBjerknes_20081217.pdf

So anyway, if I rewrite my screenplay it will be under a different title: THE VENUS SYNDROME.

Some kind, yes. On one of these threads I read the use of “septic” as a put-down pun on “sceptic”, so I self-aggrandizingly took it as a name, like Yankees did with “yankee”, North Carolinians did with “tar-heel”, Indianans did with “hoosier”, and some others. And your name? I gather it means that you type with only two fingers, but you turn it into self promotion?

176, Lynn Vincentnathan: I actually asked the same Q about a 60 m rise some 2 years ago here, and the answers I got ranged from 2.5 m per century worst case to 5 meters per century worst case (based on paleoclimate research).

In most of the 20th century, the rate was 2mm/yr, but over the last 30 years might be higher at 3mm/yr.

174, Hank Roberts: Who are you attributing this to?

Take your own advice: check out post #107 by me.

Also, check out 63, by Hank Roberts. It’s a simple question, after all.

“Only problem now with my screenplay is that hysteresis isn’t the worst that can happen (100,000 years of extreme warming in which most of life on earth dies out, but then comes back slowly to eventually flourish again). According to Hansen the worst, which he says is likely if we burn all fossil fuels, including tar sands and oil shale, is the Venus syndrome, in which ALL life on earth dies out. See: http://www.columbia.edu/~jeh1/2008/AGUBjerknes_20081217.pdf”

Hansen’s hypothesis is nothing more than speculative crap with no empirical evidence. The earth cannot have a “Venus syndrome”.

Given that Jones has thrown Mann’s hockey stick under the bus and admitted to the MWP, and David Eyton, head of research and technology at British Petroleum (who currently fund the CRU), is investigating Jones: how much CO2 would it take to re-inflate the tires on the bus, if the Earth was Venus and you all shared at least one brain?

RC: http://www.mediafire.com/?trm9gnmznde to which 181 Richard Steckis points does indeed have a 2.4 MB pdf attributed to Dr. Hansen that says we are in danger of going Venus [page 24] if we burn all of the coal and for sure go Venus if we burn all possible fossil fuel.
RC: Please please do a huge article on this possibility. Is it really Dr. Hansen’s work?

A propos Miskolczi, and considering water vapour: any increase in (tropospheric) temperature (for whatever reason) will release water vapour (from cloud evaporation), leading to a further increase in temperature (GHG effect of water vapour). Since clouds are an inexhaustible supply of fresh water vapour, this effect should cause catastrophic runaway. It doesn’t. Why not?

Steckis, have you even bothered to read the critiques that utterly demolish Miskolczi? The only possible merit I could find in it was the most “creative” application of the Virial Theorem–utterly wrong, but creative. I am almost afraid to ask this, Richard, but pray, what merit does this dog turd of a paper have other than as an exercise for junior-level climate science students to find what’s wrong with it?

Actually, there is a point and purpose to Miskolczi’s paper. Now, he and his family will be treated to lavish vacations in exchange for unintelligible presentations at conferences sponsored by right-wing think tanks (an oxymoron if there ever was one.)

I concur, the eras around the Carboniferous M-P epochs may have been as close as the Earth has come in the past; however, that was when Sol was cooler (on the order of 14/15ths, estimated, of the current incoming energy). Given the lesser heat content and the high volume of water present, there would not likely have been the initial conditions that existed on Venus.

It is likely a terrestrial or extra-terrestrial event may have covered a wide portion of the Earth in two separate planet-wide events, eventually ending the Carboniferous era. (These events may have caused much of the die-off of the blue-green algae that participated in the massive oxidation of iron and sulfur in the Earth’s land and oceans. In essence, high carbon in the presence of high oxygen content may have been enough to prevent runaway heating during this time frame.) The resulting atmospheric chemistry was unlikely to support a Venus-like climate.

The question is what would happen if you had high carbon and higher sulfates from dying biologic decay, without the high oxygen content… similar to what we are seeing in the oceans today?

This postulate is refuted by observation; it was cloudy and 30F(-1C) last night here in North Carolina. Today, it is 33F(0.5C), not cloudy, and there is ~ 2in(5cm) of new snow on the ground. Clearly clouds are a finite(not “inexhaustible”) source of snow(not “water vapour”).

This thread about sea level rise of 10s of meters has been off-track for a while. Septic Matthew should go to post #92 (Susan Kraemer): “I realize that the year 3000 is outside the relevant policy time period, but as a matter of morbid interest – what sea level is expected by 3000? What global average temperature? Have any long term studies been done?” The question is clearly in reference to the year 3000. Septic Matthew writes in #107: “I thought that even a 20 foot sea level rise was only a typo, and that there was no reliable forecast of anything like that much. Are we talking here about the year 3010 or something?” So, yes, we should be “talking here about 3010 or something” and not going back and forth needlessly.

Ok. I’ll bite. What is the point? What are your qualifications for determining its veracity or do you just believe everything you read that purports to show that AGW can’t happen? I suspect the latter. Prove me wrong.

#177, Septic Matthew, & “In most of the 20th century, the rate was 2mm/yr, but over the last 30 years might be higher at 3mm/yr.”

Well, see, SM, this is where I’m sort of an expert. Have you ever had the duty of defrosting your SunFrost refrigerator (the one that saves 90% on electricity)? I thought so. It’s like this: the process starts out really slowly, then it starts speeding up, and toward the end it goes really fast. Furthermore, the paleoclimatologists have found that 2.5 m (some finding 5 m) per century (once the warming reaches a certain level, and the melting gets going in earnest) is what could happen…. at least under the slow warming conditions of the past. Bets are sort of off for this unique, very fast, lickety-split warming we are involved with.

The most rapid known rise in sea level occurred at the end of the last ice age during the so-called Meltwater Pulse 1A. I’ve seen various estimates, but a typical one is a 25 metre rise in 500 years, or 5 metres per century, 0.5 metres per decade. http://en.wikipedia.org/wiki/Image:Post-Glacial_Sea_Level.png

The actual forcing that triggered this was small compared to current climate forcings – initially only about 0.25 W/m^2 – we currently have 0.75 +/- 0.25 W/m^2 and rising. Somehow this initial trigger led to the build-up of GHGs and albedo changes (and more forcing) that resulted in this rapid rise in sea level. I don’t know exactly how soon after the trigger it occurred; I suspect it’s unknown.
You could theoretically calculate a maximum rate based on climate forcings and some albedo feedbacks, and an assumption of how much of the forcing went into melting ice. Let’s assume 100% of the forcing went into melting ice.

Let’s say 1 W/m^2 forcing – that’s 5*10^14 W over the globe.
The latent heat of melting for water is 334 J/g, or 3.34*10^17 Joules per km^3 of water equivalent – i.e. water volume once melted.
Over the whole globe over a year, 1 W/m^2 corresponds to:
1 W/m^2 * (500*10^6 km^2) * (1*10^6 m^2/km^2) * (3600 sec/hr) * (24 hr/day) * (365 day/yr) = 1.58*10^22 Joules
Dividing these two results gives:
1.58*10^22 Joules / 3.34*10^17 Joules/km^3 = 47,000 km^3 of melted water per year.
This is distributed over 350 million km^2 of ocean:
47,000 km^3 / 350 million km^2 = 0.13*10^-3 km/yr = 0.13 metres per year.
(For comparison each of Greenland and West Antarctica is estimated to be losing about 150 km^3 per year, for a combined sea level rise of a little under 1mm per year.)
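The back-of-the-envelope numbers above are easy to check in code, under the same assumptions: all of a 1 W/m^2 forcing goes into melting ice, the rounded 500 million km^2 globe, and 350 million km^2 of ocean.

```python
# Numerical check of the back-of-envelope melt-rate calculation above.
forcing = 1.0                        # W/m^2, assumed to go entirely into melting ice
earth_area = 500e6 * 1e6             # m^2 (the text's rounded 500 million km^2)
seconds_per_year = 3600 * 24 * 365

energy_per_year = forcing * earth_area * seconds_per_year  # joules

latent_heat = 334.0                  # J per gram of ice melted
joules_per_km3 = latent_heat * 1e15  # 1 km^3 of water = 1e15 g

melt_km3_per_year = energy_per_year / joules_per_km3

ocean_area_km2 = 350e6
rise_m_per_year = melt_km3_per_year / ocean_area_km2 * 1000  # km -> m

print(round(melt_km3_per_year), round(rise_m_per_year, 2))  # -> 47210 0.13
```

So the quoted ~47,000 km^3/yr and 0.13 m/yr follow directly from the stated inputs.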

So if all that 1 W/m^2 forcing went into ice melting it could raise sea level by 1.3 metres per decade or 13 m/100yrs. This is unlikely in itself as a good proportion of heat must go into warming the ocean, although this may stop if large amounts of ice spread through the oceans as ice sheets break up – creating an efficient way to transfer heat from ocean to ice, which incidentally is half of the key to the whole process – the other half being albedo changes on the ice itself (e.g. meltwater ponds, soot), which transfer heat straight to the ice.

If you wanted to speed things up for fictional purposes you could double the forcing due to albedo changes, e.g. loss of the polar sea ice and formation of large meltwater lakes on ice. You could then add release of CO2 and methane from permafrost. In addition, in some areas, especially West Antarctica, the ice won’t need to melt – it will just flow off the surface it rests on once enough lubrication and buoyancy is present – this could happen in a decade or less (5 metre rise). Almost as rapid collapse is possible in Greenland – 7 metre rise. The stable ice sheet is East Antarctica – equivalent to about a 55-58 metre rise.

In short, I’ve no idea of the maximum possible rate. I could believe extreme conditions could do it within a century, but a number of worst-case positive feedbacks would have to be present.

Now this is where we lay people diverge from the scientists — any scientist will tell you that we certainly will have a Venus syndrome (runaway warming) when the sun eventually brightens into a red giant (it is too small ever to go supernova).

Can it happen before that, is the question. According to Hansen (who really knows his science well), the sun is slowly but surely becoming brighter over the eons. IF you add that into consideration (and that during the earlier extreme CO2-warming scenarios the sun was less bright), AND you add in the fact that we are releasing CO2/CH4 into the atmosphere a lot faster than in past warming events, AND you consider that the slow negative feedbacks, such as weathering drawing down CO2, are happening too slowly to make much difference, AND you consider that Jim Hansen is a nice guy who doesn’t understand just how evil human nature is (that there are enough people out there who would actually speed up their GHG emissions just to spite others, despite economic harm to themselves — enough of those types to make a real difference), then you can see that the situation looks fairly bleak.