Wednesday, January 04, 2012

Another peek inside the settled science sausage factory: Tom Wigley on when the glaciers and small ice caps will melt: "the next step would be to try to get some realism here, but I really have no idea what would be realistic"

You are right that all the GSIC [glaciers and small ice caps] will melt even for zero warming from today. Our old model was better in that regard because it had a different eventual melt for different warmings. But this is a tricky thing for me even to conceptualize. My intuition says that if we stabilized temperature at today's level then not all of the GSIC ice would melt -- but how much would remain? It also seems reasonable that there is some warming amount that would ensure that virtually all the GSIC ice would melt --- 3degC, 5degC, ???

So the next step would be to try to get some realism here, but I really have no idea what would be realistic. ... However, I think what we have is fine out to 2400 for most cases. For zero warming from now, sea level rises at about 5cm/century initially and then more slowly later -- so it would take 1000 years or more for all the GSIC ice to melt. We never get far enough for things to look silly. This melt rate (in cm/century) is proportional to (0.15 + T(1990)), so the 0.15 has an important effect. Climate sensitivity has an effect too (in MAGICC) since this influences T(1990) -- a model artifact. (The TAR says that the 0.15 comes from Zuo and Oerlemans, which I have added to the refs.)
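The qualitative behaviour Wigley describes (an initial rate of about 5 cm/century that slows as the ice runs out, scaled by the 0.15 + T(1990) factor) can be sketched as a toy model. This is NOT MAGICC's actual equations: the 25 cm total meltable ice, the T(1990) = 0.6°C calibration point, and the assumption that the rate slows in proportion to remaining ice are all illustrative assumptions.

```python
def gsic_melt(centuries, total_cm=25.0, t1990=0.6, dt=0.01):
    """Toy GSIC melt model: cm of sea-level rise after `centuries`.

    Assumptions (NOT from MAGICC): 25 cm of meltable ice in sea-level
    equivalent, T(1990) = 0.6 C, an initial rate of ~5 cm/century at that
    temperature (scaling with the 0.15 + T(1990) factor Wigley mentions),
    and a rate that slows in proportion to the ice remaining.
    """
    r0 = 5.0 * (0.15 + t1990) / (0.15 + 0.6)  # initial rate, cm/century
    melted = 0.0
    for _ in range(int(centuries / dt)):
        rate = r0 * (1.0 - melted / total_cm)  # slows as ice runs out
        melted += rate * dt
    return melted

# Under these assumptions, roughly 13% of the ice remains after 10
# centuries, consistent with "1000 years or more for all the GSIC ice
# to melt". The 0.15 offset keeps the rate nonzero even as T -> 0.
```

Note the role of the 0.15 term: without it, zero warming would mean zero melt; with it, melt continues slowly regardless, which is why Wigley says it "has an important effect".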

From this [http://di2.nu/foia/foia2011/attachments/NewGSIC.doc] document:

[Tom Wigley] This method was only meant to be applied out to 2100, and it clearly gives unrealistic results if carried too far beyond 2100. The reason is that, with the quadratic correction factor, gs must have an upper bound. This occurs when dgs/dgu = 0 (i.e., gu = 40.09cm) at which point gs = 18.72cm. The available amount of GSIC ice that could melt is almost certainly greater than this. In any event, interpreting this number as the available amount of GSIC ice would be inappropriate, since the available amount of ice should be an independent input, not an artifact of an empirical correction for area changes.

For all credible future scenarios out to 2100, gu never approaches 40.09cm, so the upper bound for gs is never reached, and the TAR method can be applied with impunity. For stabilization scenarios, however, which may extend well beyond 2100, the TAR method can lead to ridiculous results, so some alternative must be developed.

Fortunately, because the TAR method is strictly empirical and gives results that are subject to considerable uncertainty, we can be fairly cavalier in devising an extension to it.
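The bound Wigley describes follows from the quadratic form itself. Assuming the correction takes the simple shape gs = a*gu - b*gu**2 (the form is an assumption here, not taken from MAGICC's code), the two conditions he states (dgs/dgu = 0 at gu = 40.09 cm, where gs = 18.72 cm) pin down the coefficients exactly:

```python
# Quadratic correction gs(gu) = a*gu - b*gu**2 (form assumed, coefficients
# derived from the two conditions Wigley states, not read out of MAGICC).
gu_max = 40.09   # cm: where dgs/dgu = 0
gs_max = 18.72   # cm: the resulting upper bound on gs

a = 2 * gs_max / gu_max      # from dgs/dgu = a - 2*b*gu = 0 at gu_max
b = gs_max / gu_max ** 2     # from gs(gu_max) = gs_max

def gs(gu):
    """Area-corrected contribution; only meaningful for gu <= gu_max."""
    return a * gu - b * gu * gu
```

Past gu ≈ 40 cm the parabola turns over and gs starts decreasing, which is exactly why the method "gives unrealistic results if carried too far beyond 2100": the uncorrected melt gu keeps growing in long stabilization runs while the corrected gs heads back down.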

3 comments:

That is a normal question for a scientist, theoretical or experimental, to ask: what is a reasonable answer, or what is a realistic range for a measurement? These guys were perhaps not really scientists so much as computer jockeys riding a difficult simulation, so they are asking the analogous questions: where should the computation land, and how should it be seeded with realistic initial values? In a way, he is admitting that hockey sticks are EASY to generate with these complex programs.

But where they erred was in selling a spiking hockey graph as realistic. In any other field, scientists would probably smell a rat looking at a graph like that.

HARRY_READ_ME was dealing with the same thing. It was not working out.

I really don't see the problem with this email. You are pointing at something taken completely out of context and arguing that what they say has some terrible underlying meaning. They're working on a simulation that projects events, saying that it works through 2100 but that they need something better for longer projections. No big deal.