Can 2°C warming be avoided?

Yesterday’s BBC article on the “Avoiding Dangerous Climate Change” report of the Exeter meeting last year carried two messages that have left some a little confused. On the one hand, it said that a stabilization of greenhouse gases at 400-450 ppm CO2-equivalent concentrations is required to keep global mean warming below 2°C, which in turn is assumed to be necessary to avoid ‘dangerous’ climate change. On the other hand, people are cited saying that “We’re going to be at 400 ppm in 10 years’ time”.

So given that we will exceed 400 ppm CO2 in the near future, is a target of 2°C feasible? To make a long story short: the answer is yes.

The following paragraphs attempt to shed a little light on why 2°C and 400 ppm are mentioned together. First of all, ‘CO2-equivalent concentration’ expresses the radiative forcing effect of all human-induced greenhouse gases and aerosols as if we had only changed CO2 concentrations. We use it as shorthand for the net human perturbation – it is not the same as real CO2 being at 400 ppm, because aerosols exert a substantial cooling effect. However, the other greenhouse gases, such as methane and N2O, add to the forcing and compensate somewhat for the aerosol effect. Thus the CO2-equivalent concentration is currently roughly equal to the real CO2 concentration.
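For readers who like to see the arithmetic, the conversion between concentrations and forcing can be sketched with the standard simplified CO2 forcing expression F = 5.35 ln(C/C0) W/m². The forcing numbers for the non-CO2 gases and aerosols below are illustrative round figures of my choosing, not values from the article:

```python
import math

# Simplified CO2 radiative forcing expression: F = 5.35 * ln(C / C0) in W/m^2
C0 = 278.0  # pre-industrial CO2 concentration, ppm

def forcing_from_co2(c_ppm):
    """Radiative forcing (W/m^2) of a CO2 concentration relative to pre-industrial."""
    return 5.35 * math.log(c_ppm / C0)

def co2_equivalent(total_forcing):
    """Invert the forcing expression: the CO2 concentration with the same net forcing."""
    return C0 * math.exp(total_forcing / 5.35)

# Illustrative (assumed, round-number) forcing budget in W/m^2:
f_co2     = forcing_from_co2(380.0)   # ~1.7 W/m^2 from real CO2
f_other   = 1.0                       # methane, N2O, halocarbons, etc. (assumed)
f_aerosol = -1.0                      # net aerosol cooling (assumed)

net = f_co2 + f_other + f_aerosol
print(round(co2_equivalent(net)))     # with these numbers, back to the real-CO2 level: 380
```

When the warming from the other greenhouse gases and the cooling from aerosols roughly cancel, as assumed here, the CO2-equivalent concentration equals the real CO2 concentration – which is the point of the paragraph above.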

The ecosystems on our planet are a little like a cat in an oven. We control the heating (greenhouse gas concentrations) and the cat responds to that temperature. So far, we have turned the controls to a medium level, and the oven is still warming up. If we keep the oven control at today’s medium level, the cat will warm beyond today’s slight fever of 0.8°C. And if we crank the control up a bit further over the next ten years and leave it there, the fever is going to get a little worse – and there is even a 1 in 5 chance that it could exceed 2°C.

Where does that probability come from? If we knew the real climate sensitivity, then we would know the equilibrium warming of our oven/planet if we left the CO2-equivalent concentrations at, say, 400 ppm for a long time. For instance, if the climate sensitivity were 3.8°C, such an oven with its control set to 400 ppm would warm 2°C in the long-term [1]. But what are the odds that the climate sensitivity is actually 3.8°C or higher? The chances are roughly 20%, if one assumes the conventional IPCC 1.5-4.5°C uncertainty range is the 80% confidence interval of a lognormal distribution [2]. Thus, if we want to avoid a 2°C warming with a 4:1 chance, we have to limit the greenhouse gas concentrations to something that is equivalent to 400 ppm CO2 concentrations or below. (Note, though, that one might want to question whether a chance of 4:1 is comforting enough for avoiding a fever of 2°C or more…).
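The "roughly 20%" can be reproduced in a few lines. This is just my sketch of the calculation implied in the text: fit a lognormal so that 1.5°C and 4.5°C are the 10th and 90th percentiles, then ask how much probability lies above 3.8°C:

```python
import math
from statistics import NormalDist

# Treat the IPCC 1.5-4.5 degC range as the 10th-90th percentiles (80% CI)
# of a lognormal distribution for climate sensitivity S, as in footnote 2.
lo, hi = 1.5, 4.5
z90 = NormalDist().inv_cdf(0.90)          # ~1.2816

# If S is lognormal, ln(S) is normal with these parameters:
mu = (math.log(lo) + math.log(hi)) / 2
sigma = (math.log(hi) - math.log(lo)) / (2 * z90)

def prob_sensitivity_above(s):
    """P(S >= s) under the fitted lognormal."""
    return 1.0 - NormalDist(mu, sigma).cdf(math.log(s))

print(round(prob_sensitivity_above(3.8), 2))  # 0.19, i.e. roughly 20%
```

So under this particular (and debatable) distributional assumption, there is about a 19% chance that sensitivity is 3.8°C or higher – the 1-in-5 odds quoted above.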

At the heart of the second statement (“We’re going to be at 400 ppm in 10 years’ time”) is the fair judgment that we seem committed to crank up concentrations not only to 400 ppm CO2 but beyond: anything less implies we would have to switch off our power plants tomorrow. Current CO2 concentrations are already about 380 ppm and they rose about 20 ppm over the last ten years.

Figure: A schematic representation of (a) fossil CO2 emissions, (b) CO2-equivalent concentrations, and (c) global mean temperature for two scenarios: Firstly, an “immediate stabilization” which implies rising CO2-equivalent concentrations up to around 415 ppm in 2015 and stable levels after that (red dashed line). This scenario is clearly hypothetical as the implied emission reductions in 2015 and beyond would hardly be feasible. Secondly, a peaking scenario (green solid line), which temporarily exceeds and then returns to a 415 ppm stabilization level. Both scenarios manage to stay below a 2°C target – for a climate sensitivity of 3.8°C or lower. This is roughly equivalent to a 4:1 chance of staying below 2°C [3].

Indeed, avoiding concentrations of – say – 475 ppm CO2-equivalent (see Figure a) will require quite significant reductions in emissions. However, as long as we reduce emissions decisively enough, concentrations in the atmosphere could fall again towards the latter half of the 21st century and beyond. The reasons are the relatively short lifetimes of methane, nitrous oxide and other greenhouse gases, and some CO2 uptake by the oceans (as discussed here).

Now, what is going to happen to our cat, if we turn up the heat control of our oven to about 475 ppm and then reduce it again? If we react quickly enough, we might be able to save the cat from some irreversible consequences. In other words, if concentrations are lowered fast enough after peaking at 475 ppm, the temperature might not exceed 2°C. Basically, the thermal inertia of the climate system will shave off the temperature peak that the cat would otherwise have felt if the oven temperature reacted immediately to our control button.

Thus, to sum up: Even under the very likely scenario that we exceed 400 ppm CO2 concentrations in the very near future, it seems likely that temperatures could be limited to below 2°C with a 4:1 chance, if emissions are reduced fast enough to peak at 475 ppm CO2 equivalent, before sliding back to 400 ppm CO2-equivalent.

Peaking at 475 ppm CO2-equivalent concentrations and returning to 400 ppm certainly comes at a cost, though: our oven will approach 2°C more quickly compared to a (hypothetical) scenario where we halt the build-up of CO2 concentrations at 400 ppm (see Figure c). Thus decisions would need to be different if we care more about the rate of warming than about the equilibrium. In fact, some emission pathways and some models also suggest that peaking at 475 ppm CO2-equivalent and returning to 400 ppm might even slightly decrease our chances of staying below 2°C (see Chapter 28 in the DEFRA report). Depending on the actual thermal inertia of the climate system, the peak temperature corresponding to 475 ppm might be very close to the 400 ppm equilibrium temperature. This points to a more fundamental issue: rather than discussing ultimate stabilization levels, it might be more worthwhile for policy and science to focus on the peak level of greenhouse gas concentrations. By the time we managed to peak concentrations, we could still decide whether we want to stabilize at 400 ppm or closer to pre-industrial levels. We will most likely be able to make wiser decisions in the future, given that we will certainly have learnt something about the cat’s behavior during its current fever.

1. The equilibrium warming dT can be easily estimated from the CO2-equivalent stabilization level C, if one knows the climate sensitivity S, with the following little formula: dT = S * ln(C/278 ppm) / ln(2)
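The little formula in footnote 1 is easy to check numerically; with the 3.8°C sensitivity used above, a 400 ppm CO2-equivalent stabilization indeed equilibrates at almost exactly 2°C:

```python
import math

def equilibrium_warming(c_ppm, sensitivity):
    """Footnote 1: dT = S * ln(C / 278 ppm) / ln(2),
    with S the equilibrium warming per CO2 doubling."""
    return sensitivity * math.log(c_ppm / 278.0) / math.log(2)

print(round(equilibrium_warming(400, 3.8), 2))  # 1.99 degC, i.e. essentially 2 degC
```

By construction, a doubled concentration (556 ppm) returns the sensitivity itself, which is a quick sanity check on the formula.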

2. This is of course a probability that merely reflects the uncertainty in our knowledge. The climate sensitivity is not random, it is just unknown. It expresses our degree of belief in that outcome, and is of course subject to future modification in the light of new evidence.

3. The depicted peaking scenario is the EQW-S475-P400 scenario as presented in Chapter 28 of the DEFRA report. The “combined constraint” (see also Chapter 28) has been chosen to find aerosol forcing and ocean diffusivity values for a 3.8°C climate sensitivity, which allow an approximate match to historic temperature and ocean heat uptake records. The historic fossil CO2 emission data is taken from Marland et al., the CO2 observations are from Etheridge et al. and others as given here and here, and the temperature observations and their uncertainties are from Jones, Folland et al. The simple climate model that was used is MAGICC 4.1 as in Wigley and Raper (2001 – see here).

68 Responses to “Can 2°C warming be avoided?”

Can 2°C warming be avoided?
Oh definitely: considering that transient climate sensitivity is about 1 K/2xCO2 and that the sink is proportional to concentration, in the most likely emission scenario of intensified nuclear power generation combined with new technology the warming won’t even reach 1 degree, without much effort.

Does this take into account the positive feedbacks, some of which are already being triggered? Someone made the argument in the last piece that if Arctic ice is melting now, at current temperatures, it would be expected to continue melting (though at a slower rate than if we really jacked up the heat). Same with permafrost. So there’s the reducing albedo & melting methane clathrates to consider, even at current temps. Also, I understand that at some point of warming, flora might actually become so heat-stressed (or killed) that even with increasing CO2 “fertilization” their net global CO2 intake might start decreasing.

You wrote: “But what are the odds that the climate sensitivity is actually 3.8°C or higher? The chances are roughly 20%…”

I’m thinking 20% is worse than playing Russian roulette with a six-shooter.

Considering that the world has warmed 0.8 degrees C from an increase in CO2 of 280ppm to 380ppm, the transient sensitivity for 2xCO2 would be 1.8 degrees C. Unless, of course, the denialists can come up with an enormous mysterious heat source that no-one has yet identified. (BTW, during the current transient, the oceans are taking up nearly half the CO2 thermal forcing.)

“Considering that the world has warmed 0.8 degrees C from an increase in CO2 of 280ppm to 380ppm, the transient sensitivity for 2xCO2 would be 1.8 degrees C. Unless, of course, the denialists can come up with an enormous mysterious heat source that no-one has yet identified.”

“[Response: One impact of the PETM was an extinction of calcifying organisms in the ocean. The amount of carbon released has recently been estimated to be comparable to the fossil fuel inventory, 5000 Gton C, based on the dissolution of CaCO3 on the sea floor. Recovery of the temperature took 100,000 years or so. The analogy is apt. David]”

Looking at (a) in http://www.realclimate.org/co2_stab.jpg shows a peak in CO2 emissions in the year 2010. Thus I take the article to mean that unless there are massive reductions in CO2 emissions beginning in the year 2011, then the chance of avoiding a greater than 2 degrees C increase falls below 80%. Everyone can make their own judgement about the likelihood of massive reductions of CO2 emissions beginning in 2011.

> By the time that we managed to peak concentrations we could still decide
> whether we want to stabilize at 400 ppm or closer to pre-industrial
> levels.

I agree with this. I think we should target 300 ppm as the long term upper bound for CO2 – we know this works for the Earth system. Any higher and we are playing Russian roulette again.

We need to start taking large amounts of carbon out of the air. One very good way to get this going with positive environmental effects, if managed properly, is to grow biomass, then char it and mix the elemental carbon in large quantities deep into soil as ‘terra preta’ or Amazonian dark earths. This has major soil-conditioning properties (i.e. reducing conventional fertiliser need by 50%): http://www.css.cornell.edu/faculty/lehmann/terra_preta/TerraPretahome.htm

The question of positive feedback is interesting to me as we are seeing some in Northern New England this year. Even though we’ve gotten almost 40″ of snow since October, with all the rain and record warmth we’ve gotten, much of this area has been snow free for January.

The normal January diurnal temperature range here seems to be about 20 deg, but with the lack of snowpack we’re seeing 30 to 40 degree ranges. If we have less snowpack over the Northern Tier of the US on a permanent basis, I’d bet high temperatures, especially towards February, will be more than 2 deg C higher than past normals just through solar warming of dark surfaces.

All the models are statistical models naturally but does anyone know of any comparisons with other public models. For example, we as a society take action on smoking because it decreases the probability of lung cancer by ‘x’, or we reduce asbestos use because it decreases the risk of asbestosis by ‘y’.

I agree with Lynn that the odds being used look pretty poor even for someone who wishes to risk suicide and Philip Sutton’s point is OK but why 300ppm?

This is a scientific question with I suspect a scientific answer which has been dealt with elsewhere but I would rather not make decisions about my children’s and grandchildren’s future on the basis of one UK Gov adviser and one UK Gov minister who each decided that doing certain things were politically impossible. For whom I ask?

If zero growth and a massive reduction in the burning of fossil fuels improves our survival chances by ‘z’ then where can I get the numbers please anyone?

Point 11 begins with an error. “All the models are statistical models naturally.”

The models generally used to project climate into the future are physical models, not statistical models.

That said, the question is well taken. Fuzzy information is better than no information, so we ought to be able to make some more formal estimates of what the best course of action would be. The models of what we ought to do rather than on how the climate will behave really ought to be statistical.

However, there are assumptions held by certain fields that may be inappropriate. Economists in particular apply a “discount rate” to what they consider rational decision making. This may bias the results in favor of time scales less than a decade.

Only monetized resources are considered by economists. A policy leading to the end of all multicelled life forms in a thousand years would be evaluated as costing as much as the expected total wealth of humanity deflated over a thousand years. In this extreme hypothetical case, the presumption that you could make up the difference by investing in alternative growth portfolios is obviously absurd.

What seems to be at issue is the changes we are committing the world to for decades, centuries, perhaps even millennia into the future. What are the costs of action and inaction, and where is the optimum behavior based on current knowledge?

We don’t actually have any agreement on how to set that problem up formally. This is why I think that the numbers that Eachran seeks don’t exist.

Whilst the complexity of the argument here is impressive, would it be possible for you to prepare a “dummies guide” to key facts? I understood (after about an hour of skimming posts) the following from this site.

1. Your accepted mid range reasonable scenario is ~3.6 deg C of warming over the next 100 years.

2. Human activity is responsible for ~30% of the rise in CO2 which in turn means humans are responsible for ~3-8% of the average prediction of ~3.6 deg C noted above?

3. Urban heat effect on measuring points have been considered and factored out of climate predictions.

4. The hockey stick has been validated by more than 1 group of climate scientists.

A separate short summary of key facts would be appreciated. thanks :)

[Response: 1) or thereabouts in the absence of any reductions in emissions because of concerns about climate, 2) first half yes, second half no. (correction): Humans are responsible for all of the 30% rise in CO2, and all of the projected 3.6 deg C (or whatever) increase is from human-related forcings (since we cannot predict changes in solar or volcanic activity in the future). 3) mostly yes., 4) yes. Your idea for a summary post is a good one though. – gavin]

Thanks, Michael (point 12). I understand the point about physical models, but their fits to reality are statistical and there are also a number of models. It was in that sense that I was using the word – sorry not to be precise.

I understand cost benefit analysis as do probably all of the site visitors but I get no “feel” (sorry, not a scientific term) for it here. You know the sort of thing: a pre-funeral funeral party paid for out of one’s remaining assets, or investment in new towns on higher ground, barrages for tidal and tempest surges, mass population movements, the expansion of agriculture on fertile but currently unworked land, habitable protective bubbles, the expansion of public communication networks and a never ending list of changes to the way we live together. From an economic point of view, with discount rates currently at a low level, perhaps the calculation is not too horrifying: except what is the price of extinction? Perhaps we should just do a back of the envelope, ballpark calculation.

If I read 300ppm or 400ppm and our scientist friends tell us that’s OK I would like to know in what sense it is OK and why the decision to become decision makers on the “magic” number.

If the Egyptians could build the pyramids then I am pretty sure that we can crack this particular problem. There are no limits (literally) to human creativity as well as no limits to our destructive power except extinction, and I am not too happy about following that particular route.

I think Gavin either misread point 2 or saw through a poor expression to the real point: (2) states “Human activity is responsible for ~30% of the rise in CO2” and Gavin agrees for the first half of the century.

I think Gavin is describing the temperature trend, not the CO2 trend and sagenz is incorrect. 100% of the CO2 rise is the result of human activities, in fact there is less CO2 in the air than we expect due to some natural (and AIUI unknown) sinks that are ameliorating things a bit.

What seems to be at issue is the changes we are committing the world to for decades, centuries, perhaps even millennia into the future. What are the costs of action and inaction, and where is the optimum behavior based on current knowledge?

If I may add to Michael’s comment without appearing to disagree, it is not only the changes to the natural world we are causing, but also to an important component of the natural world – the human built environment.

Societies have a lot of momentum and thus take a long time to turn. The longer we quibble about action, the more resources we commit to societal infrastructure that may be inappropriate in a low-carbon future; this may necessitate further action to ameliorate, at additional cost.

So when we talk about future costs, we must consider the additional cost to society in the action delay. An honest accounting will include a factor for infrastructure replacement due to the delay. Optimum behavior will include nonmonetized accounting for equity as well as monetized accounting for new infrastructure.

Re: #13
I second the idea of a “summary post”! Great for giving us interested laypeople a quick and easy overview of what the consensus understanding currently is across the breadth of climate science. Ideas:

1. Based on an updated and highly simplified IPCC Assessment?

2. With a pointer as to the robustness of the conclusions?

3. Summary of changes expected in various parts of the earth according to climate models, linked to observed changes occurring now (e.g. high latitude warming)

4. Simple “myth-buster” section.

5. How do new studies fit in and how much do they shift/strengthen/weaken the “consensus” in the relevant area?

6. Specific links for more detailed information

I know it sounds demanding, but I really am talking about simple, reliable information, as shown in #13 Response. A “one stop shop” would be infinitely easier than trawling the net trying to pinpoint information. I’m sure such a facility would be welcomed by a whole new section of the public. Its strength would be, that as truly representative of the understanding of climate science community, it could be used as a reliable reference with confidence – badly needed now that self-appointed climate “experts” seem to be two-a-penny!

On the subject of myth busting, particularly the myth that it’s all down to human activity, here’s a link. Not about CO2, but I think it might be recommended reading for all catastrophists: http://www.royalsoc.ac.uk/page.asp?id=1553
It’s the Royal Society – and what would they know that doesn’t fit into your average eco-warrior’s agenda.

> Does this take into account the positive feedbacks, some of which are already being triggered? So there’s the (1) reducing albedo & (2) melting methane clathrates to consider, even at current temps. Also, (3) I understand that at some point of warming, flora might actually become so heat-stressed (or killed) that even with increasing CO2 “fertilization” their net global CO2 intake might start decreasing.

The brief answers here are: (1) Yes, (2) no and (3) to some degree.

(1) Firstly, reducing albedo from changing ice and snow cover is summarized, with a lot of other feedbacks, in the (uncertain) “climate sensitivity” parameter of the simple climate model used. If it turns out that the albedo decreases more than previously anticipated due to retreating ice and snow cover in a warmer world, then this would point towards a higher likelihood of a higher climate sensitivity. Thus, the forcing of a 400 ppm CO2-equivalent stabilization would lead to a 2°C warming with a higher probability.

(2) Methane releases from ocean or permafrost methane hydrates are not taken into account in this simple model calculation. Obviously, it would be worthwhile incorporating this effect in the future. Basically, what it would mean is that we would be allowed to emit even less ourselves, in order to offset the extra methane emissions from the warming oceans and/or the thawing permafrost – assuming we wanted to follow the same concentration pathway. However, the main reason why this effect hasn’t been incorporated so far is that it seems a little hard to come up with a (probability density) function for how much extra methane emission from hydrates can be expected for a given warming (see the blog post on methane hydrates: http://www.realclimate.org/index.php?p=227). Work in progress, so to say.

(3) The temperature feedback on the carbon cycle is included in the above calculations to some degree, by simply having assumed the default values as in IPCC TAR. As for the methane releases, a changed assumption about this effect would not make a difference to the shown relation between a 400 ppm stabilization and a 2°C equilibrium temperature. However, it would make a difference with respect to how much greenhouse gas we can emit while following a particular concentration pathway.

> What evidence is there that the calculations of climate sensitivity can be described by a lognormal distribution?

In short, there is not much – other than the fact that the skewed shape of a lognormal distribution, with its longer tail for higher values, roughly represents the shape of different – observationally constrained – uncertainty distributions for the climate sensitivity S. There is also a more fundamental reason for such a skewed shape (not specifically for a lognormal, though): observables, such as our transient 20th century temperature observations, that we can use to constrain climate sensitivity scale approximately with 1/S, not S (see presentation by Myles Allen here http://www.stabilisation2005.com/day2/allen.pdf ). Thus, if our observables and their uncertainties are approximately distributed like a symmetric normal distribution, then our likelihood estimate for climate sensitivity will necessarily be skewed with a long tail for high climate sensitivities. See differently constrained climate sensitivity PDFs which all share the same basic shape, e.g. in figures 28.1 or 29.1 of the DEFRA report – http://www.defra.gov.uk/environment/climatechange/internat/dangerous-cc.htm ).
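A toy Monte Carlo makes the 1/S argument concrete (my illustration, with made-up numbers, not a calculation from the report): if an observable proportional to 1/S has symmetric, normally distributed uncertainty, inverting each sample gives a distribution for S that is skewed with a long tail toward high sensitivities, so its mean sits above its median:

```python
import random
import statistics

random.seed(0)

# Suppose an observable y scales like 1/S, and our measurement of y has
# symmetric (normal) uncertainty. Invert each sample to see what that
# implies for S. All numbers here are illustrative only.
true_S = 3.0
y_mean, y_sd = 1.0 / true_S, 0.08

samples = []
for _ in range(100_000):
    y = random.gauss(y_mean, y_sd)
    if y > 0:                      # discard unphysical non-positive draws
        samples.append(1.0 / y)

med = statistics.median(samples)
mean = statistics.fmean(samples)
print(mean > med)  # True: the long tail toward high S pulls the mean above the median
```

The same mechanism is why the observationally constrained sensitivity PDFs cited above all share a fat upper tail, whatever their exact functional form.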

By the way, there is another danger in defrosted permafrost besides the release of methane hydrates. If all the permafrost in Russia and Canada were to dry out, the peat it is composed of would be subject to catching fire, and virtually impossible to extinguish. Burning it all would double the amount of CO2 in the atmosphere. According to “The Great Global Experiment,” an article published by Harvard Magazine covering research by various professors on global warming:

In boreal forests, growing seasons are short, there is very little rainfall, nutrients are few, and the trees don’t get very big. But underlying the forest there is a lot of peat, the remains of moss that has been building up for 5,000 to 7,000 years. The moss grows slowly, but it accumulates because the soils tend to be saturated with water or to be frozen as part of a discontinuous permafrost. “We call this a cold desert,” says Wofsy, “because it gets only about 11 inches of rainfall a year, but the climate is cold enough that evaporation is even less.” The combination of cold and wet preserves the peat. Recently, however, the climate has warmed in this area, and there is a good chance that the peat is no longer stable over the long term.

Why should that matter? “If you took all the peat in Canada and Russia and turned it into CO2” by burning, Wofsy says, “you would double the amount of CO2 in the atmosphere. It took 5,000 years to make it, but it doesn’t take much to get rid of it,” because it can catch fire. Intense conflagrations burn all of the dried peat in a forest. “Usually, just a foot or so of peat is dry enough to burn, but if all two meters dry out as climate warming trends continue,” he cautions, “the full accumulation could be released to the atmosphere over perhaps 50 years.”

” … prehistoric coal fires occurred predominantly during the Pliocene-Pleistocene interglacial periods (2.5-1.8 Ma), in the mean and high latitudes of Asia and North America. These fires occurred in areas characterized by rugged relief and steeply dipping coal-bearing strata. Combustion in the presence of fuel and oxygen occurred within 500 m of the surface. Prior to the Quaternary, numerous fossil-fuel horizons were isolated from atmospheric oxygen. Subsequently, tectonic activation and uplift resulted in caustobiolith oxidation and combustion. Pyrogenic landscapes, up to 1000 km2 in area, are a distinctive feature of many coal and oil basins.”

If you create a one-pager for the layman on global warming, you might also want to include information on possible solutions. Nothing makes people stick their heads back in the sand faster than feeling like there is no solution anyway.

There is no one propagating a “myth” that *all* climate change is of human origin as #20 suggests. This is a pretty weak strawman argument, though surprisingly common. It is (often quite rudely) expressed something like “where were the SUVs at the PETM, you (ascription of malign political motives) (nasty epithet)”?

There is a sort of provocative statement lurking in the referenced article, otherwise a rather poorly organized journalistic piece about a paleontologist.

A new line of thinking, the Astronomical Theory, suggests that long-term climate change is more likely to be driven by the Earths position in relation to the sun, rather than the human activity often linked to current global warming.

This sentence propounding the “Astronomical Theory” as some sort of alternative hypothesis to anthropogenic global warming was written by someone who had never heard of Milankovic, not by an actual paleoclimatologist. Interested novices are referred to http://www.ncdc.noaa.gov/paleo/milankovitch.html . You will see that the “new line of thinking” dates to 1920.

The sentence should have been edited to read something like “Long-term climate change is believed to be largely driven by shifts in the Earth’s orbit around the sun, although human activity is often linked to the much more rapid contemporary global warming.” The confused reporter didn’t get the connection, which might have been something like “However, these large past shifts provide independent tests of the validity of climate models, and so have a direct bearing on the aspects of climate science that are of great policy interest.” However, this is all rather off topic for Dr. Schreve’s work, and should probably not appear in the bio at all.

‘Therefore, the probable effect of human-injected carbon dioxide is a miniscule 0.12% of the greenhouse warming’. That’s just 0.03*0.0365 of course – but even that is calculated wrong (it should be 0.11% by my calculator). But from our numbers, it would be between 3 and 8%.

How does that statement square with your response above that human activity is responsible for all of the 3.6 deg rise?

Now I definitely need that summary.

[Response: That was 3 to 8% of the total greenhouse effect today (i.e. of the ~33 deg C) – the responsibility for the rise in the future projections will be 100% human. -gavin]

Re 27: ‘There is no one propagating a “myth” that *all* climate change is of human origin”

There was no one, who knew anything about it, propagating the myth that we could be hit by Iraq’s weapons of mass destruction or that Iraq was well on the way to building a nuclear weapon. What the “experts” and governments did was to keep quiet when the media and alarmists said it was so.

No one, whose only input on this subject is the papers, radio and TV, and that must be the majority, hears it said very often that the consensus is that a significant part of warming is natural. Nor are the huge uncertainties emphasised.

Back in #14, Eachran said “If the Egyptians could build the pyramids then I am pretty sure that we can crack this particular problem.”

Actually that is an appropriate analogy because they were not in the main built by slave labour but by a willing population convinced of the need to do it by the best “experts” of the day, who to be fair were amazingly advanced in many areas of science and engineering. Most of us now know they were wrong not only in the basic idea but also in a few of the details, such as tossing out the brain as an unimportant part of our anatomy.

Incidentally sea levels were likely a metre or so higher in those days.

[Response: Two points. Let’s not get involved in arguing about analogies – stick to the actual case at hand. And secondly, global sea level was not a meter higher 4000 years ago. -gavin]

Post 17: many thanks Steve Bloom for identifying the one-page source (not really one page, but not far off). It is very well written and has lots of useful links, including on the economics of climate change by the Pew Centre, which helps me with my previous questions.

I think that Stanford along with wikipedia, this site, the IPCC reports, the links, and the usual well informed journals provide sufficient information for me to make informed judgments.

The problem I have with another one pager is that one merely re-hashes someone else’s work which with a little effort is not too difficult to understand in any event.

I do think that for informed opinion, and as recommended by the person at Stanford, we need to get some numbers for the costs/benefits of adaptation and doing nothing to prevent CO2 emissions, and in addition the costs/benefits of doing something like returning the CO2 levels to their pre-industrial state, and maybe also another somewhere in the middle. Policy makers are unlikely to do anything without numbers. Just like Mr Hansen, who in one of his papers admits to being a ‘data’ person. I am too – but I am not a policy maker except for myself and sometimes my family and friends.

I think that for policy makers the problem seems so vast and irresolvable – they are not, in general, scientists or mathematicians versed in model making. They don’t have the background to make rational decisions on complicated papers on climate change. It isn’t really their scene.

No one, whose only input on this subject is the papers, radio and TV, and that must be the majority, hears it said very often that the consensus is that a significant part of warming is natural. Nor are the huge uncertainties emphasised.

I said that “There is no one propagating a “myth” that *all* climate change is of human origin.”

Global mean temperature rise since 1950 is not *all* climate change.

It *is* likely that all or nearly all the global mean temperature increase from 1950 is anthropogenic, and it is nearly certain that at least the dominant part is anthropogenic.

This repeats the confusion between the entire climate record and the recent change, which seems to be emerging as a standard technique of obfuscation.

It’s never easy to tell who are the confusers and who the confused, so I try not to blame individuals, but at best #31 shows signs of exactly the confusion I referred to in #27.

This ties in with my complaint about casually accepting the nomenclature “global warming”. It’s too slippery and facilitates this confusion.

===

Something like:

Q Are humans responsible for global warming?

A Yes, almost certainly.

Q All of it?

A Probably, pretty much all of it or nearly all.

Q Was there global warming at the Paleocene/Eocene Boundary?

A Yes, it was quite spectacular.

Q Were humans responsible for that?

A No, of course not.

Q But you just said that humans are probably responsible for all of global warming! You people need to get your act together before you bother me any more with big policy recommendations.

==

Or, as in:

Nothing is better than eternity in heaven.

A grilled cheese sandwich is better than nothing.

Therefore a grilled cheese sandwich is better than eternity in heaven.

==

The reason to avoid the expression “global warming” in conversation with the public is because of the danger that its meaning will shift in the middle of the conversation. Even “climate change” is now being subjected to the same treatment (see #31).

Nomenclature is important. Contextual naming, perfectly adequate for well-intentioned conversation between well-informed people, is a powerful weapon in the hands of obfuscators.

Maybe we need another "sensitivity" stat (if it doesn't exist now) that would be more anthropocentric. It would give some idea (at least a guesstimate) of how much GHGs are released by nature due to the warming caused by human emissions, and estimate the total warming from those human emissions. Say we emit 1 unit, which causes nature to emit 3 units, and those 4 units together cause A amount of warming; then the sensitivity would be A for every 1 unit of human emissions. But it's more complicated, because the nature units per human unit would get greater & greater as we warmed more & more. And some of the human units (along with nature's) would stay in the atmosphere a very long time… continuing to warm, causing more & more releases of nature's units, causing more warming….
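[For what it's worth, the amplification arithmetic in the comment above can be framed as a geometric series: if each pulse of warming-driven emissions triggers r further natural units (with r < 1 so the feedback converges), the total airborne amount per human unit is 1 + r + r² + … = 1/(1−r). A toy sketch – the numbers and the geometric framing are illustrative, not from any study:]

```python
def total_emissions_per_human_unit(r, rounds=200):
    """Sum the feedback series 1 + r + r^2 + ...: each pulse of
    emissions triggers r further warming-driven natural units."""
    total, pulse = 0.0, 1.0
    for _ in range(rounds):
        total += pulse
        pulse *= r
    return total

# r = 0.75 reproduces the comment's example: 1 human unit ends up as
# 4 units in total (1 human + 3 natural), so this "anthropocentric
# sensitivity" would be 4x the no-feedback value per unit emitted.
print(total_emissions_per_human_unit(0.75))  # ~4.0
```

[As the comment notes, a real carbon-cycle feedback would not have a constant r; if r grows with temperature, the multiplier grows too, and the series can even diverge as r approaches 1.]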

“Thus, to sum up: Even under the very likely scenario that we exceed 400 ppm CO2 concentrations in the very near future, it seems likely that temperatures could be limited to below 2°C with a 4:1 chance, if emissions are reduced fast enough to peak at 475 ppm CO2 equivalent, before sliding back to 400 ppm CO2-equivalent.”

Has someone laid out a scenario for how we could peak at 475 ppm? This is primarily a social problem in the sense that we have to figure out how to deploy our technological fixes within our existing social frameworks. Is it possible to do without nuclear? Do new technologies have to come online to make it possible, and if so how soon? To me, this is the scary part: can we really get our act together and put all the pieces in place in the required time frame?

A question please. As I understand it, carbon emissions directly correlate with world GNP. At some point, climate change will start driving world GNP down as flooding, drought, storms and crop-loss cause economic disruption. If global warming was responsible in some part for Hurricane Katrina and/or Rita, can one already credit some carbon emissions reduction to global warming? I imagine this would be hard to model accurately, but at what point does climate change-caused economic disruption start to force global carbon emissions down?

Re 33, Michael, which bit of your ‘There is no one propagating a “myth” that *all* climate change is of human origin’ did I mistake or misconstrue?

Show me one of last week's UK scare stories that said it was only talking about the temperature rise since 1950, and that everyone agrees the rise from 1850 to 1940 and the fall from 1940 to 1950 were not caused by human greenhouse gas emissions.

You say, ‘Even “climate change” is now being subjected to the same treatment (see #31)’. What treatment? Where did I say “climate change”?

Gavin, you say ‘and secondly, global sea level was not a meter higher 4000 years ago’. I note you do not include the IPCC “likely”. Is Wikipedia now the ultimate authority on sea levels as well? What happened to our Isle of Ely? Why has our Wash shrunk? How come Troy and Ephesus are now so far from the sea? The Egyptian analogy was not mine but, since someone else raised it and you mentioned sea levels, how did they build their canal 4000 years before we did and which was in use till Roman times, if sea levels were not significantly higher?

Re 36:”Nomenclature is important”
Thomas, your choice of words is ironic, since if you plot concrete production (and, to be fair, asphalt, McDonalds etc.) you get a super Hockey Stick. The vast majority of concrete (and many other things) ever made was made after 1950. Not only does concrete production cause massive CO2 emissions, it also makes urban heat islands.

#38: David H – Just quickly about 'the Wash', I'm assuming you're referring to the area of the UK near Peterborough. In this area the land has actually been sinking as a delayed response to the ice cap's retreat from northern England. This is part of 'glacial isostatic adjustment' and is a well-established mechanism by which sea-level changes in particular localities differ from *global* trends. Similarly, you will find areas of the globe which have been pushed up because of the collision of two continental plates. Perhaps this is happening in the Middle East, being on the edge of the Indian plate, which is pushing north?

#22 (3): As I understand it the vast majority of models used in the IPCC TAR didn’t include an interactive carbon cycle, and the Cox et al (2000) landmark paper on the positive feedback from plants/soil carbon would only have just been published in time for being included in the drafting process of the IPCC TAR [which is now ongoing for AR4]. As far as I can tell the “default value” for this in the TAR would therefore be 0 [In fact there appears to be a whole section on considering uncoupled land carbon models in which none of them see the land becoming a net source for CO2 emissions, in contrast to the Cox et al (2000) result].

However, I also thought that the land only became a net source of CO2 when global temperatures had risen >2 degrees. We therefore reach the limits of the concept of ‘climate sensitivity’ as it is a linear concept and the climate system is non-linear.

To give another example, once the polar ice caps have melted there will be no more contribution to the climate sensitivity from the ice-albedo feedback. So, assuming all other things are equal [which they won't be!], climate sensitivity would decrease once the ice caps have melted. In fact some models have higher climate sensitivities because they have more sea-ice in their 'control' state [and vice-versa].

I imagine that the publication of AR4 would be a good time to summarise the “state of play” in climate science

I am new to this site, but wonder if there has been any reference to a paper from the Royal Swedish Academy of Sciences (2005) that will never be mentioned by the IPCC, on the basis that any scepticism must have been paid for by Exxon.

The author concludes that the records would indicate a cooling climate in the Arctic, and that while this does not prove that there is no global warming, “the observations available give no proof of such a warming”.

Re #1
There is massive inertia in the global commitment to continued burning of hydrocarbons. Why is simple – many decades of investment in existing plant and equipment. The only available alternative that permits life style as usual, the preferred choice of the vast majority of people, would be large-scale increases in nuclear generation. But new plants require many years to enter service, design of a safe new generation is not complete, construction has yet to be started and political barriers still stand. A shift to biofuels, wind and solar is theoretically possible, emphasis on theoretical.

Meanwhile the large scale burning of coal for electrical generation in China, India and the USA, to list the main current and future culprits, continues to increase steadily, without any practical large-scale CO2 sequestration method having been demonstrated. Again, theoretically available sequestration methods would in any case require huge construction programs that would take decades. The cost is estimated to be substantial. (Nowhere near the long term costs from global warming, but so what, up front costs are still a barrier.)

It is painfully obvious, except in the minds of people disconnected from social, fiscal and industrial reality, that the likelihood of stabilizing atmospheric CO2 levels before they pass through 450 ppm on their way up is essentially zero.

Positive feedback loops are already in evidence and can only be expected to intensify.

Hence James Lovelock’s pessimistic outlook. I wish I could find reason to suppose he is wrong.

Re #41: Tim, I haven’t been able to find a way to read that article as a non-subscriber to the journal so I can’t comment on its content in detail although it does sound to me from your brief description that its conclusions in regards to arctic temperatures are in contradiction to many, many other studies.

Do you have any evidence for your claim that the IPCC does not mention any skeptical studies because they presume them to be funded by Exxon? Perhaps what you really object to is that the vast preponderance of the scientific evidence points in a way that you don't like, so you would prefer the IPCC to ignore all of that evidence and focus on the few studies that point in the direction you like?

By the way, does your “conspiracy theory” in regards to the IPCC extend to the U.S. National Academy of Sciences and to the councils of the American Meteorological Society and the American Geophysical Union, and the editors of Science and Nature…and even major energy corporations such as BP and Shell or major auto manufacturers such as Ford? All of these have endorsed the view that anthropogenic climate change is a serious concern.

Re: #38 “What happened to our Isle of Ely?”
Having lived many years not 10 miles from Ely, it’s obvious to anyone driving south along the A10 that there are three levels – the highest, the water in the River Ouse to the right, next the level of the road, and below, to the left, the drained area of the black fen. Without knowing anything about rising/falling land/sea levels, I think the answer to your question is: “They drained the Fens”

P.S. Stunning views of the ancient water courses superimposed on present day fields can be seen on Google Earth…

Bjorn Lomborg went through some of the cost benefit analysis for fixing global warming in the skeptical environmentalist. I can’t say whether it was accurate or not, but the gist of what he said was:

There will be a significant price to pay for global warming (all of the costs assigned a monetary value and added together, e.g. building flood defences, effects on agriculture, heat deaths, etc., though I don't know how you calculate the cost of killing other organisms).

However,

The drastic measures needed to "fix" global warming will probably cost more.

He presented an ideal scenario, from a study he quoted, which costs as little as possible. This involved small decreases in emissions starting now, with larger ones starting in about 50 years, when renewable energy becomes economically viable.

You would need an economist to comment on the reliability of the studies he quotes.

I will pass on the Wash, but Ely is presently some 36 metres above sea level at its highest point and its Cathedral is on ground about 24, so even allowing for isostatic rebound it was probably high ground 4000 years ago. The Fens are now drained and pumped like many low lying areas but it is hard to believe that Ely’s name and strategic importance did not derive from its earlier island status, just as history tells us that Troy once controlled the Dardanelles. I know it is inconvenient for this debate but the literature has conclusions of warmer times and higher seas that were reached by researchers of diverse disciplines with no interest in AGW before the theory prescribed what is now acceptable.

I will not suggest it's science, but I find it hard to think of how and why, 5000 years ago, anyone would be building the Ring o' Brodgar on Orkney if it was not a much more pleasant place to be than it now is. If positive feedback is what drives us out of ice ages, an overshoot to higher temperatures and sea levels would not seem unreasonable unless the system is in some way well damped.

From what I remember of reading the Avoiding Dangerous Climate Change papers a year ago, the total CO2 emissions matter more than the rate. Which is basically what all these discussions of peaking scenarios are saying: without cuts drastic enough that emissions fall below the removal rate, there will not be a peak.

So how likely is it that total emissions will be kept at a level not much higher than has occurred so far? Not very. There are still lots of fossil fuels in the ground: oil is reaching its halfway point, gas is less than half depleted, and there is still lots more coal, not to mention tar sands, oil shales, methane hydrates and peat. As the amount of fossil fuels left in the ground falls below 50%, the price will rise and the incentives for extraction increase. Therefore we can be pretty sure that all the technically recoverable fossil fuel reserves will be extracted and used.

It tends to be the harder to extract reserves that are left, these cost more in capital costs to develop, so there is a strong desire to maximise short term revenues. Recent oil discoveries in deep water have a much shorter life than those on land a few decades ago for this reason.

High capital costs are the reason that the demand side will be slow to react. Coal fired power stations have a 30+ year life so to build new ones before they have paid for themselves is enormously expensive and modifications for sequestration are reportedly very difficult in existing power stations. The hydrogen economy is in my opinion not technically the best solution, but even if it were, the gigantic infrastructure changes required would take decades.

Putting this all together, I would expect future emissions to be at least twice past emissions, and to occur in the near future rather than being spread out over centuries. Economic incentives for extraction and use are very high, and great political will is needed to achieve reductions in the carbon intensity of GDP of more than about 2% a year. To stand any realistic chance of limiting CO2 to 475 ppm we need to reduce the carbon intensity of GDP by about 7% a year.
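[The "2% vs 7% a year" point is just compound arithmetic on the Kaya-style identity: emissions = GDP × carbon intensity of GDP. A minimal sketch – the 3%/y GDP growth rate is an assumption for illustration, not a figure from the comment:]

```python
def emissions_after(years, gdp_growth=0.03, intensity_decline=0.07, e0=1.0):
    """Emissions = GDP x carbon intensity of GDP; both compound annually."""
    e = e0
    for _ in range(years):
        e *= (1 + gdp_growth) * (1 - intensity_decline)
    return e

# With intensity falling only 2%/y against 3%/y GDP growth, emissions
# still creep UP (~1%/y); at 7%/y they shrink ~4%/y net, which works
# out to roughly an 88% cut after 50 years.
print(emissions_after(50, intensity_decline=0.02))
print(emissions_after(50, intensity_decline=0.07))
```

[The point of the sketch: the intensity decline has to beat GDP growth by the whole desired net reduction rate, which is why a jump from 2% to 7% a year is such a large ask.]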

Recent Global Warming: An Artifact of a Too-Short Temperature Record?

INTRODUCTION
Although the magnitude of the greenhouse effect has been of major concern during the last decades, the reality of the processes involved has hardly been discussed. It has become common to base planning on predictions that indicate a major warming. Assumptions concerning the future have been repeated on numerous occasions and are reflected in a number of statements in the recent report, "Impacts of a Warming Arctic" (1). The warming of the Arctic has become an important issue, because the prediction is that changes will be strongest and first noticeable in the Arctic, and because of the undesirable environmental impact that might accompany the elevated atmospheric CO2 (2).

The following discussion focuses on temperature observations from meteorological stations in the Arctic and surrounding areas. Is the temperature really rising at an alarming rate? Has the well documented and rapidly increasing concentration of greenhouse gases in the atmosphere really affected the temperatures at Arctic stations, where, according to the models, this effect will first be observed?

DATA
There are several data sets showing the temperature in the Arctic and its surrounding areas. One set of data, which has been made available on the Internet during the last few years, is the Nordklim database (3). This database includes temperature observations made between 1890 (or from the time when observations were initiated) and 1999, and has been obtained from meteorological stations in the Nordic countries (3). Information about changes in the Arctic climate is also reported in several papers (4, 5, 6), and data for many stations are available on the Internet (7).

For the purpose of this discussion, mean annual air temperatures have been used. Svalbard Lufthavn, located on a group of islands at 78°N, is selected as representing climate in the Arctic. The first few years of observations from this station may have been affected by several shifts in its position, but this factor is not believed to have affected the record for the temperature maximum that was reached during the late 1930s (Fig. 1). The Svalbard temperature is compared with the annual mean temperature at Arctic stations. In addition, the temperatures of the Arctic are compared with data from Stockholm, because observations have been carried out there for 250 y, thus making it possible to place the short Arctic record in a longer perspective. The long record has been corrected for the urban effect (8, 9). Corrections for the urban effect are quite important, because it does influence climate even when the population is relatively small (10).

RESULTS
The Svalbard mean annual temperature increased rapidly from the 1910s to the late 1930s. The temperature thereafter became lower, and a minimum was reached around 1970. Svalbard thereafter became warmer, but the mean temperature in the late 1990s was still slightly cooler than it was in the late 1930s. Svalbard is, of course, only one point in the vast Arctic area. However, the observed warming during the 1930s is supported by data from several stations along the Arctic coasts and on islands in the Arctic, e.g. Nordklim data from Bjornoya and Jan Mayen in the North Atlantic, Vardo and Tromso in northern Norway, Sodankylä and Karasjoki in northern Finland, and Stykkisholmur in Iceland (3). There are also data from other reports, e.g. Godthaab, Jakobshavn, and Egedesminde in Greenland, Ostrov Dikson on the north coast of Siberia, Salehard in inland Siberia, and Nome in western Alaska (7). All these stations indicate the same pattern of changes in annual mean temperature: a warm 1930s, a cooling until around 1970, and thereafter a warming, although the temperature remains slightly below the level of the late 1930s. Although details of the temperature fluctuations vary over time between the stations, the pattern of these fluctuations remains similar. Many stations with records starting later than the 1930s also indicate cooling, e.g. Vize in the Arctic Sea north of the Siberian coast, and Frobisher Bay and Clyde on Baffin Island (7).

In Stockholm, where temperature observations have been made since 1756, it is apparent that the temperature has been affected by the growing city. This urban effect has been studied in detail, and a compensation has been made for this bias in the data used here (8, 9). The Stockholm temperature also increased between the beginning of the century and the 1930s, then reached a minimum around 1970 and rose again, a pattern similar to the one observed in the Arctic. The 250-y long Stockholm record shows that the fluctuations of the 1900s are not unique; changes of the same magnitude as in the 1900s occurred between 1770 and 1800, and distinct but smaller fluctuations occurred around 1825 (8).

DISCUSSION
How can a distinct warming so often be reported for the Arctic areas when the temperature observations indicate variations but no consistent trend? If it were clearly stated that the warming is predicted, not yet indicated by empirical data, these claims could be accepted. The prediction is based on theory, but the empirical data used to support the theory are misleading. For example, to select a short period between a minimum and a high point, possibly a maximum, call it a trend and use it in support of a theory, is not acceptable. Obviously, neither the difference between two maxima nor the regression calculated over an arbitrarily selected period is an acceptable measure of trend either. However, either of these two methods is better than using nothing more than a temperature increase to indicate a trend.

A trend showing considerable warming in northern Siberia and also in parts of Alaska has been used as an indication of a drastic warming in the Arctic. The temperature increase in Siberia is based on a record restricted to 40 y (11), a period following a cold period in the 1970s. The meteorological station Ostrov Dikson, located in this area of warming climate, has reported temperatures since 1917. Regression analysis of the 65-y annual temperature record after the maximum in the late 1930s indicates a temperature decrease of 1.5°C between 1935 and 1999. The reported increase is a result of the selected period (1960-2000). In addition, Alaska is reported to have experienced a rapid warming. One of the stations often referred to is Point Barrow on the north coast of Alaska, for which a recent study shows that even the small settlement (4600 persons) creates an urban effect (10). This was measured to be 2.2°C during the winter of 2001-2002 (10).

There is no definite period of time that should be used to study temperature trends. However, it is obviously wrong to start a calculation of a trend at a minimum and finish at a high point (possibly a maximum). The result will be misleading; from a minimum to a maximum there will always be an increase.

Ostrov Dikson has experienced a very distinct decrease in temperature since the late 1930s. This decrease is unusually large, but a trend towards a cooling seems to be typical for Arctic stations, e.g. Iceland (3), 1935-1999, -0.7°C; Frobisher Bay (7), 1942-1999, -0.8°C. For other stations, e.g. Vardo (3), the cooling was small, 0.1°C. The data do not indicate a warming of the Arctic. The cooling after the maximum in the 1930s occurred during the time when the concentration of carbon dioxide in the atmosphere had increased markedly; thus, an increase in temperature could be expected.
The increase in temperature during the early 1900s is considered to have been caused by increased solar irradiation (12). The temperature increase at several Arctic stations was greater and more rapid during this earlier period, when carbon dioxide, according to models, did not contribute to the temperature. During the 50 y in which the atmospheric concentration of CO2 has increased considerably, the temperature has decreased. The Arctic temperature data do not support the models predicting that there will be a critical future warming of the climate because of an increased concentration of CO2 in the atmosphere.

At a few locations in Europe, temperature has been recorded for considerably longer than it has in the Arctic. The Stockholm temperature record, which covers the last 250 y, has been discussed by Moberg and Bergstrom (8) and by Jones et al. (9). The patterns of this record are similar to those from the years of overlapping data (1930s to 1999) for the Arctic stations (Fig. 1); therefore, it is probable that the Stockholm record can be used for an estimate of variations in the Arctic climate. If there is a similarity between the long Stockholm record, other European records (13), and the Arctic record, as the overlapping period of records indicates, it is likely that the recorded fluctuations in the Arctic temperature are short fragments of a series of fluctuations in the climate.

The so-called "global temperature" (12), frequently taken as a proof of a human influence on the climate, is based on a short record beginning during the cold second half of the 1800s and going to the warm present. Even this global temperature record begins during a very cold period and ends during a period that some scientists claim to be the warmest in a very long time. Considering the 250-y record from Stockholm, it becomes obvious that these last 100 y of warming climate comprise only one section of a fluctuating climate, or, in other words, a small fraction of a series of several such fluctuations in the temperature. The so-called "global temperature record" covers too short a period and therefore does not yield the empirical support for an anthropogenic effect on the climate which is so often claimed.

CONCLUSIONS
The frequently mentioned rapid increase of the temperature in the Arctic is based on a record beginning at a minimum in the temperature around the 1970s and ending during a period of relatively warm climate. If the time series had begun a few decades earlier, the records would indicate a cooling climate in the Arctic. This conclusion does not prove that there is no global warming, but that the observations available give no proof of such a warming.
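[The quoted paper's central complaint – that a trend fitted from a minimum to a maximum is misleading – cuts both ways: on any fluctuating series, the fitted slope depends strongly on where the window starts and ends, so a 1938-to-present cooling window is no more informative on its own than a post-1970 warming window. A toy illustration with purely synthetic data; the oscillation parameters are invented for the demonstration and represent no real station:]

```python
import math

def trend_per_decade(series, start, end):
    """Least-squares slope of (year, temp) pairs within [start, end],
    expressed in degrees per decade."""
    pts = [(y, t) for y, t in series if start <= y <= end]
    n = len(pts)
    my = sum(y for y, _ in pts) / n
    mt = sum(t for _, t in pts) / n
    cov = sum((y - my) * (t - mt) for y, t in pts)
    var = sum((y - my) ** 2 for y, _ in pts)
    return 10.0 * cov / var

# Synthetic series: a 64-year oscillation peaking in 1938 and
# bottoming out around 1970 -- no long-term trend at all.
series = [(y, 0.5 * math.sin(2 * math.pi * (y - 1922) / 64.0))
          for y in range(1900, 2000)]

print(trend_per_decade(series, 1970, 1999))  # strongly positive
print(trend_per_decade(series, 1938, 1970))  # strongly negative
print(trend_per_decade(series, 1900, 1999))  # near zero
```

[Both the warming and the cooling "trends" here are artifacts of window choice on a series with none; distinguishing a real trend from a fluctuation requires the full record plus physical attribution, which is exactly the argument made elsewhere in this thread.]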

Two major outlet glaciers in East Greenland … remained steady during the 1990s despite progressive and substantial thinning, but have abruptly increased within the last two years, more than doubling ice flux to the ocean. … comparable 1998 speed-up of Jakobshavn Isbræ in West Greenland…. Now that two further Greenland outlets have exhibited similar behavior, a common process seems likely. …. a step-change in dynamics not included in current models. We should expect further Greenland outlet glaciers to follow suit.