
Posted on 27 May 2013 by dkaroly

Today we released research which reduces the range of uncertainty in future global warming. It does not alter the fact we will never be certain about how, exactly, the climate will change.

We always have to make decisions when there are uncertainties about the future: whether to take an umbrella when we go outside, how much to spend on insurance. International action on climate change is just one more decision that has to be made in an environment of uncertainty.

The most recent assessment of climate change made by the Intergovernmental Panel on Climate Change in 2007 looked at what is known with high confidence about climate change, as well as uncertainties. It included projections of future global warming to the end of this century based on simulations from a group of complex climate models.

These models included a range of uncertainties, coming from natural variability of the climate and the representation of important processes in the models. But the models did not consider uncertainty from interactions with the carbon cycle – the way carbon is absorbed and released by oceans, plant life and soil. In order to allow for these uncertainties, the likely range of temperature change was expanded.

Our recent study has re-visited these results and tested an approach to reduce the range of uncertainty for future global warming. We wanted to calibrate the key climate and carbon cycle parameters in a simple climate model using historical data as a basis for future projections. We used observations of atmospheric carbon dioxide concentrations for the last 50 years to constrain the representation of the carbon cycle in the model. We also took the more common approach of using global atmospheric and ocean temperature variations to constrain the response of the climate system.

This led to a narrower range of projected temperature changes for a given set of greenhouse gas emissions. As a consequence, we have higher confidence in the projections. In other words, using both climate and carbon dioxide observations reduces the uncertainties in projections of global warming.

We found that uncertainties in the carbon cycle are the second-largest contributor to the overall range of uncertainty in future global warming. The main contributor is climate sensitivity, a measure of how the climate responds to increases in greenhouse-gas concentrations.

A recent study by Alexander Otto of Oxford University and colleagues, published in the journal Nature Geoscience, also considered future global warming in the context of observations of global mean temperature change over the last decade.

Unlike that study, our results do not show lower climate sensitivity or lower mean projected global warming. Our study uses the same observed global atmospheric and ocean temperature data. But we also used observed carbon dioxide data and represented important additional processes in our simplified climate model, particularly the carbon cycle on the land and in the ocean and uncertainties in the climate forcing due to aerosols.

In our study, the reductions in uncertainty came from using the observations, the relationships between them and how these affect the parameters in the simple climate model. We found 63% of the uncertainty in projected warming was due to single sources, such as climate sensitivity, the carbon cycle components and the cooling effect of aerosols, while 37% of uncertainty came from the combination of these sources.
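The split between single sources of uncertainty and their combination can be illustrated with a toy Monte Carlo variance decomposition. Everything below is an assumption for illustration: the parameter distributions, the toy warming formula and the resulting percentages are invented, and are not the model or values used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Hypothetical parameter uncertainties (illustrative only, not the paper's values)
S = rng.normal(3.0, 0.7, N)    # climate sensitivity (degC per CO2 doubling)
C = rng.normal(1.0, 0.15, N)   # carbon-cycle amplification factor
A = rng.normal(-0.9, 0.3, N)   # aerosol forcing (W/m^2, cooling)

# Toy projection: warming grows with sensitivity and carbon feedback and is
# offset by aerosol cooling; the multiplicative form creates interactions.
Y = S * C * (1 + 0.2 * A)

def first_order_variance(X, Y, bins=50):
    """Fraction of Var(Y) explained by X alone: Var(E[Y | X]) / Var(Y)."""
    edges = np.quantile(X, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, X) - 1, 0, bins - 1)
    cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    return np.average((cond_means - Y.mean()) ** 2, weights=counts) / Y.var()

single = sum(first_order_variance(X, Y) for X in (S, C, A))
print(f"single-source share: {single:.0%}, interactions: {1 - single:.0%}")
```

The same idea, comparing the variance explained by each parameter alone against the total, underlies more formal Sobol sensitivity analysis.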

Once we reduced the uncertainty we found there is an increased risk of exceeding a lower temperature change threshold, but a reduced chance of exceeding a high threshold. That is, for business-as-usual emissions of greenhouse gases, exceeding 6°C global warming by 2100 is now unlikely, while exceeding 2°C is virtually certain.

These results reconfirm the need for urgent and substantial reductions in greenhouse gas emissions if the world is to avoid exceeding the global warming target of 2°C. Keeping warming below 2°C is necessary to minimise dangerous climate change.

It is unlikely that uncertainties in projected warming will be reduced substantially. Indeed, if you allow for population growth, levels of economic activity, growth in demand for energy and the means of producing that energy, overall uncertainty increases. We just have to accept that we will have to manage the risks of global warming with the knowledge we have. We may not know exactly how much and by when average temperatures will change, but we know they will. This is an experiment we probably don’t want to make with the only planet we have to live on.

Roger Bodman received funding from the Australian Research Council while completing his PhD.

David Karoly receives funding from the Australian Research Council and the Australian Antarctic Division. He is a Chief Investigator in the ARC Centre of Excellence for Climate System Science, a member of the Climate Change Authority and a member of the Science Advisory Panel to the Climate Commission.

Comments

Sadly "The Australian" newspaper today (27 May) was up to its usual trick of misrepresenting the science by stating in the opening sentence "The earth's temperature is unlikely to increase by more than 2C by 2100.." whereas the authors are clearly saying that "exceeding 2C is virtually certain." This is typical of the Murdoch press, as they continually put the best face on the climate science even if it means publishing misinformation.

tonyabalone @1, the Australian is schizophrenic on this study. In print it headlined the article "Scientists now expect 2C rise", which is doubly misleading: first because it lacks the words "at least", and second because it suggests this is news, i.e., that scientists have not been expecting at least a 2 C rise since about 1990.

They then state (as you point out):

"The earth's temperature is unlikely to increase by more than 2C by 2100 - significantly less than earlier predictions - assuming carbon dioxide emissions are substantially reduced."

This contrasts sharply with the abstract of the study which states:

"This results in an increased probability of exceeding a 2 °C global–mean temperature increase by 2100 while reducing the probability of surpassing a 6 °C threshold for non-mitigation scenarios such as the Special Report on Emissions Scenarios A1B and A1FI scenarios, as compared with projections from the Fourth Assessment Report of the Intergovernmental Panel on Climate Change."

(My emphasis)

The abstract does not discuss scenarios with substantially reduced emissions at all.

The in-print article goes on to say:

"The results are based on a scenario in which action is taken to mitigate emissions, though no reduction target is specified. It says under a 'business as usual' approach, temperatures are likely to rise by more than 2 C."

Well, I guess 4 C (the expected rise under BAU) is more than 2 C, so it is not exactly a lie - but hardly informative.

"Doubt will remain on climate
BY: MITCHELL NADIN, From: The Australian, May 27, 2013 3:00AM

SCIENTISTS have narrowed the range of possible global temperature rises due to greenhouse gas emissions, but say uncertainty will always remain because of the complexity of factors in climate change.

Research conducted at the Melbourne University, and published today in Nature, found previous estimates of a 6C rise in temperatures by 2100 were "unlikely", but that exceeding a 2C change was "very likely" given business-as-usual emissions."

Same author, same study, different title and a balanced if much truncated text. Oddly, that article is listed as being posted "1 hour ago", i.e., 12:30 PM AEST, not the 3:00 AM listed under the byline. My guess is that complaints were made about the transparent and misleading bias of the in-print article, resulting in it being pulled online and a better version substituted. Without, of course, any admission of error.

Tom Curtis @2, yes I suspect that you are correct that the paper has been bombarded with complaints. I fired off an email to the editor complaining about the misinformation in the article. It won't be published of course. Limited News is very careful about their image and they don't take kindly to criticism, even when they know it is justified.

tonyabalone @3, agreed about The Australian's publication of letters. I have never had one of my letters pointing out (even subtly) their constant misrepresentation of the science actually published. I'm now trying a new tactic. I just sent off my email complaining about the article, but copied it to David Karoly and mediawatch (mediawatch@your.abc.net.au). Perhaps the knowledge that they cannot simply disappear the email will encourage them to publish. And if not, perhaps mediawatch will take an interest.

A recent study by Alexander Otto of Oxford University and colleagues, published in the journal Nature Geoscience, also considered future global warming in the context of observations of global mean temperature change over the last decade.

I'm puzzled by this sentence, since it seems to be based on the BBC's misrepresentation of the Otto paper, rather than what the Otto paper actually says.

The Otto paper is not based on temperature changes over the last decade - in fact Otto et al totally ignore changes over the last decade (except in OHC). Instead they take the average temperature over the last decade (and also the previous 3), and compare it to the average of a baseline period from 1860-1880. And the last decade does not stand out in any way - they find similar results for the 00's, the 90's, and the 80's. This part of the calculation is trivially reproducible.

Otto et al is not without its limitations:

HadCRUT4 only covers 5/6 of the planet even in recent decades. My best estimate is that this causes them to underestimate the temperature change by 5%, although the global temperature distribution for the 1860s is based on totally inadequate information, so I'm guessing this number should be closer to 10%.

They note in the text that they are calculating effective climate sensitivity, which provides a lower bound for equilibrium sensitivity. They reference Armour et al, which finds that the effective sensitivity is about 15% lower than the equilibrium sensitivity.

If correct, these two factors alone would bring the Otto ECS to 2.4C, which is much more plausible. However, the EffCS-ECS correction is currently based on only one paper (and my coverage work is not published at all). Until this is better constrained I wish they would be a bit clearer about the distinction between EffCS and ECS.
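As a back-of-envelope check of that 2.4C figure, the two multiplicative corrections can be stacked on Otto et al.'s roughly 2.0 °C effective-sensitivity best estimate. Note the 5% coverage figure is the commenter's own unpublished estimate, not a published result:

```python
# Back-of-envelope check of the two corrections discussed above.
otto_effcs = 2.0     # degC, Otto et al.'s approximate EffCS best estimate
coverage   = 1.05    # ~5% upward for HadCRUT4's incomplete spatial coverage
armour     = 1.15    # ~15% EffCS -> ECS correction from Armour et al.
corrected = otto_effcs * coverage * armour
print(f"corrected ECS ~ {corrected:.1f} degC")
```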

The method from the Otto paper is also totally dependent on the size of the forcings. They use forcings from this paper by Forster et al which looks very interesting indeed. I would like to try this data in my response function model.

Kevin - that makes no sense. How can they calculate the transient climate response (TCR) and equilibrium climate sensitivity (ECS) without feeding in the change in (near-surface air) temperature for each period? Change in (surface) temperature is an essential input into the equation.

In figure 1 of Otto (2013) they even give a break-down of TCR and ECS for each decadal period - from the 1970's to 2000's.

Professor Myles Allen (one of the authors of the Otto study) has written a strange article in that rag, the Daily Mail, advocating a new departure to seriously research carbon capture and sequestration (CCS). Naturally, the Mail distorts his views in its headline (and you could almost say it serves him right! :))

Shoyemore - I understand one of the Skeptical Science authors has volunteered to write about the Myles Allen piece. I'm not sure how he plans to overcome the laws of thermodynamics with this carbon capture scheme. Has the hallmarks of perpetual motion machine crankery about it.

I concur with your doubts. David Keith looks to me like a guy who cares only about mythical limitless growth rather than sustainability, not to mention his perpetual-motion optimism about the laws of thermodynamics. I wonder why the interviewer on this YT video didn't ask about that; I would have pressed the guy and not let him evade it.

I think the climate change due to the atmosphere being treated as a CO2 dumping ground is part of an even larger problem facing an overpopulated earth: complete disregard of sustainability principles by the ravaging homo "sapiens". This larger context is not to be ignored, because even if we solve/stabilise the CO2 problem soon, we may end up exposing another, equally alarming environmental degradation problem.

Meanwhile it's worth remembering this article + comments, which appears to be the latest SkS stance on this topic.

These are compared to the change between the two periods in either forcing or (forcing-uptake) for TCR or EffCS. The trends within those periods do not appear anywhere in the calculation. Uptake is assumed to be 0 for the early period - i.e. the ocean is assumed to be equilibrated.
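A minimal sketch of that energy-budget arithmetic, with illustrative round numbers rather than the paper's actual inputs:

```python
# Energy-budget estimates in the style Otto et al. describe: decadal means
# relative to an 1860-1880 baseline. All input numbers are illustrative
# round figures, not the paper's exact values.
F_2x = 3.44   # W/m^2, forcing from doubled CO2
dT   = 0.75   # degC, recent-decade mean minus baseline mean
dF   = 1.95   # W/m^2, change in forcing between the periods
dQ   = 0.65   # W/m^2, recent ocean heat uptake (baseline uptake taken as 0)

TCR   = F_2x * dT / dF           # transient climate response
EffCS = F_2x * dT / (dF - dQ)    # effective sensitivity, a lower bound on ECS
print(f"TCR ~ {TCR:.2f} degC, EffCS ~ {EffCS:.2f} degC")
```

With these round numbers the two ratios land near 1.3 °C and 2.0 °C, the same ballpark as the published estimates, which shows how directly the answer depends on the forcing and uptake inputs.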

There is a copy of Otto et al online here. It's worth reading what the paper actually says, because it bears little resemblance to what most secondary sources say the paper says.

Kev, my bad. And I did read the paper! Mind you, the first sentence in the paper is misleading.

It's not easy tracking down all the sources for their inputs into the calculations, but their choice of climate forcing seems to be the clincher. The use of the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) forcing moves the equilibrium climate sensitivity back up to 2.4°C - closer to the central estimate using other methods.

If the models are wrong the subsequent analysis will be of no value. I would like to have seen them use a starting date for their simulation of 1850 so we can compare what actually happened with what their simulation says should have happened. The real world should be a combination of greenhouse gas causes and other causes. If the simulation cannot accurately predict what has actually happened between 1850 and now then its predictions for beyond now are of no value.

20c3m experiments are forced with historical green house gas forcing as well as the time varying ozone, sulfate, volcanic aerosols, and solar output for the 1900-2000 period. This analysis uses 20th century experiments from 10 models. The B1, A1B and A2 experiments are forced with a predicted green house gas forcing scenario for the 2000-2100 period. (www)

so we have the 1900-2000 period to judge the models' performance. IMHO it is hubris to think that the world's leading climate modellers didn't think to do this, and hubris squared not to bother checking first before posting.

Dikran Marsupial, 1850 to 2012 is much more challenging because of the changes in direction. To be of any value such simulations have to be appropriately calibrated and contain the appropriate balance of human-caused and natural factors. A simple model has been used, but how has it been calibrated against reality? I would not automatically assume that things are done correctly. Qualitatively, the projections do not look much like what has been happening over the past century and a half.

William Haas, the construction of the models is described in great detail in the appropriate places, which you can easily find by following the provided links and clicking around a bit. Your extreme skepticism is unconvincing since it is based on ignorance due purely to your failure to bother reading.

William, and yet the modellers have done what you suggest, as the links provided show; funnily enough they do just as you propose, as reading the papers would show you. "The appropriate balance" being of course the forcing set used, which each model run explicitly shows you. "I would not automatically assume things are done correctly" - well good, but have you bothered to read how in fact it was done? The fact that you get similar results from completely different models, working in independent groups worldwide, should at least suggest that maybe they are on track. Are you looking for the truth or looking for a rationalisation for inaction? If the former, then we are here to help - quote science papers in discussion though. If the latter, then I would suspect ideological problems with the proposed solutions. Reality v. ideology. Hmm.

wild monkeys #21: Yes, your interpretation is correct, there is about a 1 in 20 chance of global temperatures remaining below 2.3C above pre-industrial in 2100, and a 1 in 20 chance of global warming exceeding 9C.

William #12, #18: The simulations with this simple model start before 1850 and agree well with the observations for the period 1850 to 2010 when forced by the observed and estimated emissions. We are putting together a figure to show this. That result is not surprising, as the model parameters are constrained using the carbon dioxide, global mean temperature, ocean heat content and land-ocean temperature contrast observations from the 20th century.

tcflood #16: This simple climate model is essentially a four-box energy balance anomaly model of the NH and SH land and ocean regions separately. It was developed originally by Tom Wigley and is called MAGICC (Model for the Assessment of Greenhouse gas Induced Climate Change). It has input parameters which represent the climate sensitivity and the exchanges of heat between the land and the ocean and between the hemispheres, which are estimated from the more complex models or from observations. This type of model has been used in all IPCC assessments since 1990. Information and links to download an older version of MAGICC are available at www.cgd.ucar.edu/cas/wigley/magicc/ or the latest version at www.magicc.org/ .

The advantage over the more complex models is that it can provide probabilistic estimates of future temperature change given emission scenarios and probability estimates of the model parameter values. More information on the model version we have used and our approach is given in a recent paper at www.bom.gov.au/amoj/docs/2012/bodman.pdf
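For readers unfamiliar with this class of model, here is a drastically simplified one-box sketch in the same spirit: a single heat reservoir instead of MAGICC's four boxes, with an invented forcing ramp. All parameter values below are assumptions for illustration, not MAGICC's calibrated values.

```python
import numpy as np

# One-box energy-balance anomaly model: cap * dT/dt = F(t) - lam * T
F_2x = 3.7            # W/m^2 forcing per CO2 doubling
ECS  = 3.0            # degC, assumed equilibrium climate sensitivity
lam  = F_2x / ECS     # W/m^2/degC, climate feedback parameter
cap  = 8.0            # W*yr/m^2/degC, effective (shallow-ocean) heat capacity

years = np.arange(1850, 2101)
forcing = np.where(years < 2000,
                   0.02 * (years - 1850),        # slow historical ramp (invented)
                   3.0 + 0.05 * (years - 2000))  # steeper future ramp (invented)

T = np.zeros(len(years))                         # temperature anomaly, degC
for i in range(1, len(years)):
    T[i] = T[i-1] + (forcing[i-1] - lam * T[i-1]) / cap  # Euler step, dt = 1 yr

print(f"simulated warming in 2100: {T[-1]:.1f} degC above 1850")
```

MAGICC itself adds separate land and ocean boxes in each hemisphere, heat exchange between them, an upwelling-diffusion ocean and a carbon cycle, which is what lets it emulate the behaviour of the complex models probabilistically.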


Response:

[JC] David Karoly emailed me a graph comparing his model output to observations, which I've added to the original post above.

William Haas wrote: "1850 to 2012 is much more challenging because of the changes in direction."

Specifically what "changes in direction" do you refer to that are in the period 1850-2012 but not in 1900-2000 and explain why they should be challenging (this should include a discussion of relevant forcings for those periods)?

dkaroly #22, In your answer to wild monkeys, if I understand confidence intervals correctly, at 95% confidence it would be a 2.5% chance of rising more than 9 degrees, and an equal chance of less than 2.3 degrees. That's 1 in 40 at either end. It would be 1 in 20 of rising more than 7.8 degrees, and 1 in 6 of more than 6.2 degrees. That's still pretty damn scary. Of course, we could luck out and get "only" 2 or 3 degrees of rise. I'm not investing in Great Plains cropland.

Estiban #25, You are correct, I made a mistake in my earlier response. There is about a 1 in 40 chance of global mean temperatures remaining below 2.3C above pre-industrial in 2100, and a 1 in 40 chance of global warming exceeding 9C.
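The endpoint arithmetic here is easy to check: by definition, the ends of a 95% interval each carry a 2.5% (1 in 40) tail. The sketch below fits a symmetric normal distribution to the 2.3-9C range purely for illustration; the real projection distribution is skewed, so the odds it gives for interior thresholds such as 6.2C and 7.8C will not match the figures quoted above.

```python
from statistics import NormalDist

# Fit a normal so that 2.3 degC and 9.0 degC are the 2.5th and 97.5th
# percentiles (a 95% interval). Illustrative only: the actual projection
# distribution is skewed, so interior-threshold odds differ.
lo, hi = 2.3, 9.0
z = NormalDist().inv_cdf(0.975)                  # ~1.96
mu, sigma = (lo + hi) / 2, (hi - lo) / (2 * z)
dist = NormalDist(mu, sigma)

for threshold in (6.2, 7.8, 9.0):
    p = 1 - dist.cdf(threshold)
    print(f"P(warming > {threshold} degC) ~ {p:.3f} (about 1 in {round(1 / p)})")
```

Only the endpoint odds (1 in 40 above 9C, 1 in 40 below 2.3C) are distribution-free; everything in between depends on the shape assumed.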

The article says that the last IPCC assessment on climate change was made back in 2007. That was more than 5 years ago. The authors apparently feel that there are inadequacies in the old models so they are offering their new simplified models. The article says, "we wanted to calibrate the key climate and carbon cycle parameters in a simple climate model using historical data ..." They talked about 50 years. I am saying that their simulation would be a lot more credible if they calibrated it over a 150 year period.


Moderator Response:

[JH] You have articulated your position more than once now. Please note that excessive repetition is prohibited by the SkS Comments Policy.

William Haas: Your most recent comment was a moderation complaint and was therefore deleted. Please lose the sarcastic tone as well. You should be able to answer the questions posed by Dikran Marsupial without repeating yourself.