Climate Change Models' Predictive Accuracy

I am an old man, dating from the last ice age, or at least from a time when we were all talking of a potential ice age coming. In my lifetime the global warming idea has risen in its place. Climate is a hard thing to predict.

The global warming movement is built on the computer models' ability to predict future climate change based on changes in certain atmospheric gases, commonly referred to as greenhouse gases. The question isn't whether the world is warmer, but rather whether the warming is caused by rising gases that we are creating. The credibility of our ability to predict accurately with these models is key.

Back in 2008 we were staring at rising CO2 and flat temperatures for a decade. Those who reject global warming were adamant that the disparity between constantly rising greenhouse gases and flatline temps showed the models were false. That charge was refuted in the 2008 Climate Report published by the National Climatic Data Center, part of NOAA. On page 24 they make the following observations:

We can place this apparent lack of warming in the context of natural climate fluctuations other than ENSO using twenty-first century simulations with the HadCM3 climate model (Gordon et al. 2000), which is typical of those used in the recent IPCC report (AR4; Solomon et al. 2007). Ensembles with different modifications to the physical parameters of the model (within known uncertainties) (Collins et al. 2006) are performed for several of the IPCC SRES emissions scenarios (Solomon et al. 2007). Ten of these simulations have a steady long-term rate of warming between 0.15° and 0.25°C decade⁻¹, close to the expected rate of 0.2°C decade⁻¹. ENSO-adjusted warming in the three surface temperature datasets over the last 2–25 yr continually lies within the 90% range of all similar-length ENSO-adjusted temperature changes in these simulations (Fig. 2.8b). Near-zero and even negative trends are common for intervals of a decade or less in the simulations, due to the model's internal climate variability. The simulations rule out (at the 95% level) zero trends for intervals of 15 yr or more, suggesting that an observed absence of warming of this duration is needed to create a discrepancy with the expected present-day warming rate.

The 10 model simulations (a total of 700 years of simulation) possess 17 nonoverlapping decades with trends in ENSO-adjusted global mean temperature within the uncertainty range of the observed 1999–2008 trend (−0.05° to 0.05°C decade⁻¹). Over most of the globe, local surface temperature trends for 1999–2008 are statistically consistent with those in the 17 simulated decades (Fig. 2.8c). Field significance (Livezey and Chen 1983) is assessed by comparing the total area of inconsistent grid boxes with the range of similar area values derived by testing the consistency of trends in each simulated decade with those in the remaining simulated decades. The 5.5% of the data area that is inconsistent in the observed case is close to the median of this range of area values, indicating the differences are not field significant. Inconsistent trends in the midlatitude Southern Hemisphere strongly resemble the surface temperature pattern of the negative phase of the SAM (Ciasto and Thompson 2008), which did indeed show a negative trend in the last decade.

These results show that climate models possess internal mechanisms of variability capable of reproducing the current slowdown in global temperature rise. Other factors, such as data biases and the effect of the solar cycle (Haigh 2003), may also have contributed, although these results show that it is not essential to invoke these explanations. The simulations also produce an average increase of 2.0°C in twenty-first century global temperature, demonstrating that recent observational trends are not sufficient to discount predictions of substantial climate change and its significant and widespread impacts. Given the likelihood that internal variability contributed to the slowing of global temperature rise in the last decade, we expect that warming will resume in the next few years, consistent with predictions from near-term climate forecasts (Smith et al. 2007; Haines et al. 2009). Improvements in such forecasts will give greater forewarning of future instances of temporary slowing and acceleration of global temperature rise, as predicted to occur in IPCC AR4 projections (Easterling and Wehner 2009).

The synopsis: periods without warming, and even temperature declines, of a decade or less are in fact predicted by our current models. The models then in use allow for a 15-year period of non-warming only about five percent of the time. An absence of observed warming for 15 years or more is what is needed to create a discrepancy between the models and observation.

It's now 2013, and the latest climate figures, from last year, show that we've had stable average temps for the last 16 years. There is lots of yelling back and forth. What appears to be true is that these are the warmest years on record in some time. What is in question is whether the greenhouse gases are causing the warming. Based on accepted ice core data, these warm years are by no means the warmest; rather, they are near but not yet at the top of an observed naturally occurring range over the last 350,000 years. CO2 levels, on the other hand, are currently outside the historical range by a significant amount: about 390 ppm now, whereas historically they varied between 180 and 300 ppm.

If the summary of climate modeling presented by NOAA in 2008 was accurate, we must now conclude that more than 95% of the climate-model runs in use in 2008 did not predict 15-plus years of flatline temperatures in the face of constantly rising greenhouse gases. As each successive year of flat temps in the face of ever-rising CO2 passes, fewer and fewer models can account for the observations.

The charge has often been that the models are not predictive but rather are constantly being revised when they are at variance with observation. It's a question of predictive credibility. The dire consequences the models predict are what give rise to the urgency of acting immediately. In response to the observed variance between rising gas levels and stable temperatures, various natural causes are put forward, and they may be valid. Including them in the models, however, will result in lower predicted temperature rises, and less urgency.

I would feel a whole lot better if what we were hearing was "Yes, over 95% of the predictions from the models in use in 2008 allowed for flat temps at given CO2 levels over a 10-to-15-year period, but they fail to predict stable temperatures beyond that period, which is what we are now seeing." We are hearing quite the opposite.

The models exhibit large variations in the rate of warming from year to year and over a decade, owing to climate variations such as ENSO, the Atlantic Multi-Decadal Oscillation and Pacific Decadal Oscillation. So in that sense, such a period is not unexpected. It is not uncommon in the simulations for these periods to last up to 15 years, but longer periods are unlikely.

Comments like this are being made by other climatology sources. A result that occurs 5% of the time or less over the 10-to-15-year period, and essentially never beyond it, is not what I would call common. If we've revised our models since 2008 and such periods are now common with current models, then say so. Admit that the original models were flawed, and that the dire predictions are therefore not quite so dire, but then say that we have new models indicating that the rate of temperature increase is moderating. Are we seeing true transparency or not?

I'm content to wait another five years to see where we are before reaching any conclusions. Do the stable temps turn into cooling, further compounding the question? Do they continue flat and thus negate the models? Do they start back upward? In the meantime I will continue to ponder the fact that, in the observational results from the ice cores, and in the science as I learned it back in the dark ages, CO2 levels trailed changes in temperature, not the other way around.