IPCC and Model Projections

The Intergovernmental Panel on Climate Change (IPCC) presents projections of climate change based on computer models. The Fifth Assessment Report (AR5) Working Group 1 report, "Climate Change 2013: The Physical Science Basis", was published online here on January 30, 2014. The projections in the report are based on four scenarios, or Representative Concentration Pathways (RCPs), which make different assumptions about CO2 and other greenhouse gas emissions. The names of the scenarios correspond to different target forcings at 2100 (compared to 1750): 2.6, 4.5, 6.0 and 8.5 W/m2. These RCPs replace the emission scenarios used in the Fourth Assessment Report.

RCP2.6 is a strong mitigation scenario. RCP4.5 is a mitigation scenario in which radiative forcing is stabilized before 2100. RCP6.0 is a slower mitigation scenario in which radiative forcing is stabilized after 2100. RCP8.5 is an extreme emissions scenario in which the greenhouse gas emissions rate keeps increasing.

The graph below shows the CO2 concentration in air to the year 2050 for each RCP scenario. The light blue curve shows the historical CO2 concentration.

The CO2 concentrations of RCP2.6, RCP4.5 and RCP6.0 are similar up to 2030. RCP2.6 CO2 stabilizes shortly after 2040. The actual CO2 concentration increased at 0.54%/year from 2005 to 2013. The RCP8.5 CO2 concentrations increase at 1.00%/year by 2050, and at 1.16%/year by 2070, which is more than double the historical growth rate.
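The growth rates above are compound annual rates. A minimal sketch of the calculation, using approximate Mauna Loa annual-mean concentrations for 2005 and 2013 (the specific values are assumptions for illustration, not taken from the report):

```python
# Compound annual growth rate of CO2 concentration between two years.
# The concentrations are approximate Mauna Loa annual means, used here
# only to illustrate the calculation behind the quoted 0.54%/year.
def co2_growth_rate(c_start, c_end, years):
    """Return the compound annual growth rate as a fraction."""
    return (c_end / c_start) ** (1.0 / years) - 1.0

rate = co2_growth_rate(379.8, 396.5, 2013 - 2005)
print(f"CO2 growth 2005-2013: {rate * 100:.2f} %/year")  # ~0.54 %/year
```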

The CH4 (methane) concentrations are shown below.

The actual CH4 concentration increased at 0.2%/year from 2005 to 2010. RCP4.5 and RCP6.0 also show little growth in CH4 concentrations. CH4 concentrations drop significantly in the RCP2.6 strong mitigation scenario, but increase at 1.34%/year by 2050 in the RCP8.5 scenario. RCP8.5 is an extreme and unrealistic scenario, as both CO2 and CH4 increase much faster than the historical rates.

The AR5 report shows that the cooling effects of aerosols are much smaller than previously believed, but there was not time to include these new estimates in the climate models used for the report. A reduction in the aerosol cooling should also reduce the estimate of greenhouse forcing. No climate model was adjusted to match the lack of warming over the last 16 years, commonly known as the "pause" or "hiatus" of global warming. Therefore, the IPCC reduced its short-term warming forecast by 40% relative to the climate model projections. The following graph shows the average RCP4.5 climate model forecast and the low, middle and high range of the IPCC forecast based on "expert judgement". The actual global temperatures as estimated by HadCRUT4 are shown.

The Technical Summary of AR5 gives table TS.1, which shows the climate model forecast temperature changes for each RCP scenario for the 20-year periods 2046-2065 and 2081-2100, relative to the 1986-2005 average. The table below shows the projected temperature increase at the mid-point year of each period, relative to the 1986-2005 average and relative to 2013. The HadCRUT4 temperature in 2013 was 0.19 C higher than the 1986-2005 average.

Global Mean Surface Temperature Change, deg. C

Scenario    relative to 1986-2005    relative to 2013
              2055        2090        2055        2090
RCP2.6        1.0         1.0         0.8         0.8
RCP4.5        1.4         1.8         1.2         1.6
RCP6.0        1.3         2.2         1.1         2.0
RCP8.5        2.0         3.7         1.8         3.5
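The "relative to 2013" values above follow from subtracting the 0.19 C offset between 2013 and the 1986-2005 mean. A quick check of that arithmetic:

```python
# Convert warming relative to the 1986-2005 mean into warming relative
# to 2013 by subtracting the 0.19 C offset quoted in the text.
OFFSET_2013 = 0.19  # HadCRUT4 2013 anomaly vs the 1986-2005 mean

proj_vs_1986_2005 = {  # (2055, 2090) mid-point values from table TS.1
    "RCP2.6": (1.0, 1.0),
    "RCP4.5": (1.4, 1.8),
    "RCP6.0": (1.3, 2.2),
    "RCP8.5": (2.0, 3.7),
}

for scenario, (t2055, t2090) in proj_vs_1986_2005.items():
    print(scenario,
          round(t2055 - OFFSET_2013, 1),
          round(t2090 - OFFSET_2013, 1))
```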

Kevin Trenberth is head of the large US National Center for Atmospheric Research and one of the advisors of the IPCC. Trenberth asserts ". . . there are no (climate) predictions by IPCC at all. And there never have been". Instead, there are only "what if" projections of future climate that correspond to certain emissions scenarios. According to Trenberth, GCMs ". . . do not consider many things like the recovery of the ozone layer, for instance, or observed trends in forcing agents. None of the models used by IPCC is initialised to the observed state and none of the climate states in the models corresponds even remotely to the current observed climate." However, Scott Armstrong and Kesten Green audited the relevant chapter of the IPCC's latest report. They find that "in apparent contradiction to claims by some climate experts that the IPCC provides 'projections' and not 'forecasts', the word 'forecast' and its derivatives occurred 37 times, and 'predict' and its derivatives occur 90 times" in the chapter. Consequently, it is not surprising that the public has the misimpression that the IPCC predicts future climate.

Computer Models Fail

The computer models predict that 20th century temperatures should have increased by 1.6 to 3.74 Celsius, while the actual observed 20th-century temperature increase was about 0.6 Celsius. A model that fails to history-match is useless for predicting the future.

The chart below compares the surface warming projections of the 2007 IPCC report to the actual global temperatures as represented by the HadCRUT3 index. The red, green and blue curves are temperature projections from the A2, A1B and B1 emission scenarios. The orange curve is the temperature projection assuming the CO2 levels stay constant at the year 2000 value. The pink curve is the annual HadCRUT3 actual temperature measurements. The black curve is the Fast Fourier Transform (FFT) best fit to the data.

See here. The quoted error on a single measurement is 0.05 C. The probability that the IPCC projections overstate the warming is greater than 90%.

The IPCC Fourth Assessment Report projected a surface temperature increase from 1990 to 2100 of 1.4 C to 5.8 C, corresponding to 0.13 C/decade to 0.53 C/decade. The IPCC low estimate corresponds to the actual temperature warming rate as measured by satellite data.
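The per-decade rates follow from dividing the total projected 1990-2100 rise by the intervening 11 decades:

```python
# Convert the AR4 1990-2100 warming range into decadal rates.
DECADES = (2100 - 1990) / 10  # 11 decades

for total in (1.4, 5.8):
    print(f"{total} C total -> {total / DECADES:.2f} C/decade")
```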

Dr. John Christy presented to the US Senate on August 1, 2012 the following graph of results from 34 climate models that will be used in the IPCC's Fifth Assessment Report. The thick black line is the multi-model mean hindcast and projection from 1975 to 2020. The graph also shows the surface temperature observations and satellite observations adjusted to surface temperatures.

The graph shows that these new climate model results are wildly different from the observations. The satellite observations show the temperature has increased by 0.2 Celsius from 1980 to 2010, but the climate models' mean increase is 0.6 Celsius. Surface observations show a 0.35 Celsius increase from 1980 to 2010, but as explained elsewhere in this document, the surface temperatures are contaminated by urban development. The model mean temperature increase from 1980 to 2010 is three times higher than the satellite observations, so the forecasts are useless for making policy decisions. Dr. Christy's presentation is here.

The climate model temperature trend near the equator (5S to 5N) is 3.6 times the measured sea surface temperature trend from 1982 as shown below.

The IPCC assumes that the Sun has little effect, even though observational evidence clearly shows the Sun has a significant effect on climate.

The models assume the 20th century temperature rise is caused by CO2 increases, and parameters are set in the models to make the temperature rise in response to the CO2. The direct effect of increasing CO2 concentration on global warming is very small. All the models amplify an initial increase in temperature due to CO2 by employing water vapour and clouds as a large positive feedback. However, there is no evidence that water vapour and clouds provide a large positive feedback. They may provide a negative feedback.
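The amplification argument can be sketched with the standard linear-feedback gain formula, where f is the net feedback fraction. The numeric values below are illustrative assumptions, not figures from the report:

```python
# Linear feedback gain: an initial (no-feedback) warming dT0 becomes
# dT0 / (1 - f), where f is the net feedback fraction (f < 1).
# A positive f amplifies the initial warming; a negative f dampens it.
def amplified_warming(dT0, f):
    return dT0 / (1.0 - f)

dT0 = 1.1  # illustrative no-feedback warming for doubled CO2, deg C
print(amplified_warming(dT0, 0.6))   # strong positive feedback -> ~2.75 C
print(amplified_warming(dT0, -0.5))  # net negative feedback -> ~0.73 C
```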

The amount of solar energy the Earth receives depends on the Earth's albedo, or reflectivity. The greater the albedo, the more sunlight is reflected and the less solar energy is absorbed by the Earth. Project "Earthshine", run at the Big Bear Solar Observatory, measures the Earth's albedo by observing the amount of sunlight reflected by the Earth to the dark side of the Moon and back to Earth. The process is shown below.

The results show that the Earth's albedo gradually fell until 1997, likely causing most of the global warming through 1998. Since 2001 the albedo has increased rapidly, which has stopped the warming and resulted in the current global cooling. The recent dimming of the Earth is likely due to increased low cloud cover. The albedo is shown below.

The blue lines are the observed earthshine data for 1994-1995 and 1999-2003. The black line is the reconstructed albedo from partially overlapping satellite cloud data with respect to the mean of the calibration period 1999 to 2001. The vertical red line shows the cumulative climate forcing of the increase in greenhouse gases over the 20th century of 2.4 W/m2 according to the IPCC. Note that the change of the albedo's climate forcing in W/m2 is much greater than that due to greenhouse gases. Current climate models do not show such large albedo variability. See an article by Anthony Watts here for further information. See the project Earthshine site here.

Climate models utilize grid blocks to simulate climate that are too large to resolve thunderstorms or hurricanes, so they use parameterization to account for these. These parameterizations ignore real-world transfers of energy, moisture and momentum that could significantly alter the results, which severely limits the usefulness of climate model projections. Computer models employ approximations to represent physical processes that cannot be directly computed due to computational limitations. Because many empirical parameters can be selected to force a model to match observations, the ability of a model to match observations cannot be cited as evidence that the model is realistic, and does not imply it is reliable for forecasting climate. See the Fraser Institute's Independent Summary for Policy Makers.

Atmospheric methane concentrations have been declining in recent years. Methane is a significant greenhouse gas. Climate models assume that methane concentrations increase with temperature, and it is not known why the concentration is declining. Aerosols play a key role in climate, with a potential impact of more than three times that of CO2 emissions, but their influence is very poorly understood. Aerosols exert an overall cooling effect on climate, but estimates of the effect vary by a factor of ten. Models used in the IPCC Fourth Assessment Report assume aerosols have a large cooling effect, thereby attributing a large warming effect to CO2.

Only 2 of the 23 models used by the IPCC account for varying Sun intensity, and these models do not assume the Sun affects the cosmic ray flux and cloud formation. Only 2 of the models account for land use changes.

Computer models predict warming at the north and south poles to be symmetrical, but a warming trend is observed at the North Pole and not at the South Pole. They also predict that the polar surface regions will warm more than the surface at the tropics, that winter temperatures will warm more than summer temperatures, and that night-time temperatures will warm more than day-time temperatures. Therefore, according to the CO2 warming theory, winter nights in the Arctic will warm, but there will be little summer daytime warming in the tropics.

A team of four researchers from three American universities led by David Douglass compared the troposphere temperature trends in the tropics predicted by climate models to actual satellite and radiosonde observations. In a paper published in December 2007 by the Royal Meteorological Society, Douglass et al. analysed the simulation results from 22 climate models at the surface and at 12 different altitudes. The simulation results were compared to the temperature trends determined from two analyses of satellite data and four radiosonde datasets for the period January 1979 through December 2004.

Computer Model Temperature Trends versus Observations

The above diagram shows the comparison of temperature trends from 1979 through 2004 of climate models and actual satellite and radiosonde observations, expressed as degrees Celsius per decade versus altitude and atmospheric pressure. The left panel shows four radiosonde results as IGRA, RATPAC, HadAT2 and RAOBCORE. The thick red line shows the mean of the 22 computer model results, and the models' 2 times standard error of the mean are shown as the two thin red lines. Temperature trends from three surface measurement datasets are identified in the legend by Sfc and are plotted on the left axis. The RSS and UAH analysis of satellite data are plotted on the right panel at two effective layers: T2lt represents the lower troposphere with a weighted mean at 2.5 km, T2 represents the mid troposphere with a weighted mean at 6.1 km altitude. A trend is the slope of the line that has been least-squared fit to the data. Synthetic model values corresponding to the effective layers of the satellite data are shown in the right panel as open red circles.

An essential place to compare observations with greenhouse computer models is the layer between 450 hPa and 750 hPa atmospheric pressure, where the presence of water vapour is most important; it is called the "characteristic emission layer". In this layer, the observations all fail the two-times-standard-error test. The radiosonde and satellite trends are inconsistent with the model trends at all altitudes above the surface. Douglass et al. conclude that "Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modelled trend is 100 to 300% higher than observed, and, above 8 km, modelled and observed trends have opposite signs." Therefore any projections of future climate from the models are very likely too high, and these projections should not be used to form public policy. See the paper "A comparison of tropical temperature trends with model predictions" here.
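The consistency test used here can be sketched as follows: compute the ensemble-mean trend, its standard error across the models, and check whether the observed trend falls within mean ± 2 SE. The trend values below are invented for illustration (the actual study used 22 models):

```python
import math

# Two-standard-error consistency test between an ensemble of model
# trends and an observed trend (all values illustrative, C/decade).
def consistent(model_trends, observed):
    n = len(model_trends)
    mean = sum(model_trends) / n
    var = sum((t - mean) ** 2 for t in model_trends) / (n - 1)
    se = math.sqrt(var / n)  # standard error of the ensemble mean
    return abs(observed - mean) <= 2.0 * se

model_trends = [0.20, 0.25, 0.30, 0.22, 0.28, 0.35, 0.18, 0.26]
print(consistent(model_trends, observed=0.24))  # inside the 2 SE band
print(consistent(model_trends, observed=0.05))  # far outside the band
```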

A technical paper published by R. McKitrick, S. McIntyre and C. Herman in Atmospheric Science Letters, August 2010 shows that the climate model temperature trends of the mid-troposphere, using 57 runs from 23 climate models, are four times larger than observations from satellites and weather balloons.

See here for a discussion by D. Stockwell and see the technical paper here.

Dr. Roy Spencer writes "Now, in what universe do the above results not represent an epic failure for the models?" here. "I frankly don’t see how the IPCC can keep claiming that the models are “not inconsistent with” the observations. Any sane person can see otherwise." See here. John Christy writes "All pressure levels are used in the radiosondes and models to generate the simulated satellite profile. All levels are used according to their proportional weighting of the [satellite] microwave emission function."

While air temperature may fluctuate from year to year as heat is transferred between the air and oceans, if CO2 is causing global warming as the IPCC hypothesizes, the ocean heat content must increase monotonically, provided there are no major volcanic eruptions. Ocean heat content is a much more robust metric than surface air temperature for assessing global climate change because the ocean's heat capacity is greater than that of the atmosphere by many orders of magnitude. For any given area on the ocean's surface, the upper 2.6 m of water has the same heat capacity as the entire atmosphere above it! According to the IPCC models, all major feedbacks are positive, so there is no mechanism that would allow the heat content of the Earth to decline.
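The 2.6 m figure can be checked with textbook constants. The values below are standard approximations assumed for this sketch, not numbers from the source:

```python
# Depth of ocean water whose heat capacity per unit area matches that
# of the full atmospheric column above it. All constants are rounded
# textbook approximations, used only to check the ~2.6 m figure.
G = 9.81             # m/s^2, gravitational acceleration
P_SURFACE = 101325   # Pa, mean sea-level pressure
CP_AIR = 1004        # J/(kg K), specific heat of air (constant pressure)
RHO_SEAWATER = 1025  # kg/m^3
CP_SEAWATER = 3990   # J/(kg K)

air_mass_per_m2 = P_SURFACE / G               # ~1.03e4 kg per m^2
air_heat_capacity = air_mass_per_m2 * CP_AIR  # J/K per m^2 of surface

depth = air_heat_capacity / (RHO_SEAWATER * CP_SEAWATER)
print(f"Equivalent water depth: {depth:.1f} m")  # ~2.5 m, near the quoted 2.6 m
```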

Heat accumulating in the climate system can be measured on a global scale from 2003 by the ARGO array of 3341 free-drifting floats that measure temperature and salinity in the upper 2000 m of ocean. The robotic floats rise to the surface every 10 days and transmit data to a satellite which also determines their location as shown below.

Dr. Craig Loehle has analyzed the ocean heat content for a linear trend over 4.5 years of data from mid-2003 to the end of 2007. The data shows an annual variation because most of the oceans are in the southern hemisphere. To eliminate the annual cycle, a model was fit with slope, intercept, and sinusoidal (1-year fixed period) terms using nonlinear least-squares estimation. The linear component of the model shows a decline of 0.35 x 10^22 Joules/year. (The graph shows the recalibrated data, after the data from certain instruments with a cool bias were removed. Initial Argo results showed strong cooling.) The Argo heat content is shown below. See his paper here.
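Because the annual period is fixed, such a fit can be expressed with sine and cosine columns that make the model linear in its parameters, so ordinary least squares suffices. A sketch on synthetic monthly data (the series and parameter values are invented, not Loehle's data):

```python
import numpy as np

# Fit intercept + slope + a fixed 1-year sinusoid by linear least
# squares. With the period fixed, sin and cos columns make the model
# linear in its parameters. Synthetic data for illustration only.
t = np.arange(0, 4.5, 1.0 / 12.0)  # ~4.5 years of monthly samples
true_slope = -0.35                 # units of 1e22 J/year, illustrative
y = 1.0 + true_slope * t + 0.8 * np.sin(2 * np.pi * t + 0.3)

# Design matrix: intercept, linear trend, annual sine and cosine.
A = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"Fitted linear trend: {coeffs[1]:.3f} per year")
```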

William DiPuccio compared the projected ocean heat content of the GISS climate model to two analyses of the ARGO heat content data. The projected heat content of the GISS model was adjusted to include only the upper oceans for comparison to the ARGO actual data. He also calculated a lower limit by scaling the net global anthropogenic radiative flux to ocean surface area. The observed ocean heat content trends were calculated by Josh K. Willis of NASA's Jet Propulsion Laboratory and Craig Loehle of the National Council for Air and Stream Improvement, Inc. Loehle's calculations have a smaller margin for error than Willis's, because Willis only uses annual average data.

The heat deficit shows that from 2003-2008 there was no positive radiative imbalance caused by anthropogenic forcing, despite increasing levels of CO2. Indeed, the radiative imbalance was negative, meaning the earth was losing slightly more energy than it absorbed. The figure reveals a robust failure on the part of the GISS climate model.

William DiPuccio says, "Since the oceans are the primary reservoir of atmospheric heat, there is no need to account for lag time involved with heat transfer. By using ocean heat as a metric, we can quantify nearly all of the energy that drives the climate system at any given moment. So, if there is still heat in the pipeline, where is it? The deficit of heat after nearly 6 years of cooling is now enormous. Heat can be transferred, but it cannot hide." See his paper here.

Below is a graph which compares ARGO era (2003 to Q1 of 2011) the ocean heat content of the top 700 m from the National Oceanographic Data Center to the projections of the GISS climate model. The NODC OHC dataset is based on the Levitus et al (2009) paper which describes various adjustments and corrections to the data. The NODC data includes the ARGO data as described above and data from expendable bathythermographs. The GISS model projection is discussed here. The NODC data is here, and the graph is from here.

Note the enormous discrepancy between the measurements and the climate model projections.

The lack of global warming during the last 16 years has led to claims that the predicted warming has gone into the deep ocean below 700 m. The graph below shows the ocean heat content from NOAA by ocean depth layer; 0 to 700 m depth and 700 m to 2000 m depth.

The more heat that is transferred into the deep ocean, the less heat is left to warm the atmosphere. The graph above shows that both the 0 to 700 m layer and the 700 m to 2000 m layer have been gaining heat. The NOAA data for 0 to 2000 m from here starts in Q1 of 2005. While the lower layer has gained more heat than the upper layer, the rate of temperature rise in the two layers is similar because the lower layer contains a larger volume of water. The unit of heat content, Joules, is not very meaningful to most people, so the graph below presents the same information as the average temperature change of each layer. The graph shows the 0 to 700 m temperature data from Q1 of 2003, which is usually considered the start of reliable ARGO data. The green line shows the GISS climate model projection (based on 0.7 x 10^22 Joules/year). The climate model warming trend of the 0 to 700 m layer is 3.7 times the trend of the measurements.
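The conversion from heat content in Joules to a layer-average temperature change divides by the layer's mass times its specific heat. A sketch with rounded constants (the ocean area, heat-content figure, and material constants are approximations assumed here):

```python
# Convert an ocean heat content change (Joules) into the average
# temperature change of a depth layer. Ocean area and material
# constants are rounded textbook values, assumed for illustration.
OCEAN_AREA = 3.6e14  # m^2, approximate global ocean surface area
RHO_SEAWATER = 1025  # kg/m^3
CP_SEAWATER = 3990   # J/(kg K)

def layer_delta_t(delta_q_joules, depth_m):
    """Average warming of a uniform layer for a given heat gain."""
    mass = OCEAN_AREA * depth_m * RHO_SEAWATER
    return delta_q_joules / (mass * CP_SEAWATER)

# e.g. a hypothetical 5e22 J gain spread over the 0-700 m layer:
print(f"{layer_delta_t(5e22, 700):.3f} C")
```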

The temperature of the ocean layer 700 to 2000 m has increased by about 0.02 °C from Q1 2005 to Q3 2013.

One of the most important parameters in determining climate sensitivity in climate models is the amount of heat they transfer to the oceans. The following graph by Dr. Spencer compares the Levitus observations of ocean warming trends during 1955-1999 to 15 IPCC AR4 climate model runs.

Note that the climate models exhibit wildly different trends, with the deep ocean cooling just as often as warming. The green curve is the actual Levitus observation to a depth of 700 m. Most of the models produce too much warming in the 0 to 700 m layer. Many models produce unexpected ocean cooling below 100 m while the surface warms. None of the models even remotely matches the observations. The weak ocean warming in the 0 to 700 m layer suggests low climate sensitivity, even if all the warming were due to CO2 emissions. See here.

The graphs below in this section, prepared by Bob Tisdale, compare temperature series to hindcasts of computer models used by the IPCC. Computer model hindcasts should be compared to the actual historical observations to determine how well the models match the historical record. A model that fails to history-match will not produce realistic projections.

The animation below compares observed North Atlantic temperature anomalies to the modeled surface air temperatures for the 6 individual ensemble members and the ensemble mean of the National Center for Atmospheric Research (NCAR) coupled climate model CCSM4. All data have been smoothed with a 121-month filter.

Bob Tisdale writes "The NCAR CCSM4 coupled climate model appears to do a poor job of hindcasting the multi-decadal variability of North Atlantic temperature anomalies." See here.

The animation below compares the sea surface temperature (SST) in the NINO 3 region to the climate model hindcasts. NINO 3 is a region in the eastern tropical Pacific where El Nino events occur. It shows how poorly the models hindcast the frequency, magnitude, and trend of ENSO events. The model ensemble mean trend is 14 times greater than the trend of the observations.

Bob Tisdale writes "the frequency and magnitude of El Nino and La Nina events of the individual ensemble members do not come close to matching those observed in the instrument temperature record. Should they? Yes. During a given time period, it is the frequency and magnitude of ENSO events that determines how often and how much heat is released by the tropical Pacific into the atmosphere ..."

The graph below compares the linear trends of the observations and the model mean of the IPCC AR5 hindcasts/projections of SST for the period January 1982 to December 2014 in 5-degree-latitude bands. The models predicted much greater warming trends in the tropics than what was observed. The actual warming in the northern regions is greater than modeled. Warming was predicted in the southern region, but the SST trend was actually negative in much of the region. This shows that the models do an extremely poor job of simulating how tropical heat is transported to the polar regions. See here.

The sea surface temperatures from -50 to -80 degrees latitude (south) and from 50 to 80 degrees latitude (north) are shown below. The IPCC claims that CO2 is the main driver of climate change, but the best-fit linear temperature trends declined at 0.4 C/century in the southern region and increased at 2.0 C/century in the northern region, despite the fact that the CO2 concentrations in the two regions are the same.

The graph below compares the Eastern Pacific SST to models by latitude. This includes the important El Nino region, so a good history match here is critical. The Eastern Pacific tropical SST has declined at the equator at 0.14 C/decade, but the models show a strong warming of 0.19 C/decade. See here.

The graph below compares SST observations to climate model outputs for the period of 1910 to August 2011. The SST is from the HADISST dataset and the model hindcast is the IPCC model mean published in 2007. The models do not match the temperature variability in the period 1910 to 1975. They are made to match the warming trend from 1975 to 2002 by assuming most of the warming is due to CO2 and using high sensitivity to greenhouse gases. The projections diverge from observations after 2002 despite the continued increase in CO2 emissions. See here.

The graph below shows the northern hemisphere sea surface temperature measurements and the climate model hindcasts for the period 1910 to 1944. The actual temperature rise was 4.5 times greater than the modeled trend. The models cannot replicate the measurements because they do not include natural causes of climate change. The graph is from here.

The global surface temperature trend from HadCRUT for the early 20th century warming period, 1917 to 1944, at 0.174 C/decade, is similar to that of the late warming period, 1976 to 2005, at 0.195 C/decade, as shown below. But the net anthropogenic forcing in the climate models during the late warming period is 3.8 times higher than the forcing during the early warming period. The 3.8-fold increase in forcing had almost no effect on the temperature trends of the two warming periods, indicating that the theory of anthropogenic global warming is seriously flawed. The graph is from here.

The graph below compares the 17-year (204-month) trends of the global SST to the IPCC model mean. Each point on the curves represents the 17-year straight-line best-fit trend ending at that point in time. The IPCC models projected the global 17-year SST trend ending August 2011 at 0.15 C/decade, but the observed rise was only 0.02 C/decade. See here.
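Each point on such a curve is the least-squares slope of the preceding 204 monthly anomalies. A sketch of the rolling-trend computation on synthetic data (the series is invented; a real comparison would use the SST record):

```python
import numpy as np

# Rolling 17-year (204-month) linear trend: at each month, fit a
# straight line to the previous 204 monthly anomalies and keep the
# slope, expressed in C/decade. Synthetic data for illustration.
WINDOW = 17 * 12  # 204 months

def rolling_trend(anomalies):
    months = np.arange(WINDOW)
    slopes = []
    for end in range(WINDOW, len(anomalies) + 1):
        window = anomalies[end - WINDOW:end]
        slope_per_month = np.polyfit(months, window, 1)[0]
        slopes.append(slope_per_month * 120)  # per-month -> per-decade
    return np.array(slopes)

# Synthetic series warming at exactly 0.15 C/decade:
series = 0.15 / 120 * np.arange(WINDOW + 24)
print(rolling_trend(series)[:3])  # each window recovers ~0.15
```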

Bob Tisdale writes, "The coupled climate models used to hindcast past and project future climate in the IPCC 2007 report AR4 were not initialized so that they could reproduce the multi-decadal variations that exist in the global temperature record. This has been known for years." and "The climate models used by the IPCC appear to be missing a number of components that produce the natural multi-decadal signal that exists in the instrument-based Sea Surface Temperature record."

The daily temperature range over land has been decreasing because the daily minimum temperatures (Tmin) have increased more than the daily maximum temperatures (Tmax) over the 20th century. The NOAA Global Historical Climatology Network database shows that 2/3 of the warming is due to the increase in the minimum temperatures. The trend of the difference between the maximum and minimum daily temperatures is called the Diurnal Temperature Range, and it is a very important climate parameter. A paper by McNider et al (2012) shows that 6 climate models with published minimum and maximum temperatures replicate only 20% of the measured diurnal temperature trend, as shown in the figure below. This is a five-fold climate model error.

Climate models are tuned to match only the 1970 to 2000 temperature rise of the average of the minimum and maximum temperatures (Tmean). If models are replicating Tmean but are not capturing the trend in Tmin, then this must mean that the model Tmax is warming faster than the actual Tmax. A computer analysis of the near surface boundary layer shows that an increase in greenhouse gases causes increased mixing of the boundary layer which brings warm nighttime air aloft down to the surface. Only 20% of the warming was due to longwave energy in the model simulation and 80% was due to increased turbulence. A layer only 20 to 50 m in thickness is warmed by this turbulence. The Tmax measured during the daytime represents a boundary layer 1 to 2 km deep. The climate models assume the Tmean represents an air thickness of 1 to 2 km, but it is actually only 20 to 50 m thick. The modeled Tmax is warming much faster and represents a much greater air thickness than the actual Tmax. See here.
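The inference that the model Tmax must be warming too fast follows from Tmean = (Tmin + Tmax)/2: if a model matches the Tmean trend but captures only 20% of the diurnal-range trend, its Tmax trend exceeds the observed one. A sketch with invented trend numbers (none of these values come from the paper):

```python
# If a model matches the observed Tmean trend but captures only 20%
# of the observed diurnal-range (Tmax - Tmin) trend, its Tmax trend
# must exceed the observed one. All trend values are invented.
obs_tmin, obs_tmax = 0.20, 0.10       # C/decade
obs_tmean = (obs_tmin + obs_tmax) / 2  # 0.15, matched by the model
obs_dtr = obs_tmax - obs_tmin          # -0.10

model_dtr = 0.2 * obs_dtr              # models capture ~20% of DTR trend
model_tmax = obs_tmean + model_dtr / 2  # since Tmax = Tmean + DTR/2
print(model_tmax, obs_tmax)            # model Tmax warms faster than observed
```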

Most of the warming in climate models is due to increasing water vapour as temperatures rise. The climate models greatly overestimate the Tmax trend, which represents the deep atmosphere, so they greatly overestimate the increase in water vapour in the lower atmosphere as well.

About 50% of human-caused CO2 emissions are absorbed by natural sinks and 50% remains in the atmosphere. The fraction of emissions that are sequestered in sinks is called the sink efficiency. The graph below shows that as more CO2 is produced, the fraction of emissions that is sequestered in sinks has increased at 1.1%/decade.
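Sink efficiency for a given year can be estimated from emissions and the observed atmospheric CO2 increase, using the standard conversion of roughly 2.13 GtC per ppm of atmospheric CO2. The sample emission and concentration numbers below are illustrative assumptions:

```python
# Fraction of annual CO2 emissions taken up by natural sinks:
# 1 - (atmospheric increase in GtC) / (emissions in GtC).
GTC_PER_PPM = 2.13  # approximate GtC per ppm for the whole atmosphere

def sink_efficiency(emissions_gtc, co2_rise_ppm):
    airborne_gtc = co2_rise_ppm * GTC_PER_PPM
    return 1.0 - airborne_gtc / emissions_gtc

# Illustrative recent-era values: ~10 GtC emitted, ~2.1 ppm rise.
print(f"{sink_efficiency(10.0, 2.1):.2f}")  # roughly half taken up by sinks
```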

Most of the models forecast that the sink efficiency will decline, so that the CO2 concentration in the atmosphere will rise by an additional 50 to 100 ppm by 2100 compared to a constant sink efficiency. But the actual sink efficiency change is in the opposite direction of the climate models, so the CO2 content will likely rise slower than the climate model predictions. A paper that discusses the climate model sink efficiency forecasts is here. The annual CO2 concentration data from Mauna Loa is here. The annual CO2 emissions data is here.

Many important inputs to climate models are very uncertain and are not supported by real-world observational evidence, so it is foolish to rely on model projections to make expensive policy decisions.