Why models run hot: results from an irreducibly simple climate model

An irreducibly simple climate-sensitivity model is designed to empower even non-specialists to research the question of how much global warming we may cause. In 1990, the First Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) expressed “substantial confidence” that near-term global warming would occur twice as fast as was subsequently observed. Given rising CO2 concentration, few models predicted the observed absence of warming since 2001. Between the pre-final and published drafts of the Fifth Assessment Report, the IPCC cut its near-term warming projection substantially, substituting “expert assessment” for the models’ near-term predictions; yet its long-range projections remain unaltered. The model indicates that the IPCC’s reduction of the feedback sum from 1.9 to 1.5 W m−2 K−1 mandates a reduction from 3.2 to 2.2 K in its central climate-sensitivity estimate; that, since feedbacks are likely to be net-negative, a better estimate is 1.0 K; that there is no unrealized global warming in the pipeline; and that global warming this century will be well below current projections. Resolving the discrepancies between the methodologies adopted by the IPCC in its Fourth and Fifth Assessment Reports that are highlighted in the present paper is vital. Once those discrepancies are taken into account, the impact of anthropogenic global warming over the next century, and even as far as equilibrium many millennia hence, may be no more than one-third to one-half of the IPCC’s current projections.

A major peer-reviewed climate physics paper in the first issue (January 2015: vol. 60 no. 1) of the prestigious Science Bulletin (formerly Chinese Science Bulletin), the journal of the Chinese Academy of Sciences, exposes elementary but serious errors in the general-circulation models relied on by the UN’s climate panel, the IPCC. Those errors, the authors argue, were the chief reason for concern about Man’s effect on climate; without them, they conclude, there is no climate crisis.

The IPCC has long predicted that doubling the CO2 concentration in the air might eventually warm the Earth by as much as 3.3 °C. However, the new, simple model presented in the Science Bulletin predicts no more than 1 °C of warming – and possibly much less. The model, developed over eight years, is so easy to use that a high-school math teacher or an undergraduate can get credible results in minutes by running it on a pocket scientific calculator.
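The arithmetic behind these figures can be checked directly. Below is a minimal Python sketch of the standard zero-dimensional feedback relation on which the paper's simple model is built; the Planck parameter value (≈ 0.3125 K W⁻¹ m²) and the CO2 forcing coefficient (5.35) are assumptions taken from commonly cited literature values, not output from the paper's own model. With the IPCC's revised feedback sum of 1.5 W m⁻² K⁻¹ it reproduces the 2.2 K central estimate quoted in the abstract.

```python
import math

# Planck (no-feedback) climate-sensitivity parameter in K per W m^-2.
# The value ~0.3125 is an assumption taken from the literature.
LAMBDA_0 = 0.3125

def forcing_co2_doubling():
    """Radiative forcing from doubled CO2, using the standard
    simplified expression dF = 5.35 * ln(C/C0) in W m^-2."""
    return 5.35 * math.log(2.0)

def equilibrium_sensitivity(feedback_sum):
    """Equilibrium warming (K) for doubled CO2, given a feedback sum f
    in W m^-2 K^-1, via the closed-loop gain 1 / (1 - LAMBDA_0 * f)."""
    return LAMBDA_0 * forcing_co2_doubling() / (1.0 - LAMBDA_0 * feedback_sum)

# IPCC AR5 feedback sum of 1.5 W m^-2 K^-1 -> about 2.2 K,
# the revised central estimate the paper derives.
print(round(equilibrium_sensitivity(1.5), 1))  # 2.2
# Zero net feedback -> the bare Planck response, about 1.2 K.
print(round(equilibrium_sensitivity(0.0), 1))  # 1.2
```

Because the feedback sum enters through a divisor, small reductions in it shrink the sensitivity estimate disproportionately, which is why trimming f from 1.9 to 1.5 W m⁻² K⁻¹ moves the central estimate by a full degree.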

The paper, Why models run hot: results from an irreducibly simple climate model, by Christopher Monckton of Brenchley, Willie Soon, David Legates and Matt Briggs, survived three rounds of tough peer review in which two of the reviewers had at first opposed the paper on the ground that it questioned the IPCC’s predictions.

A new Duke University-led study finds that most climate models likely underestimate the degree of decade-to-decade variability occurring in mean surface temperatures as Earth’s atmosphere warms. The models also provide inconsistent explanations of why this variability occurs in the first place.

These discrepancies may undermine the models’ reliability for projecting the short-term pace as well as the extent of future warming, the study’s authors warn. As such, we shouldn’t over-interpret recent temperature trends.

“The inconsistencies we found among the models are a reality check showing we may not know as much as we thought we did,” said lead author Patrick T. Brown, a Ph.D. student in climatology at Duke’s Nicholas School of the Environment. “This doesn’t mean greenhouse gases aren’t causing Earth’s atmosphere to warm up in the long run,” Brown emphasized. “It just means the road to a warmer world may be bumpier and less predictable, with more decade-to-decade temperature wiggles than expected. If you’re worried about climate change in 2100, don’t over-interpret short-term trends. Don’t assume that the reduced rate of global warming over the last 10 years foreshadows what the climate will be like in 50 or 100 years.” Brown and his colleagues published their findings this month in the peer-reviewed Journal of Geophysical Research.

To conduct their study, they analyzed 34 climate models used by the Intergovernmental Panel on Climate Change (IPCC) in its fifth and most recent assessment report, finalized last November. The analysis found good consistency among the 34 models in explaining the causes of year-to-year temperature wiggles, Brown noted. The inconsistencies existed only in the models’ ability to explain decade-to-decade variability, such as why global mean surface temperatures warmed quickly during the 1980s and 1990s but have remained relatively stable since then.
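To see what separating year-to-year from decade-to-decade variability involves, the toy Python sketch below (an illustration only, not the study's actual method) builds a synthetic annual temperature series from a warming trend, interannual noise, and a slow multidecadal oscillation, then averages it into decadal means. The amplitudes and the 60-year period are arbitrary assumptions chosen to make the effect visible.

```python
import math
import random
import statistics

random.seed(0)

# Synthetic annual global-mean temperature anomalies, 1900-2019.
# All three components are illustrative assumptions, not data.
years = list(range(1900, 2020))
series = [
    0.008 * (y - 1900)                               # steady warming trend
    + 0.3 * math.sin(2 * math.pi * (y - 1900) / 60)  # slow 60-year oscillation
    + random.gauss(0.0, 0.1)                         # year-to-year noise
    for y in years
]

# Averaging over decades removes most of the year-to-year wiggle but
# keeps the decade-to-decade variability the models disagree about.
decadal_means = [statistics.mean(series[i:i + 10])
                 for i in range(0, len(series), 10)]
decadal_steps = [b - a for a, b in zip(decadal_means, decadal_means[1:])]

print(len(decadal_means))                    # 12 decades
print(decadal_means[-1] > decadal_means[0])  # True: long-run warming survives
print(any(s < 0 for s in decadal_steps))     # True: yet some decades still cool
```

The point of the sketch is the last line: even with an unbroken underlying warming trend, a multidecadal oscillation can produce decades of flat or falling means, which is exactly the kind of wiggle whose cause the 34 models explain inconsistently.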

“When you look at the 34 models used in the IPCC report, many give different answers about what is causing this decade-to-decade variability,” he said. “Some models point to the Pacific Decadal Oscillation as the cause. Other models point to other causes. It’s hard to know which is right and which is wrong.” Hopefully, as the models become more sophisticated, they will coalesce around one answer, Brown said. Co-authors on the new study were Wenhong Li of Duke’s Nicholas School, and Shang-Ping Xie of the Scripps Institution of Oceanography at the University of California San Diego.