I believe this is biology's century. I've covered science and medicine for Forbes from the Human Genome Project through Vioxx to the blossoming DNA technology changing the world today.

Why Skepticism About Computer Models Is Not A Good Reason For Skepticism About Climate Change

A week ago I made the case that human-caused climate change should stop being a political football and that both parties should simply accept it as a reality. One pretty pervasive argument against global warming that I didn’t address in that article was this one: that the belief in global warming is based on computer models that are inherently unreliable. I sent two of the most trenchant comments from that piece, from reader gwkimball and my Forbes colleague Daniel Fisher, to Donald Wuebbles, the Harry E. Preble Professor of Atmospheric Science at the University of Illinois, whom I interviewed for the article. Below is an edited version of the question and his response.

Update: The short version of this: all sciences use models; you don’t need the models to prove that climate change is true.

All human-caused warming claims – ALL – depend on the computer climate models to back out the ‘human’ component originating in carbon dioxide. As we just learned in the financial crisis, relying too much on mathematical models to predict the behavior of complex systems can be dangerous. Everybody knows it has been getting warmer since the ice age. How can the scientists convince skeptics that a) there has been a non-random acceleration in the past 150 years and b) it is due to burning fossil fuels?

Wuebbles responds: “The basis for human activities being the primary driver of the recent changes in climate is observation and the physical understanding of the processes that drive our climate system. Models of these physical processes further add to this understanding, especially for the future projections of further changes, but the basic science is based on observation.

“It should be noted that people do depend on models throughout their lives. Airplanes are designed using models, flown using models, and the weather analyses people use are based on models very much akin to our climate models. The models we use in the study of the Earth’s climate system are basically the integrators of our knowledge of how the atmosphere and Earth’s climate work, and they do quite a good job of explaining it.

“We depend heavily on observations in our analyses. The well-measured CO2 increase exactly fits with our understanding of the carbon cycle as being associated with the burning of fossil fuels and land use change. There is NO other explanation for the increase in CO2. You can’t just say it is natural cycles when the ice core observations show that CO2 levels haven’t been as high as they are now (or even close to as high as they are now) for over 800,000 years.

“Going further, we have observation-based analyses of the past 2000 years that tell us the changes in climate we have seen in the last 50 years are way outside the norm. Plus they fit with our basic understanding of the greenhouse effect, which goes back nearly two centuries. There are no natural cycles that can explain all of the observed changes we are seeing on Earth — the atmosphere is warming, the oceans are warming, the land is warming, and the ice is melting. Climate does not change randomly; it changes beyond the normal natural variability due to external forcings. In the past, nature played the major role in those through changes in the solar flux or through large volcanic eruptions. But nature cannot explain what we have seen in the last 50 years (the Sun has actually decreased in flux slightly over that time), while the greenhouse effect and the increasing greenhouse gases do fit.

“What we are experiencing is outside of anything humans have seen on our planet and the only explanation that makes any real sense is that it is due to human actions.”


This is a logarithmic scale on the left, and each numbered division increases by a power of 10. The fastest known rate of CO2 increase in history was the PETM (Paleocene–Eocene Thermal Maximum). We are exceeding the PETM's rate by at least a power of 10, if not more. The PETM produced a mass extinction based on a much slower CO2 increase than we are having now. If you look at the bottom of the scale you can see temperature rise per year. We are exceeding the PETM there also.

If history repeats itself, massive changes in atmospheric CO2 are hard on life on Earth.

As usual, writers who believe the CAGW theory seek their information from supposed experts who are already on record as also believing in CAGW.

If you are going to talk modeling, especially as it relates to forecasting, why wasn’t J. Scott Armstrong consulted? In 2001, he, along with a team of 39 international scientists from various fields, published a book called “Principles of Forecasting”. In this book, they outlined 139 scientific principles that should be followed to create a model.

In 2007, Armstrong (Univ. of Penn.) and Green (Univ. of S. Australia) audited the IPCC’s procedures for the models used to project future warming. The audit showed that the models violated 72 out of 89 relevant scientific forecasting principles. Would you fly in an airplane if you knew the models contained this level of divergence from principles?

Scientists from both NCAR and NASA GISS are on record stating that their models lack veracity, and they no longer call them predictions or even projections. They are referred to as scenarios, because these scientists understand that this is what MIGHT happen if they have correctly captured and correctly weighted the effect of each variable that affects climate. To date, models still parameterize some of the variables because scientists do not understand how a variable like clouds operates.

Climate models based their validity of CO2 being the MAJOR forcing source on the warming of roughly 1970 – 2000. Had that assumption been true, then the global average temperature would have continued increasing along the same slope as that of 1970 – 2000. But it hasn’t. Both James Hansen (NASA GISS) and Phil Jones (UEA) are on record stating that the warming has stalled. And neither has offered an explanation as to why.

Skeptical scientists have stated for many years that models overstate the sensitivity of the biosphere to CO2. We now have definitive empirical data that this is true. Yet CAGW proponents refuse to accept what the scientific method tells them is true.

If Mr. Armstrong is correct, he could make a case by writing his own paper to overturn the science. I’m under the impression he did not do this. It appears he wasn’t willing to back up his words with actions.

http://en.wikipedia.org/wiki/J._Scott_Armstrong

Armstrong examined the methods used by the IPCC to make projections. In an article published in Energy & Environment, he claimed that the IPCC and climate scientists have ignored the scientific literature on forecasting principles.[3][11][12] Armstrong wrote:

When we inspected the 17 [forecasting] articles, we found that none of them referred to the scientific literature on forecasting methods. It is difficult to understand how scientific forecasting could be conducted without reference to the research literature on how to make forecasts. One would expect to see empirical justification for the forecasting methods that were used. We concluded that climate forecasts are informed by the modelers’ experience and by their models—but that they are unaided by the application of forecasting principles. (page 1015) [1]

However, according to Amstrup and others’ published rebuttal in the journal Interfaces:

Green and Armstrong (2007, p.997) also concluded that the thousands of refereed scientific publications that comprise the basis of the IPCC reports and represent the state of scientific knowledge on past, present and future climates “were not the outcome of scientific procedures.” Such cavalier statements appear to reflect an overt attempt by the authors of those reports to cast doubt about the reality of human-caused global warming … [13]

Climate sensitivity is calculated by computer models and from empirical evidence. Both come within a reasonable spread of each other. It appears to be around 3°C or slightly less for a doubling of CO2.
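For readers who want to see where a number like that comes from, here is a minimal sketch using the widely cited simplified expression for CO2 forcing, ΔF = 5.35 ln(C/C0) W/m², together with an assumed (illustrative) climate sensitivity parameter of about 0.8 K per W/m². Both constants are approximations from the literature, not outputs of any specific model discussed here:

```python
import math

def co2_forcing(c, c0=280.0):
    """Simplified logarithmic CO2 radiative forcing in W/m^2,
    using the commonly cited 5.35 * ln(C/C0) approximation."""
    return 5.35 * math.log(c / c0)

# Forcing from a doubling of CO2 (280 -> 560 ppm):
delta_f = co2_forcing(560.0)   # roughly 3.7 W/m^2

# Equilibrium warming for an assumed sensitivity parameter
# of ~0.8 K per W/m^2 (illustrative value only):
delta_t = 0.8 * delta_f        # roughly 3 K per doubling

print(round(delta_f, 2), round(delta_t, 1))
```

The logarithmic form is why each doubling of CO2, not each fixed increment, contributes about the same forcing.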

There have been relatively few Grand Solar Maxima in history. The one that spanned the entire 20th century was one of them. The nature of solar heating in the 20th century has been glossed over by most climate scientists, and solar experts as well, who believe it is a “solar flux” that we feel like weather and that has no important cumulative effect whatsoever. Their ignorance has been reflected in the creation of inaccurate computer models, which are only capable of fulfilling the prediction of a warming world with the meager inputs they are willing to accept.

The current small solar maximum is a good test of this long running scenario. If the earth doesn’t seriously cool by 2015, the situation for climate scientists will be quite serendipitous. All indicators on this planet are currently moving in the other direction, however.

I’d like to see your source on this. The 5th IPCC report, which has not been published yet, does not have this hidden information that you know of.

http://stopgreensuicide.com/

Chapter 8: Anthropogenic and Natural Radiative Forcing

A declining trend since 1986 in PMOD measurements is evidenced in the lower TSI seen during the SC 23 (1996–2008) minimum compared to the previous two minima (see also Figure 8.12): the mean TSI for September 2008 was 1365.26 ± 0.16 W m–2, for 1996 was 1365.45 ± 0.10 W m–2 and for 1986 was 1365.57 ± 0.01 W m–2 (Frohlich, 2009).

That’s the sort of thing I’m talking about. Notice that the decline in TSI doesn’t actually occur until 2008, yet a calculation is done backdating the effect of the decline to the inception of SC23 (1996), without any evidence of why that should be done, when SC23 is only smaller by the size of the error (0.10 W m–2) and noticeably more attenuated than SC22.

“Thus, considering the last three solar minima PMOD values, between 1986 and 2008 there is a negative RF of –0.04 ± 0.02 W m–2.”
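For context, a top-of-atmosphere TSI change is conventionally converted into a global-mean radiative forcing by dividing by 4 (a sphere intercepts a quarter of the irradiance a flat disc would) and multiplying by one minus the planetary albedo (~0.3). A quick sketch using the PMOD minima values quoted above, under those standard assumptions:

```python
def tsi_to_rf(delta_tsi, albedo=0.30):
    """Convert a change in total solar irradiance (W/m^2) into a
    global-mean radiative forcing: divide by 4 for spherical
    geometry, multiply by (1 - albedo) for reflected sunlight."""
    return delta_tsi / 4.0 * (1.0 - albedo)

# 2008 solar minimum minus 1986 solar minimum, from the PMOD
# values quoted in the excerpt above:
delta_tsi = 1365.26 - 1365.57
rf = tsi_to_rf(delta_tsi)
print(round(rf, 3))
```

This simple conversion gives roughly –0.054 W m–2, which sits within the –0.04 ± 0.02 W m–2 range quoted above; the exact central value depends on the averaging choices in the underlying analysis.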

This is typical of what we see in the entire GW endeavor. Any data point or slight bias that can be added to a calculation to help fit the conclusion of CO2 dominance is jumped on.

See Usoskin 2010 for the conclusion about the Solar Grand Maximum (and superior detailed analysis), and Abdusamatov for the conclusion that temperatures have been controlled primarily by solar variation despite the verifiable GHG theory.

“Between 1986 and 2011, an interval that includes a substantial portion of the SC variation, a positive RF of 0.01 ± 0.005 W m–2 is calculated.” This is absurd, since the RF is actually reduced significantly from the decline of SC23 on, say from about 2003 (not 1986) through the current minuscule SC24.

The Modern Maximum refers to the ongoing period of relatively high solar activity that began circa 1900.[1] This period is a natural example of solar variation, and one of many that are known from proxy records of past solar variability. The Modern Maximum reached a double peak once in the 1950s and again during the 1990s.


And the sun has been declining in strength since the 1990s, yet the earth continues to warm.

If you care to look at the link below, you’ll notice that they have summed the components’ changes in forcing since the beginning of the industrial revolution. The solar component is taken into account with a minor increase overall, and yet the last 3 solar cycles show a minor decrease. This is what helps to conclude that the sun is not the major driver of the warming of our climate.
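The bookkeeping being described, summing each component's forcing since pre-industrial times, can be sketched as follows. The numbers here are illustrative values roughly in line with IPCC central estimates, not figures taken from the linked page:

```python
# Illustrative forcing components since ~1750, in W/m^2.
# Approximate values only, roughly in line with IPCC central
# estimates; the linked assessment sums many more components.
forcings = {
    "CO2": 1.68,
    "other well-mixed GHGs": 0.97,
    "aerosols (direct + cloud)": -0.9,
    "solar": 0.05,
}

total = sum(forcings.values())
# The solar term is a small sliver of the total, which is the
# point of the argument: the Sun is not the main driver.
print(round(total, 2), round(forcings["solar"] / total, 3))
```

Even with generous uncertainty on each term, the solar contribution stays a few percent of the greenhouse-gas terms in such a tally.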

If the sun were the major driver of our climate, we would be getting stratospheric warming rather than our observed stratospheric cooling.