If You Can’t Explain It, You Can’t Model It

Source: Center for Multiscale Modeling of Atmospheric Processes

Guest Post by Steven Goddard

Global Climate Models (GCMs) are very complex computer models containing millions of lines of code, which attempt to model the cosmic, atmospheric and oceanic processes that affect the earth’s climate. These have been built over the last few decades by groups of very bright scientists, including many of the top climate scientists in the world.

During the 1980s and 1990s, the earth warmed at a faster rate than it had earlier in the century. This led some climate scientists to develop a high degree of confidence in models which predicted accelerated warming, as reflected in IPCC reports. However, during the last decade the accelerated warming trend has slowed or reversed. Many climate scientists have acknowledged this and explained it as “natural variability” or “natural variations.” Some believe that the pause in warming may last as long as 30 years, as recently reported by The Discovery Channel.

But just what’s causing the cooling is a mystery. Sinking water currents in the north Atlantic Ocean could be sucking heat down into the depths. Or an overabundance of tropical clouds may be reflecting more of the sun’s energy than usual back out into space.

“It is possible that a fraction of the most recent rapid warming since the 1970’s was due to a free variation in climate,” Isaac Held of the National Oceanic and Atmospheric Administration in Princeton, New Jersey wrote in an email to Discovery News. “Suggesting that the warming might possibly slow down or even stagnate for a few years before rapid warming commences again.”

Swanson thinks the trend could continue for up to 30 years. But he warned that it’s just a hiccup, and that humans’ penchant for spewing greenhouse gases will certainly come back to haunt us.

What has become obvious is that there are strong physical processes (natural variations) which are not yet understood, and are not yet adequately accounted for in the GCMs. The models did not predict the current cooling. There has been lots of speculation about what is causing the present pattern – changes in solar activity, changes in ocean circulation, etc. But whatever it is, it is not adequately factored into any GCMs.

One of the most fundamental rules of computer modeling is that if you don’t understand a process and can’t explain it, you can’t model it. A computer model is a mathematical description of a physical process, written in a human-readable programming language which a compiler translates into machine code. If you cannot describe a process in English (or your native tongue), you certainly cannot describe it mathematically in Fortran. The Holy Grail of climate models would be the following function, which of course does not exist.

FUNCTION FREEVARIATION(ALLOTHERFACTORS)
C Calculate the sum of all other natural factors influencing the temperature
…..
RETURN
END

Current measured long-term warming rates range from 1.2 to 1.6 C/century. Some climatologists claim 6+ C for the remainder of the century, based on climate models. One might think that these estimates are suspect, due to the empirically observed limitations of the current GCMs.
As one small example, during the past winter NOAA’s Climate Prediction Center (CPC) forecast above-normal temperatures for the upper Midwest. Instead, temperatures were well below normal.

And Julia Slingo from Reading University (Now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

……

One trouble is that as some climate uncertainties are resolved, new uncertainties are uncovered.

Some modellers are now warning that feedback mechanisms in the natural environment which either accelerate or mitigate warming may be even more difficult to predict than previously assumed.

Research suggests the feedbacks may be very different on different timescales and in response to different drivers of climate change

…….

“If we ask models the questions they are capable of answering, they answer them reliably,” counters Professor Jim Kinter from the Center for Ocean-Land-Atmosphere Studies near Washington DC, who is attending the Reading meeting.

“If we ask the questions they’re not capable of answering, we get unreliable answers.”

I am not denigrating the outstanding work of the climate modelers – rather I am pointing out why GCMs may not be quite ready yet for forecasting temperatures 100 years out, and that politicians and the press should not attempt to make unsupportable claims of Armageddon based on them. I would appreciate it if readers would keep this in mind when commenting on the work of scientists, who for the most part are highly competent and ethical people, as is evident from this UK Met Office press release.

Stop misleading climate claims

11 February 2009
Dr Vicky Pope

Dr Vicky Pope, Met Office Head of Climate Change, calls on scientists and the media to ‘rein in’ some of their assertions about climate change.

She says: “News headlines vie for attention and it is easy for scientists to grab this attention by linking climate change to the latest extreme weather event or apocalyptic prediction. But in doing so, the public perception of climate change can be distorted. The reality is that extreme events arise when natural variations in the weather and climate combine with long-term climate change. This message is more difficult to get heard. Scientists and journalists need to find ways to help to make this clear without the wider audience switching off.

216 thoughts on “If You Can’t Explain It, You Can’t Model It”

Change scenarios: did any of the Governments or top financial institutions predict the global financial problems before they hit them? No.
The financial system should be a much easier system to model than climate, and there would be loads of money and brainpower swilling around the trough of banks and big finance to make such predictions if it were possible.
But no; with all these highly paid financial whizz kids and all the money put in, no one said “Houston, we have a problem.”
Why would climate modellers be any better?

Sadly Vicky Pope contradicted herself within a matter of a few weeks. Just last week she said that 85% of the Amazon was going to disappear because of even modest climate change. This alarmist information came to her by way of a computer model.

Computer models cannot even predict protein folding, where everything is understood. I’m no expert, but I don’t understand how models can claim to predict climate when the simplest processes are not well understood. For example, I read in an oceanography text that evaporation from the open ocean cannot be accurately modelled, because the effects of wind and waves are too difficult to deal with. How can anyone make a thousand guesses and come up with something meaningful?

But on what basis can they predict that 85% of the Amazon will disappear, when most of the deforestation has been from logging, and nobody can tell whether the feedback effects of warming will result in drought or increased humidity? It seems they approached the subject with the expectation that a positive feedback effect would result in drought, without even considering that it might be the other way around. We could tell a model that warming will result in increased precipitation, and thus rainfall. We could also tell the model that the Amazon river will overflow because of rising sea levels and thus cause floods. Or we could get a model to give us a less alarming result.

When you bundle the output of 15 models, no matter what happens, “the models predicted it.” Sometimes, I get tired of hearing that. Even I find a nickel on the pavement from time to time. (/mild mini-rant OFF)

I’m curious why the poll is present. Sure, it’s a bit of fun, but it is entirely useless. For one, it doesn’t have an answer I would have chosen. The closest is
“No, they are not adequate at all.” The problem is that they’ve never been tested for conformance to reality except in hindsight, so their adequacy is indeterminate. The same can be said about their uncertainty. If they truly are identical models run from alternate starting points, then the output indicates they are no better than a random generator, or than predicting that the temperature next year will be the same as this year, +/- 1C.

I suspect it is somewhat easier to model the impact of a pre-defined temperature change on a particular ecosystem, than it is to model what the temperature change is likely to be over time.

Both are impossible, intractable problems. Regarding the first: “ecosystems” are undefined conceptual entities, i.e. they don’t actually exist in the real world. No computer model of any real world area has been created that accurately predicts “the impact of temperature change.” Model predictions fail at population change, growth-and-yield, presence-absence, and every other characteristic of real world vegetation and animals.

You think climate models are bad? You should see the junk that comes from ecosystem modeling. Even the most simplified, focused, much studied problem of single species tree growth at the stand level (the basic question of tree farm management) has not produced accurate models. Imagine how inaccurate are those models that attempt to predict changes in multiple species.

Please do not assume that environmental modeling efforts in scientific disciplines other than climate are any more valid than CCM’s. They are not. We call them GAP models, for “guess all parameters.”

The Amazon models are pure speculation without foundation in the real world. Absolute GIGO.

In a prior life I created a computer model of the human visual system as a commercial product. When tuning the engine I tweaked the sensitivities and relationships of visual primitives to predict the class of objects in the field of view. I would not make a tuning move that wasn’t understood on the basis of first principles. The further I strayed from first principles on tuning any data set, the worse the performance on other data sets.

The need to understand the process in order to model it is true only where the model is expected to work on multiple data sets.

If you will tolerate a little sarcasm:
The less you know about something the easier it is to model. Just use a neural net and the model can get very good results on the training data set. Even without a neural net it takes very little skill to tweak sensitivities and relationships to obtain good correlations with the training data set.
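The point about training-set fits is easy to demonstrate. Below is a minimal sketch (toy data of my own, with a polynomial interpolant standing in for the neural net): a model with enough free parameters reproduces every training point exactly, then fails spectacularly the moment it is asked to extrapolate.

```python
# Overfitting demo: a degree-6 polynomial fit exactly through 7 noisy
# training points has zero training error, but extrapolates wildly.
# (Hypothetical toy data: the "true" process is simply y = x.)

def lagrange(xs, ys, x):
    """Evaluate the unique interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

xs = list(range(7))                      # training inputs 0..6
noise = [0.1 if i % 2 == 0 else -0.1 for i in xs]
ys = [x + n for x, n in zip(xs, noise)]  # true process y = x, plus tiny noise

# Perfect "hindcast": the model reproduces every training point exactly.
train_err = max(abs(lagrange(xs, ys, x) - y) for x, y in zip(xs, ys))
print(f"max training error: {train_err:.2e}")

# Terrible "forecast": extrapolating just two steps beyond the data.
pred = lagrange(xs, ys, 8)
print(f"prediction at x=8: {pred:.1f} (truth is about 8)")
```

The seven-parameter fit has zero hindcast error, yet its two-steps-ahead forecast lands near 85 when the truth is about 8.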

The GCMs developed by the IPCC were tuned to model the climate with the data that was current at the time. And sure enough, the models could predict the past. Since the IPCC models were tweaked to show that global temperature rises with an increase in CO2, the models have failed (been disproved) in the years since, with falling temperatures and rising CO2.

And Julia Slingo from Reading University (Now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

“We’ve reached the end of the road of being able to improve models significantly so we can provide the sort of information that policymakers and business require,” she told BBC News.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

Hilarious statements like that bring mathematical physics into disrepute.

In an interesting chapter entitled “Engineers’ Dreams” from his book “Infinite in All Directions”, Freeman Dyson explains the reasons for the failure of Von Neumann and his team to predict and control hurricanes.

Von Neumann’s dream

“As soon as we have good enough computers we will be able to divide the phenomena of meteorology cleanly into two categories, the stable and the unstable. The unstable phenomena are those which are upset by small disturbances, and the stable phenomena are those that are resilient to small disturbances. All processes that are stable we will predict, all processes that are unstable we will control.”

Freeman Dyson page 183.

What went wrong? Why was Von Neumann’s dream such a total failure? The dream was based on a fundamental misunderstanding of the nature of fluid motions. It is not true that we can divide fluid motions cleanly into those that are predictable and those that are controllable. Nature, as usual, is more imaginative than we are. There is a large class of classical dynamical systems, including non-linear electrical circuits as well as fluids, which easily fall into a mode of behavior described by the word “chaotic”. A chaotic motion is generally neither predictable nor controllable. It is unpredictable because a small disturbance will produce exponentially growing perturbations of the motion. It is uncontrollable because small disturbances lead only to other chaotic motions, not to any stable and predictable alternative.
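Dyson’s point about exponentially growing perturbations is easy to reproduce numerically. Here is a minimal sketch using the classic Lorenz (1963) system, which is not a climate model but is the textbook example of chaotic motion; the parameter values are the standard ones.

```python
# Sensitivity to initial conditions in the Lorenz (1963) system:
# two trajectories that start 1e-8 apart end up macroscopically different.

def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)   # same start, nudged by one part in 1e8
dt = 0.0005
max_sep = 0.0
for _ in range(60000):       # integrate 30 time units
    a = lorenz_step(a, dt)
    b = lorenz_step(b, dt)
    sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    max_sep = max(max_sep, sep)

print(f"initial separation: 1e-08, max separation: {max_sep:.3g}")
```

An initial difference of one part in a hundred million grows by many orders of magnitude: the two runs become as different as any two unrelated states of the system.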

Or as Vladimir Arnold said (address to the Field Institute)

For example, we deduce the formulas for the Riemannian curvature of a group endowed with an invariant Riemannian metric. Applying these formulas to the case of the infinite-dimensional manifold whose geodesics are motions of the ideal fluid, we find that the curvature is negative in many directions. Negativeness of the curvature implies instability of motion along the geodesics (which is well known in Riemannian geometry of infinite-dimensional manifolds). In the context of the (infinite-dimensional) case of the diffeomorphism group, we conclude that the ideal flow is unstable (in the sense that a small variation of the initial data implies large changes of the particle positions at a later time). Moreover, the curvature formulas allow one to estimate the increment of the exponential deviation of fluid particles with close initial positions and hence to predict the time period when the motion of fluid masses becomes essentially unpredictable.

For instance, in the simplest and utmost idealized model of the earth’s atmosphere (regarded as a two-dimensional ideal fluid on a torus surface), the deviations grow by a factor of 10^5 in 2 months. This circumstance ensures that a dynamical weather forecast for such a period is practically impossible (however powerful the computers and however dense the grid of data used for this purpose).
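Arnold’s figure implies a very short error-growth timescale. The arithmetic below is mine, not Arnold’s: if errors are amplified 10^5-fold in 60 days, the e-folding time is about 5 days, which is why a faster computer buys almost no extra forecast horizon.

```python
import math

# Back-of-envelope reading of Arnold's figure: if errors grow by 1e5
# over 2 months, how quickly do they e-fold and double?
growth, days = 1e5, 60.0
e_fold = days / math.log(growth)   # e-folding time of the error
double = e_fold * math.log(2)      # error-doubling time
print(f"e-folding time = {e_fold:.1f} days, doubling time = {double:.1f} days")
```

With errors doubling roughly every three and a half days, each extra day of useful forecast costs exponentially better initial data, not linearly more computing.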

Fortunately, we can understand why mathematics itself is a limiting factor for the Met Office.

If the underlying principles are not well understood, the model can only spit out guesswork. The effort, though herculean, is rather misdirected. Focus on doing the work that connects and follows all the paths. They did quite the job on DNA. The same amount of work needs to be done on climate before the models can be used for practical purposes.
We have had quite enough of hitting the panic button both ways: Ice Age Coming and Global Overheating.
Enough of this already.
Get back to work.

You can purchase stock prediction models that use a sixth or seventh order back-fitted equation, which is blindingly accurate looking backwards. Unfortunately they are no better than a random number generator looking forward.

“With four parameters I can fit an elephant, and with five I can make him wiggle his trunk”
– John Von Neumann

The IPCC model is a very expensive video game. Something the kids would play on Wii or Nintendo. We can ill afford for grown men to be wasting time playing with such things. The CO2 lag was already known from the Ice Cores.
Get back to work.

Would anyone agree with me that it is absurd to claim (or believe) the science is settled, when there exists multiple climate models (GCMs), each from a learned scientist or group, but none of which agree with the others?

The very fact that there are conflicting models is proof positive the science is not “settled.” This fact is quite useful to attorneys in a trial; we simply bring in an expert witness with a different, but equally credible, opinion. In complex trials, there may be multiple expert witnesses, each with a view that conflicts with the others. At no point will the judge accept any single view as “settled science.”

In the legal realm, a judge may take “judicial notice” of some facts, and does not require presentation of evidence for proof of them. In California (the other states use a similar definition), judicial notice is granted for “Facts and propositions that are not reasonably subject to dispute and are capable of immediate and accurate determination by resort to sources of reasonably indisputable accuracy.” (source: California Rules of Evidence 452(h) )

In the scientific realm, a model of gravity, the Coriolis effect, and air resistance can be used to predict quite accurately the landing spot of a ballistic shell fired from a cannon, given starting conditions such as projectile mass and volume, muzzle velocity, angle of aim, and drag coefficient. THAT is settled science.
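For contrast with the climate case, that kind of settled model really does fit in a few lines. A minimal sketch (made-up muzzle velocity and drag constant; Coriolis omitted for brevity): integrate gravity plus quadratic drag and read off the range, checking the vacuum case against the closed form v0^2 sin(2θ)/g.

```python
import math

# Minimal ballistics sketch: gravity plus quadratic air drag, stepped
# forward in time. All parameter values here are made up for illustration.
def landing_range(v0, angle_deg, k=0.0, g=9.81, dt=1e-4):
    """Horizontal distance travelled before returning to launch height.
    k is a drag constant (drag acceleration = -k * |v| * v); k=0 is vacuum."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while True:
        speed = math.hypot(vx, vy)
        vx += dt * (-k * speed * vx)
        vy += dt * (-g - k * speed * vy)
        x += dt * vx
        y += dt * vy
        if y <= 0.0 and vy < 0.0:
            return x

# Vacuum case: should agree closely with the closed form v0^2 sin(2a)/g.
r = landing_range(100.0, 45.0)
exact = 100.0 ** 2 * math.sin(math.radians(90.0)) / 9.81
print(f"simulated: {r:.1f} m, closed form: {exact:.1f} m")
print(f"with drag: {landing_range(100.0, 45.0, k=0.001):.1f} m")
```

The simulated vacuum range matches the textbook formula to well under a metre, and adding drag shortens the range, exactly as the settled physics says it must.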

I’ve often wondered who actually thought they could model climate in the first place. The best I can figure is that someone without a clue about computer models, but with a good deal of salesmanship, sold the idea somewhere. The rest, unfortunately, is history.

We see the AGW believers admit the models are worthless every time they open their mouths and say the climate is getting “worse” than predicted. Of course, most of them don’t realize they are damning the very work that is being used to support their cause.

This whole article is a very very good analysis of GIGO. Garbage in garbage out. Well meaning idiots relying on inadequate computer models have destroyed the global economy and are hell-bent on preventing any recovery due to planned excessive carbon taxes.

I used to think GCMs were Global Climate Models; but then I kept reading papers from people who presumably know better; and they were calling them Global Circulation Models; NOT Climate models.

Well on planet earth it seems that at reasonable (non-geological) time scales, we have basically Ocean Waters, Atmosphere, and perhaps energy that are capable of Circulating. At longer geological time scales, the land itself is circulating; so let’s not go there.

Well it seems to me that in order to get circulation of either water, or atmosphere or energy, you MUST have differences in Temperature both in time and in space.

But climate, we are told, is by definition the long-term average of weather. Therefore it is ideally a single number, not a graph; a graph implies changes over time or space, which would be weather rather than climate.

So climate and circulation seem to be mutually incompatible concepts. You can’t get circulations or model them, without having temperature (and other) differences in time and space, and that means averaging is verboten, because it eliminates differences.

A model that depicts the earth as an isothermal body at about 15 deg C, radiating 390 Watts per square meter from every point on it 24 hours a day, doesn’t have any temperature differences in time or space to use with the laws of physics to compute circulations.

Yet an untoward focus of climate science, and certainly of climate politics, rests on what happens to a single number that gets updated each year, or maybe monthly, or maybe every five years; namely the GISTEMP anomaly of Dr James Hansen at NASA.

Do any of these vaunted GCMs, whatever they are, predict both the Mediaeval Warm Period and the Little Ice Age; the Maunder and Dalton minima; or any other widely recognized instance of climate change that presumably had a natural cause?

As for the good Met Office lady’s request for more powerful computers (and any other research (taxpayer) dollar expenditures): nonsense. Those more powerful computers will simply spit out garbage at a much faster clip; but it will still be garbage, because the models are clearly NOT adequate to explain what is happening. There isn’t even scientific agreement on what the key physical processes are, let alone any experimental verification of the extent to which any such processes are operating.

The only climate or circulation model that works, and it works very well, is the one being run on the data by planet earth itself, with its oceans and atmosphere and the physical geography of its land masses; not to mention all the biological processes going on at the same time. And then there is the sun, of course, which is credited with anything from causing everything to having nothing to do with earth’s climate.

When the GCMs make even a token effort to emulate the model that planet earth is running, they might start to get some meaningful results.

But I can assure you that faster computers are neither the problem nor the solution. Computers can be made to do the stupidest things at lightning speed; the trick is to have them not doing stupid things in the first place.

The old astronomy professor was addressing his brand new freshman astronomy class for their first lecture.

“When I was an undergraduate like you all,” he said, “we could write down the universal equations of motion on the back of a postage stamp; but it took a battery of students six months working with log tables to solve those equations and determine the motions of the universe.”

“Now that has all changed. We have modern high-speed computers that can solve those dynamical equations in microseconds; but it now takes a battery of students six months to program those computers with meaningful code to even start working on those problems of the universe.”

I know a little about computer modelling. I spend hours each day modelling optical systems on a quite powerful desktop computer (eight Intel processors). There are a number of other people in the company who have the same lens design software, and perhaps not quite such powerful computers, and who presumably could do the same things I do; but they can’t, and they don’t.

You see, they are all mechanical packaging engineers who have been told that optics is just packaging with transparent materials. They can’t do what I do because they don’t know a darn thing about the physics of optical systems. What’s more, they don’t know that that is why they can’t do what I do; it isn’t my eight-core computer at all.

What has always just floored me about the debates over models and global warming has been the arrogance of some people (usually politicians and not scientists, but some scientists too) who evidently believe that we have a good enough understanding of the Earth’s climate to be able to design a computer program that can predict what temperature it will be in 2100 (or beyond!). We can’t predict the weather accurately tomorrow, yet we know what will be in 2100. We have zero evidence that any climate model can accurately predict the weather next year, but they surely can predict 2100. Breathtaking arrogance.

Natural systems are amazingly complex things. They aren’t easily understood. There are so many factors influencing global climate (if there is such a thing), and we don’t understand them all…not to mention there are almost certainly factors that we don’t even know exist that influence it, and new ones are discovered from time to time.

It’s time not to shut up about varying beliefs and opinions, but to display a little humility. And to stop politicians from hijacking this issue in an attempt to push a political agenda that has origins far from climate.

Quote:
“The study, which has been submitted to the journal Nature Geoscience, used computer models to investigate how the Amazon would respond to future temperature rises.”
—————————–

Did the models account for the carbon used by lots and lots of petrol-driven chainsaws? Or the action of their teeth? It is not the (mostly) natural and modest level of climate change that is harming the Amazonian rain forests. It is the action of thousands and thousands of loggers cutting the trees down, leaving huge gaps in the canopy that do not trap evaporating water, so that the ground dries out and the rivers feeding the mighty Amazon also dry up.

Mankind’s activities are indeed damaging the Amazon rain-forest, but it is NOT our production of CO2 that is to blame.

Your poll is based on a false premise. There is no single GCM so there is no single prediction. A large number of modelled outcomes with no logical basis to prefer one over another is no prediction at all. If I claimed to be a psychic and you asked me to divine which card you will choose from a deck, you wouldn’t be impressed by my psychic powers if I rattled off twenty or thirty possibilities as my answer even if one of them turned out to be right.
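The psychic analogy can be quantified: naming k distinct cards from a 52-card deck “predicts” the single drawn card with probability k/52, so twenty or thirty guesses buys close to a coin-flip hit rate with no powers at all.

```python
# The commenter's psychic analogy, quantified: name k distinct cards out
# of 52 and you "predict" the draw with probability k/52.
from fractions import Fraction

for k in (1, 20, 30):
    p = Fraction(k, 52)
    print(f"{k} guesses -> hit probability {float(p):.1%}")
```

Rattling off thirty possibilities already makes a “correct prediction” more likely than not, which is exactly why a hit from a spread of model outcomes proves nothing.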

The thing is, we have been through this before, in biochemistry. In the late 60’s and early 70’s, people believed that they could model non-equilibrium thermodynamic processes using constants derived from equilibrium thermodynamics. They couldn’t.
Kacser, H. & Burns, J.A. (1973) Symp. Soc. Exp. Biol. 27, 65-104, therefore came up with metabolic control theory, now called metabolic control analysis. This is the way forward: use steady-state models and recognize that you are dealing with a non-equilibrium system.
It is not hard to realize that these models are fundamentally, thermodynamically, flawed.

Let me check if I understand this correctly. Those “leading climate scientists” produce climate models whose predictions range anywhere between the oceans freezing and boiling. As the general public is eager to know which it is going to be, more funding is just what is needed to solve the problem?

I find it interesting that people believe they can model our entire climate system when in fact they cannot even model hurricane activity with any reliable precision. Take hurricane Katrina, for example (the AGW poster child): models were not (and are not) able to accurately predict landfall location or intensity any further than 24-48 hrs out. We are talking about a tiny fraction of our climate system here, and one that is observable over a series of weeks; yet if you look at hurricane path predictions, they are quite vague, results vary greatly between models (which is why several are averaged), and those predictions are updated and changed frequently. Hurricane models have been around for quite a long time and have probably improved more than almost any other type of chaotic-system model. They are getting pretty good, in relative terms; however, even those models are far from what I would call “accurate”.

This all raises the question: if models cannot predict hurricane paths and intensity with any discernible accuracy beyond 48 hrs, why would people have so much faith in the predictions of models that are much more complex, and that try to predict global temperatures to the tenth of a degree, precipitation in every region around the world, sea levels around the world, polar ice coverage, glacial ice content around the world, and more, and make those predictions for 50, 100, 200, or 1000 years out?

I would question the mere sanity of any individual that would assert such a notion.

If burning all the fossil fuels currently known to exist fails to make the ocean acidic (it would actually end up slightly alkaline, assuming sea critters and plants did nothing to sequester any of it) what possible purpose is gained by such a demonstration?

Global climate models are extraordinarily useful tools, and among other things have successfully predicted the rise in temperature as greenhouse gases increased, the cooling of the stratosphere even as the troposphere warmed, the increase in the height of the tropopause, polar amplification due to the ice-albedo and other effects, the greater increase in nighttime than in daytime temperatures, and the magnitude and duration of the cooling and of the water vapour feedbacks caused by the eruption of Mount Pinatubo.

One is tempted to ask what is meant by ‘the current cooling’, given that the 18-month, 10-year, 20-year and 30-year trends are all positive. Is it the modest cooling trend that has occurred over the last 8 years or so? If so, it is not true, nor particularly relevant, to say that no model predicted this. IPCC projections are based on the aggregated results of many model runs, averaged out. Some of these models did predict the cooling trend, while most did not; however, over such a short period the ‘noise’ is large compared to the long-term signal. Here James Annan shows that since 2001, 95% of the model predictions are within the uncertainty band of the observations. In normal statistical usage that means that the two are consistent. Lucia agrees.

Using all the above, I find that the best estimate of the underlying trend in GMST based on the average of 38 runs (2.22 C/century) is not inconsistent to a significance level of 95% with the observed trend of -0.59 C/century.

Climate is by definition a long term phenomenon and we can see the longer term model-based projections made by the IPCC here. The midrange scenario A2 projects a temperature rise from 1990-2010 of 0.35C, equivalent to a linear increase of 0.175C per decade. How are they doing? Well the trends in the 4 main indices since 1990 are:

UAH 0.168 C/decade
Hadley 0.171 C/decade
GISS 0.183 C/decade
RSS 0.185 C/decade

(That GISS is such an outlier huh? ;-). Do these observations increase or decrease our confidence in the models?
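For readers wondering where per-decade trend numbers like those above come from, they are ordinary least-squares slopes scaled to C/decade. A sketch with a made-up anomaly series (deliberately an exact ramp, not any of the actual indices):

```python
# Ordinary least-squares trend of annual anomalies, scaled to C/decade.
# The input series below is a made-up stand-in, not real index data.

def trend_per_decade(years, anomalies):
    n = len(years)
    my = sum(years) / n
    ma = sum(anomalies) / n
    slope = (sum((y - my) * (a - ma) for y, a in zip(years, anomalies))
             / sum((y - my) ** 2 for y in years))   # C per year
    return slope * 10.0                             # C per decade

years = list(range(1990, 2009))
toy = [0.018 * (y - 1990) for y in years]           # exact 0.18 C/decade ramp
print(f"trend: {trend_per_decade(years, toy):.3f} C/decade")
```

Real anomaly series have noise around the ramp, which is why short-window trends scatter so much around the underlying slope.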

And Julia Slingo from Reading University (Now the UK Met Office’s Chief Scientist) admitted it would not get much better until they had supercomputers 1,000 times more powerful than at present.

…

Hilarious statements like that bring mathematical physics into disrepute.

There are physical reasons beyond the issues of dealing with a chaotic system, which I assume is the reason for your mirth. Current models use a grid that’s too coarse to model thunderstorms. Using a 10X finer resolution would be a big help, but that means a thousand-fold increase in the number of data points if all three axes were refined. More if you increase the time resolution. (I suspect you could make a decent case for 5X the vertical resolution and 2X the time resolution.) Poor resolution also makes modeling hurricanes atrocious, and those move a huge amount of heat and water poleward.
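The cost arithmetic in that paragraph is just a product of the refinement factors along each axis. Using the numbers suggested above (10X in each horizontal direction, 5X vertical, 2X in time):

```python
# Grid-refinement cost: the multiplier is the product of the refinement
# factors along each axis, plus the time step.
horizontal = 10      # finer in each of the two horizontal directions
vertical = 5         # suggested vertical refinement
time_step = 2        # suggested time-step refinement
cost = horizontal * horizontal * vertical * time_step
print(f"cost multiplier: {cost}x")
```

Even this more modest refinement mix lands right at a thousand-fold increase in work, which is the scale of supercomputer the Met Office says it lacks.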

I don’t see the chaotic effects as being significant – we’re not forecasting the weather, we’re forecasting the boundary limits and chaotic attractors, i.e. climate.

Dr Vicky Pope also contradicted herself in a Met Office press release for the recent Copenhagen conference. The first sentence reads:
“If greenhouse gas emissions continue to rise at current rates, then global temperatures could rise by more than 6C over this century”
The last sentence is slightly modified:
“Dr Pope warns: If the world fails to make the required reductions, it will be faced with adapting not just to a 2C rise in temperature but to 4C or more by the end of the century”
To me this is the very same alarmist propaganda which she earlier criticised.

I have to say that a model using neural nets, as Tsonis et al. have done, if used judiciously so that the differential equations that enter the problem are also simulated, would be a better direction to go in creating new models.

I have been asking whether an analogue computer could not be devised for climate, where the coupled differential equations are entered as electrical elements; it might be more successful and economical in describing the time evolution of climate.

On the subject of processing power, many of Hansen’s predictions were based on very early models running on computers that were woefully underpowered compared to today’s. Yet years later he is still standing by the output of that crude technology.

I think you’re overblowing it with the “highly complex” and “millions of lines of code” stuff. As you should know, whenever there is an overly massive amount of code it usually means that most of it is unnecessary bloat from being badly written. The fundamental equations and computations used in these models can be written in very few lines of code. We don’t know so much about Hadley (except that it’s in Fortran too), but GISS is ’70s Fortran, largely written by students, poorly documented, with massive amounts of redundancy and duplication, some obviously unchecked coding errors and very little in the way of validation. It is very far from being commercial-quality code, or even good academic code (of which there is a lot). Modern multiphysics or fluid dynamics software, and even those ultimately crappy financial models, are all a good deal more complex/sophisticated and written by far better programmers. In fact I’ll guarantee that the coding for the movie Shrek was about 100 times more mathematically complex. As for modeling the atmosphere with 3-deep elements, each one the size of Florida – words fail me.

Perhaps our alleged Prime Minister could arrange to meet him and ask him to take the DVDs Mr. Obama loaned him on his recent trip to Washington back to the White House video library. Brown must have had time by now to watch all of them. I assume Barry’s gift was just a holding gesture whilst he arranged to make Hansen available.

Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?

“Sadly Vicky Pope contradicted herself within a matter of a few weeks. Just last week she said that 85% of the Amazon was going to disappear because of even modest climate change. This alarmist information came to her by way of a computer model.

The study, which has been submitted to the journal Nature Geoscience, used computer models to investigate how the Amazon would respond to future temperature rises.”

1) The unpublished paper is yet to pass “peer review”, as I understand (Revkin, NYT)

2) The ‘model’ (without having to even read it) will encompass a number of mathematical fictions, e.g. Kolmogorov 1956

3) The “model” will violate the second law, e.g. Brillouin, L., 1956; Morowitz 1989

4) There is NO reducible algorithm (the complexity described cannot be described more concisely than by writing down all the observed phenomena), e.g. Monod 1972, i.e. 10^27 DEGREES OF FREEDOM.

5) Point 4 brings to the front the Turing-Kolmogorov-Chaitin complexity problem

If all the Amazon rain forest is cleared it will get hotter, because the sun is then shining on exposed soil. Maybe CO2 might do something as well, but the sun will heat the ground with immediate effect. In climate models forest is given the same albedo rating as cities. I do not believe this is correct. I used to go gliding and became reasonably proficient at staying up by using thermals, in one case for over 6 hours. The strongest thermals were always from buildings, roads, car parks etc. The next source was open farmland. I never experienced any thermals from forest, although I understand more experienced pilots can get them.
What concerns me is that we will focus on fixing CO2 and at the same time continue all the other land clearing activities, including draining, which will definitely make it hotter. We view all these activities as local and therefore dismissible. When such clearances can be measured in size by using countries for scale, the effect ceases to be local.
The arguments over GW have become partisan and are not helping anyone. Is this how it was on Easter Island? Maybe they knew something was not right, but they cleared all the trees anyway.

“as Vladimir Arnold said (address to the Fields Institute)
For example, we deduce the formulas for the Riemannian curvature of a group endowed with an invariant Riemannian metric. Applying these formulas to the case of the infinite-dimensional manifold whose geodesics are motions of the ideal fluid, we find that the curvature is negative in many directions. Negativeness of the curvature implies instability of motion along the geodesics (which is well-known in Riemannian geometry of infinite-dimensional manifolds). In the context of the (infinite-dimensional) case of the diffeomorphism group, we conclude that the ideal flow is unstable (in the sense that a small variation of the initial data implies large changes of the particle positions at a later time).”
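Arnold’s point compresses into the Jacobi (geodesic deviation) equation. In the simplified constant-curvature form (a textbook idealization, not Arnold’s full infinite-dimensional computation), a perturbation $\xi$ of a geodesic with sectional curvature $K$ obeys

```latex
\ddot{\xi} + K\,\xi = 0, \qquad
K < 0 \;\Rightarrow\; \xi(t) \sim \xi(0)\,e^{\sqrt{-K}\,t},
```

so negative curvature means exponential divergence of nearby flows – exactly the instability of the ideal fluid he describes.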

I’ll be able to sleep well tonight now I know that. Then tomorrow I can set about learning Swahili so that I can translate it. Or maybe it’s the world’s longest anagram. Either way, a fun day beckons.

Incidentally, Mr Sowell (at 11.37.18) mentioned judicial notice. I was once involved in a trial where one of my opponents was trying to persuade the judge that he could not properly take judicial notice of a particular matter. In the course of his argument he was endeavouring to explain what “judicial notice” means and said “For example, Your Lordship would not be able to take judicial notice of the fact that Arsenal beat West Ham United at football yesterday.” To which the judge replied “I wouldn’t need to, I was at the match.”

I don’t see the chaotic effects as being significant – we’re not forecasting the weather, we’re forecasting the boundary limits and chaotic attractors, i.e. climate.

The divergence from reality (from unknown quantities of ‘natural variation’) at present being one level of uncertainty.

The level of mathematical skill from the “proprietors” of “state of the art gcm” being another as Dymnikov 2007 observes.

“It should be emphasized that most of the scientific community concerned with climate models are generally not interested in mathematical problems and treat a model as a specific finite-dimensional construction with a specific description of physical processes, thus reducing the study of the model’s quality to a numerical experiment alone.” …

…It is necessary to prove some statements for the system of partial differential equations describing the climate system’s model.

I. The global theorem of solvability on an arbitrarily large time interval t.

Unfortunately, there is presently no such theorem in a spherical coordinate system with “correct” boundary conditions. This is not a consequence of the absence of such theorems for three-dimensional Navier–Stokes equations. The equations of the state-of-the-art climate models have a dimension of 2.5, because the hydrostatic equation is used instead of the full third equation of motion.
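For readers wondering where the “dimension of 2.5” comes from: in the hydrostatic primitive equations the full vertical momentum equation is replaced by a diagnostic balance, so vertical velocity is no longer a prognostic variable (standard textbook form):

```latex
\frac{Dw}{Dt} = -\frac{1}{\rho}\frac{\partial p}{\partial z} - g
\quad\longrightarrow\quad
\frac{\partial p}{\partial z} = -\rho g
```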

We don’t know so much about Hadley (except that it’s in Fortran too) but GISS is 70’s Fortran, largely written by students, poorly documented, with massive amounts of redundancy and duplication, some obviously unchecked coding errors and very little in the way of validation. It is very far from being commercial quality code, or even good academic code (of which there is a lot).

Indeed. I used to write Fortran 77 code and if we assume the skeleton function in the article is supposed to be F77, I can identify at least 4-5 formal issues with it (and I am ignoring the “….” ellipses), even though it does nothing. One issue is fatal, it isn’t going to compile. Writing commercial quality code is harder than most people think :-)

Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?

That’s nothing. Ed Miliband said young British activists should take Direct Action to other countries to bully them against using their resources. Talk about imperialism. I am a very patriotic Brit and this kind of thing is completely out of line with post-imperial British values.

Which model predicted that?
I would be more inclined to rely upon the natural world around me, like the behavior of animals & plants, rather than on a computer model.
The flora & fauna are properly programmed; the computer models are at best guesswork based on incomplete information. They are in their infancy.

Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?

Maybe because he didn’t do so hot over here leading charges to shut down the power plant for the US Capitol, i.e. he didn’t cross the Delaware in winter.
I will agree with the above posts: He should be fired. He should have been fired a long time ago.

“Global climate models are extraordinarily useful tools, and among other things have successfully predicted the rise in temperature as greenhouse gases increased, the cooling of the stratosphere even as the troposphere warmed, the increase in the height of the tropopause, polar amplification due to the ice-albedo effect, the greater increase in nighttime than in daytime temperatures, and the magnitude and duration of the cooling and the water vapour feedbacks caused by the eruption of Mount Pinatubo.”

I am always amused to read these kinds of claims, only to find that it’s just another climate hindcast. That is, you know the answer before you run your model. Nothing wrong with that, except it is NOT a prediction! It does allow you to tune your model to get perfect agreement with the past (imagine that)…

Those who want a more sober assessment of the predictive skill of modern climate models should read this:

If you remember the post about adjusting temperatures for the ENSO and AMO, the reconstruction model continues to work very well with the February Hadcrut3 numbers being out by only 0.014C and with GISS by 0.035C

The climate modelers need to adjust out the natural variation in the climate before they run their models/hindcasts (because it is true they were taking a free ride with the rise in ENSOs and the AMO from 1976 to 2006).
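A minimal sketch of that kind of adjustment, with synthetic stand-ins for the real ENSO and AMO indices (everything below is illustrative and hypothetical, not the reconstruction referenced above):

```python
import random

random.seed(0)
n = 360  # 30 years of monthly data -- entirely synthetic, for illustration
enso = [random.gauss(0, 1) for _ in range(n)]
amo  = [random.gauss(0, 1) for _ in range(n)]
trend = [0.015 / 12 * t for t in range(n)]              # ~0.15 C per decade
temp  = [trend[t] + 0.10 * enso[t] + 0.30 * amo[t] for t in range(n)]

def demean(xs):
    m = sum(xs) / len(xs)
    return [x - m for x in xs]

e, a, y = demean(enso), demean(amo), demean(temp)

# Two-predictor ordinary least squares via the normal equations (Cramer's rule).
sxx = sum(v * v for v in e); syy = sum(v * v for v in a)
sxy = sum(p * q for p, q in zip(e, a))
sxt = sum(p * q for p, q in zip(e, y))
syt = sum(p * q for p, q in zip(a, y))
det = sxx * syy - sxy * sxy
b_enso = (sxt * syy - syt * sxy) / det
b_amo  = (syt * sxx - sxt * sxy) / det

# The residual is the "adjusted" series in which the underlying trend remains.
residual = [temp[t] - b_enso * enso[t] - b_amo * amo[t] for t in range(n)]
print(round(b_enso, 2), round(b_amo, 2))   # close to the planted 0.10 and 0.30
```

In practice lagged versions of the real indices would be fitted the same way.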

Warming is only half what the models predict.

Here is the latest Hadcrut3 reconstruction.

Here is GISS Model E for comparison.

Here is what the residual warming left over shows (after adjusting out the impact of the ENSO, the AMO and the southern AMO which I added later) compared to the original global warming models. The modelers need to use this as their starting data and then they would be more accurate.

The satellite temperature trendlines are much lower than the Hadcrut3 reconstruction shows – between 0.8C to 1.2C per doubling.

Steven Goddard (13:53:32) :
John Philip,
You said that some of the models did predict the recent cooling.
What was the reason for the cooling, and what do those same correct models forecast for the remainder of the century?

You beat me to the question, Steven. The second and third questions then are, which output is correct and how do we choose?

I wanted to take a look at one of the references you gave (Lucia). Most of it is outside my area of expertise. But the author also said this.

“However, it is worth noting that at a significance level of 90%, the best estimate of the underlying trend in GMST based on the average of 38 runs is inconsistent with the observed trend. Using the language of the IPCC, the result of this test suggests we should have very low confidence that model projections based on this sort of multi-run average are correct.”

“So, what we are finding is that if I try hard enough, I can currently hunt around for methods of averaging that fail to falsify the models.”

This sounds inconclusive. But I am willing to try other references. (or be corrected).

Why IS the US government paying one of its employees (one way or another) to act as a focal point for troublemakers in another country? Is this a new direction for foreign policy?

Because nobody in this country has the gonads to do the right thing and fire the bastard!”

As a Brit I really resent the fact that a govt employee of a friendly power can be sent to its major ally to cause trouble and try to tip us into the dark ages through denying us much needed energy. Can’t you control the guy?

Not so long ago I began working on computer models using my old, but trusty Commodore PET with the aim of predicting exactly of what the moon is made.
As it turns out the models, LCM’s [Lunar Cheese Models] ran very well, but with some differences in output, depending on the milk type, temperature, salinity, and bacterial type used. Sometimes the output indicated the moon was made of Swiss cheese – which fit well with the observed cratering; possibly from earth sourced atmospheric bleed-over of CO2. Other times the model runs suggest a composition closer to double cream Brie, which, with minor solar irradiation would flow and form the observed maria. Other runs are consistent with a sharp crumbly English cheddar [aged at least two years] resembling the lunar mountains.
But, puzzlingly often, the LCM output indicated a new cheese with a very foul “runaway smell” index. To this runaway smell cheese I have tentatively assigned the folk taxon Moon cheese.
Now, after running these models many times, I am absolutely sure they yield the correct basic chemistry of the moon — in spite of real data returned from Tranquility Base through Taurus-Littrow.
No, before you ask, I will not make public the data input for these models or how I may have adjusted them to get the proper output. Nor will I answer questions about the model’s accuracy. Just trust me.
And now, given that my models are correct, I intend to lobby Congress and the executive branch to immediately stop all attempts to return to the moon as any large scale lunar commerce would represent a dire threat to earth based cheese makers. If successful in lobbying, I shall submit proposals for grants for further research in Moon cheese.
The LCMs also predict that Moon cheese will occur from time to time on the earth itself and probably in ever increasing amounts. We should, therefore, take immediate and strong action to prevent this from happening as the runaway smell could easily make the earth uninhabitable within the next few decades. Curbing the runaway smell output might well be enforced by requiring every citizen to buy “cut the cheese credits” [CTCC’s]. The obvious benefits of CTCC’s to all mankind are self-evident.
We have at most 5–7 years before reaching the tipping point for this phenomenon. Many readers may have doubts about the seriousness of the imminent runaway smell problem, but let me suggest that if you have ever been “mooned,” you probably recognize the smell whereof I write. So keep careful watch and do your best to avoid the Moon cheese! And watch where you step too.

Steve, I think you missed my point. I was making the observation that vinegar is at least 5 pH units more acidic than seawater. So it was a fatuous and misleading exercise for CCMAP to liken ocean acidification to this degree.

“Just a shame that in the UK’s Kings North power station vandalism trial only one view was on show. The prosecution wasn’t about to disturb the consensus and put up anyone to dispute Mr Hansen’s views.”

I agree, it would have been great to have a well-spoken, knowledgeable expert to provide balance.

“That’s nothing. Ed Miliband said young British activists should take Direct Action to other countries to bully them against using their resources. Talk about imperialism. I am a very patriotic Brit and this kind of thing is completely out of line with post-imperial British values.”

First thing Milliband’s got right.

Let them try it in China for instance, it would keep them out of OUR hair for a year or two ;-)

actually the models include short time spans showing sinking temperatures…but you knew that, of course?!?

I don’t see any period of sinking temperatures longer than 2 years after the year 2000 in that graph, and we are several years into a flat to slightly declining trend. Unless of course you are talking about Scenario C, in which case the only excuse is that you are being deliberately misleading.

1. The IPCC uses a wide range of model outputs, ranging from little warming to a lot of warming.
2. Because the measured temperatures were close to some of the lower forecast models, that validates the higher forecast models.
3. Thus we can expect 6+C warming this century

I predict that the stock market will either go down, up or stay the same next week. Those are my three scenarios, and I’m sticking with them through thick and thin.

“As a Brit I really resent the fact that a govt employee of a friendly power can be sent to its major ally to cause trouble and try to tip us into the dark ages through denying us much needed energy. Can’t you control the guy?”

Many of us Americans certainly resent it too. Hansen is an embarrassment to science, to NASA, and to the U.S.A.

I have to say…when the cat left, the mice started to play.

As dislikable as the Bush Administration was to many in the world, notice Hansen did not really start going hog-wild, until AFTER the current president took over the White House.

Why does Hansen do it? Because he knows he can.

But we share your sympathies. Fire him….and TRY him, if necessary.

Try him for what? MALFEASANCE, INCOMPETENCE, and WANTON DISREGARD FOR PUBLIC OFFICE.

History will not look too kindly on him… or his chums Gore or Holdren.

But they are one and the same: politically driven ideologues who squelch free thought and TRUE scientific discipline.

The shameful “American Inquisition” in action!

The fact that the climate models have failed even to extrapolate a warming trend in the past 10 years, and that in spite of this the modelers are not backing down, can only mean one thing: megalomania.

Too bad there is not a model to accurately predict the behavior of politically-driven scientists as to what they might do next!

Latest Hansen Short and Long-range Forecasting Outlook:

HANSEN POLITICAL DEVIATIONS ABOVE NORMAL ARE APPROACHING RECORD MAXIMA SINCE RECORDS HAVE BEEN KEPT.

PROTEST IN THE U.K. HIGHLY LIKELY AND IMMINENT. OTHER PROTESTS ARE POSSIBLE IN THE LONG-RANGE, BUT THE EXACT TIMING AND INTENSITY OF THOSE HANSEN APPEARANCES WILL DEPEND UPON OTHER FORECASTING VARIABLES DOWNSTREAM IN THE LONG RANGE AND GENERAL MODEL AGREEMENT AFTER SUCH EVENTS HAVE MATERIALIZED.

THIS IS A HIGHLY VOLATILE AND UNSTABLE SITUATION…STAY TUNED TO YOUR LOCAL WUWT FORECAST FOR DETAILS IN THE EVENT A WARNING IS ISSUED….

The midrange scenario A2 projects a temperature rise from 1990-2010 of 0.35C, equivalent to a linear increase of 0.175C per decade. How are they doing? Well the trends in the 4 main indices since 1990 are:

UAH 0.168

I checked UAH and they show a DECREASE in global temperature of 0.04C (-0.02C per decade) from 1990 to 2008 (the latest full year available).
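For anyone wanting to check such figures themselves: the per-decade numbers quoted in this thread are ordinarily least-squares slopes over monthly anomalies. A minimal sketch with made-up data (only the 0.175 C/decade scenario figure comes from the thread):

```python
def trend_per_decade(anomalies, months_per_year=12):
    """Ordinary least-squares slope of a monthly anomaly series, in C/decade."""
    n = len(anomalies)
    t = [i / months_per_year for i in range(n)]          # time in years
    tbar = sum(t) / n
    ybar = sum(anomalies) / n
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, anomalies))
             / sum((ti - tbar) ** 2 for ti in t))        # C per year
    return 10.0 * slope

# A perfectly linear 0.0175 C/yr series recovers the A2 figure of 0.175 C/decade.
series = [0.0175 * i / 12 for i in range(240)]           # 1990-2010, monthly
print(round(trend_per_decade(series), 3))                # 0.175
```

Real anomaly series are noisy, so the slope (and the sign of a short trend) depends heavily on the start and end dates chosen.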

As a Brit I really resent the fact that a govt employee of a friendly power can be sent to its major ally to cause trouble and try to tip us into the dark ages through denying us much needed energy. Can’t you control the guy?

Ironically, it may actually be better overall to keep him where he is and allow him to continue his ridiculousness. By doing so, he is simply undermining his own cause as he continues to look more foolish every day. A clear case of “scientist gone mad”.

sod – am I right in thinking that scenario A is the one in which CO2 emissions continue to increase unabated (“BAU”), scenario B is for CO2 emissions stabilised at around 1990 levels, and scenario C is for CO2 emissions drastically cut?

The observed temperatures are closer to scenario C than either of the others, but where are the observed CO2 emissions?

Bill, those are wonderful graphs. I can see that those models are able to reconstruct the past quite accurately, but that does nothing for forecasting the future. I can take a pencil and sketch a graph of the past myself, equally as accurate, and I know little to nothing about how our climate works. I can probably also predict the future like the models, with likely equal, if not better accuracy.

As a Brit I really resent the fact that a govt employee of a friendly power can be sent to its major ally to cause trouble and try to tip us into the dark ages through denying us much needed energy. Can’t you control the guy?

I keep hoping he’ll show up on a no-fly list, preferably some day when he’s out of our countries.

Hansen’s Scenario C was for GHG levels to stop increasing in the year 2000.

It then took only 6 years for temps to reach a peak and then stay there.

For those who keep pointing to the long timelines to reach equilibrium, Hansen estimated it only took 6 years back in 1988 when he was 100% sure he had everything figured out and even testified to Congress to that fact. He has now changed the equilibrium response time to 1500 years (but he has “nailed it” for sure this time).

Well, call me simple minded or whatever, but when the amount of experience, time, money and people involved in near-time weather forecasting cannot accurately predict what the conditions are going to be 5 days from now, why should I trust our Climate experts to be accurate 10 – 20 – 50 – 100 years from now? I know a lot of the experts claim that Weather forecasting and Climate Forecasting are different things entirely. Well, Math and Physics are also separate, but if you can’t do the math accurately you aren’t going to answer many physics questions.

Bill Illis (16:04:30) :
Hansen estimated it only took 6 years back in 1988 when he was 100% sure he had everything figured out and even testified to Congress to that fact. He has now changed the equilibrium response time to 1500 years (but he has “nailed it” for sure this time).

Question for Anthony, possibly concerning models. I believe that it was almost a year ago that you published a letter regarding the Aqua satellite results. As I recall, they were not being released pending further peer review since the findings were at odds with current thinking [models?]. Any update on the status?

JohnG, In my experience giant code is often not “just bloat” — it’s hacking. Like for instance “if the surface of the ocean at the tropics freezes force fix the data…” That’s the kind of thing I’d be looking for if I audited the code. I agree that the code really *should* be simple to have any hope of validity.

I have always had a few questions regarding the modeling processes used by the IPCC, and although these questions are argumentative in that they reveal my opinion about the usefulness of climate models, perhaps someone here is in a position to address the factual bases for these questions or explain why the questions might be misplaced.

Based on a very cursory reading of the latest IPCC report, it is my understanding that the conclusion of a causal effect between GHGs and warming is premised in part on a finding that existing climate models cannot simulate current climate trends without a substantial CO2 warming effect. Due to the chaotic nature of the climate system, and the unknown nature of what initial conditions to use in the models, a model is “run” many times with different initial conditions to produce a range of climate trends, and the reasoning is that, because none of these model runs show the existing warming trend without the modeled CO2 sensitivity, it is reasonable to conclude that the existing warming trend in the real world is caused by CO2.

My first question is whether I have fairly summarized both the modeling process of the IPCC, and the reasoning behind its finding of a causal effect between CO2 and warming; because if so, this logic seems flawed. All this exercise demonstrates is that the models used by the IPCC require a substantial CO2 contribution to temperature in order to reasonably fit known climate trends. Nothing more. It certainly does not demonstrate that NO climate model could be constructed that both reasonably fits known climate data and uses an insignificant sensitivity to CO2. In other words, the premise that models A, B, and C cannot simulate climate trends without relationship 1 does not logically support the conclusion that relationship 1 holds true in the real world system that the models A, B, and C attempt to simulate. Models A, B, and C may simply be wrongly constructed in one or more respects.

This leads into the second question. Has anyone actually tried to construct a model that has little or no CO2 effect, and then spent as much effort tweaking it to best fit past and current climate as has been spent tweaking climate models that DO assume substantial warming?

My last two questions are a little more general. What possible utility is there in modeling a relationship that cannot be physically measured, either directly or indirectly, e.g. the temperature response of the Earth as a function of CO2 concentration? If you don’t have the ability to measure a physical system at the level of detail that you need to perform a task, why would someone think that this deficiency could be overcome by simply modeling the system at the desired level of detail and testing the model against the things you can measure?

Finally, given that a computer does nothing more than what the programmer instructed it to do, how can merely running the computer model tell you something that you did not already know or assume about the system modeled? I can accept that a computer model can be highly useful as a shortcut in performing tasks too complex to be reliably done manually, once it has first been established that the computer model is accurate with respect to the specific task at hand. I have a much harder time believing that a computer model can TEACH you anything about the system modeled, particularly with respect to modeled relationships that can’t be independently verified apart from the simulations.

That nice Mr Sowell commented (14:57:20) :
“@Mike T (12:51:52) :
“Just a shame that in the UK’s Kings North power station vandalism trial only one view was on show. The prosecution wasn’t about to disturb the consensus and put up anyone to dispute Mr Hansen’s views.”
I agree, it would have been great to have a well-spoken, knowledgeable expert to provide balance.”

Such evidence would not have been admissible. See this and the comments:

“Incidentally, Mr Sowell (at 11.37.18) mentioned judicial notice. I was once involved in a trial where one of my opponents was trying to persuade the judge that he could not properly take judicial notice of a particular matter. In the course of his argument he was endeavouring to explain what “judicial notice” means and said “For example, Your Lordship would not be able to take judicial notice of the fact that Arsenal beat West Ham United at football yesterday.” To which the judge replied “I wouldn’t need to, I was at the match.””

Your opponent was either a brave man, or foolhardy, I would say. To attempt to explain to a judge what judicial notice means very likely would result in unfavorable rulings from that point on, as the judge would be insulted!

I often wonder about the typical definition of climate vs. weather. Why is climate 30 years or 60 years and not 1000 years? Do we really understand whether the last 50 years is not just weather?

This notion that we understand enough to determine what climate is, is itself QUESTIONABLE. It seems to me that climate really should be closer to 1000 years than anything else I’ve heard. Of course, that would be a useless time frame, so we’ve conveniently made it shorter. However, by doing this we may be confusing the picture, which makes modelling a tricky job. We look for short-term climate effects and probably miss other contributors.

“when . . . people involved in near-time weather forecasting cannot accurately predict what the conditions are going to be 5 days from now, why should I trust our Climate experts to be accurate 10 – 20 – 50 – 100 years from now? I know a lot of the experts claim that Weather forecasting and Climate Forecasting are different things entirely.”

Let’s say that I have a very weak, repetitive audio signal that I’m sending across the country. On its path, that audio signal is subject to quite a bit of random noise. When received, over short intervals the signal is going to represent mostly the random spurious noise, and reconstructing the repeating pattern will be almost impossible. With sufficient time, however, the original signal can be reconstructed, because its pattern holds over time while the random noise cancels itself out. Once reconstructed, you can accurately predict the up and down trends of the signal so long as they are averaged over a sufficient interval, but accurately predicting any instantaneous point will be impossible.
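The signal-averaging argument is easy to demonstrate numerically. A toy sketch (a synthetic sine “signal” buried in Gaussian noise; nothing here is a climate dataset):

```python
import math
import random

random.seed(1)
period = 50
signal = [math.sin(2 * math.pi * i / period) for i in range(period)]

def received(reps, noise_sd=5.0):
    """Average `reps` noisy repetitions of the transmitted signal."""
    acc = [0.0] * period
    for _ in range(reps):
        for i in range(period):
            acc[i] += signal[i] + random.gauss(0, noise_sd)
    return [v / reps for v in acc]

def rms_error(estimate):
    return math.sqrt(sum((e - s) ** 2 for e, s in zip(estimate, signal)) / period)

# Uncorrelated noise cancels like 1/sqrt(N): a signal buried under
# 5-sigma noise emerges cleanly after enough repetitions are stacked.
e_few  = rms_error(received(10))      # roughly 5/sqrt(10), about 1.6
e_many = rms_error(received(10000))   # roughly 5/sqrt(10000), about 0.05
print(e_few, e_many)
```

The catch the comment goes on to make: this cancellation requires signal and noise to be uncorrelated, which weather and climate are not.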

Notice in this example, however, that the source of the signal and the source of the random noise are mathematically uncorrelated. This is what permits you to separate the short-term random noise from the long-term repeating signal. The same physical phenomena that produce weather, however, produce climate. In fact, weather is nothing more than the instantaneous manifestation of climate. Those who argue that long-term climate is predictable even though short-term weather is not are essentially making an unstated assumption that the chaotic nature of the earth’s climate system fades out over periods longer than four decades or so.

Which side of the argument you take depends on how reasonable you think this assumption is. I think it’s a lousy assumption. For example, when I look at long-term climate reconstructions, I don’t see any pattern in the various climate optimums and minimums. The standard deviation for the length of ice ages/interglacial periods, from eyeballing these reconstructions, also seems a lot longer than four or five decades. All of this seems to indicate to me that there is a lot of randomness in the Earth’s climate system that is not removed by taking averages over a period of decades.

kurt — Has anyone actually tried to construct a model that has little or no CO2 effect, and then spend as much effort tweaking it to best fit past and current climate, as has been spent tweaking climate models that DO assume substantial warming.

That’s the thing. These models were written from the ground up to simulate climate response to CO2. It’s what they do; why they exist. That anyone is surprised when the answer to the question is CO2 still amazes me.

Were you to make a model simulating traffic patterns where the primary variable under investigation is driver stupidity, the answer you will derive — no matter what — will be some form of driver stupidity. This will be true even if the traffic patterns are more influenced by construction timing, flow control and so on. If you’re looking for driver stupidity, you are guaranteed to find it.

If these models were designed originally to calculate albedo averages and the answer came back as CO2, this would be impressive. But that’s not what is happening. All these people are doing is using the word “computer” and claiming “really truly complicated science” and expecting us to all say “ooooh, an impartial computer said it so it must be true.” It’s psychological voodoo, not science.

That’s very encouraging. Now that my solar system orbital simulation model is working tolerably well, and I have a planet Earth in it spinning once every sidereal day, and orbiting just about once every year, I’m thinking that I may extend it to include the terrestrial atmosphere, which I’ll divide up into layers in some sort of grid. Knowing the thermal conductivity and heat capacity of air, sea, and land, and the incident solar radiation at any point on the planet surface, and the heat loss from the planet interior, I’ll be able to work out the conductive heat flow vertically to outer space at absolute zero, and horizontally through the atmosphere and ground.

I’m not sure at the moment what to do about air flows using this grid system. I’m wondering if, instead of having a fixed rectangular grid, I might have mobile air masses which would ‘jostle’ with each other as they expanded and contracted, no doubt forming warm and cold “fronts” surging to and fro, and spinning clockwise and anticlockwise under the influence of coriolis forces, and very likely northern and southern “jet streams” as well.

That should all be working by next Thursday. Then I’ll add in clouds and precipitation. I’ve heard this is difficult to do, but I can’t see any big problem there. As my mobile air masses jostle with each other, the warmer lighter humid ones will float to the top of the atmosphere, and will turn white as the water in them condenses into clouds and rain. I expect I’ll have cirrus and stratus and cumulus clouds appearing shortly after that. And I’ll be able to make weather predictions for the next few days, although obviously not quite as accurate as the Met Office’s (or perhaps more so?).

It should be early April by the time that’s working. Looking at climate over a long period of time should be straightforward. I’ll just have to leave the laptop running all night under my bed where Twinkle can’t find it and start treading on the keyboard like she enjoys doing when I’m working on it.

Mid-April now. And then there’ll be the final tweak to get the CO2 and global warming working, maybe just by decreasing air conductivity a bit, or boosting solar insolation by the same fraction.

Finally, in early May, I’ll release the results of my GCM to the world in a series of increasingly panic-stricken press releases that will describe how all my mobile air masses have been jostling and bumping into each other to produce the entirely new and hitherto unforeseen phenomenon of Global Swelling, which is what happens when warm sticky air masses form a “traffic jam” around the equator, and pile up higher and higher on top of each other, before rolling downhill towards the poles again, very likely destroying entire civilizations in the process – unless governments Do Something About It Before It’s Too Late – like buy me a newer and faster laptop with WiFi and BlueTooth and all the other latest fancy trimmings.

Global climate models are extraordinarily useful tools, and among other things have successfully predicted the rise in temperature as greenhouse gases increased…

That’s hilarious. Lots of people predicted the same with calculations on the backs of envelopes. Are you saying that it was just serendipity that the GCMs produced this astonishing result? Some others consider that the models were designed and tuned with the assumption that the observed warming of the late 20th century was caused by GHGs. Thus, you get the expected result.

My first question is whether I have fairly summarized both the modeling process of the IPCC, and the reasoning behind its finding of a causal effect between CO2 and warming;

You have.

This leads into the second question. Has anyone actually tried to construct a model that has little or no CO2 effect, and then spent as much effort tweaking it to best fit past and current climate as has been spent tweaking climate models that DO assume substantial warming?

All attempts to produce numerical predictions of future climate are models. I think what you are asking is, are there Global Climate (Circulation) Models that have little or no CO2 effect?

The simple answer is we don’t know, because models are proprietary and only selected model runs are published.

However, with reference to your question 1, this means the IPCC’s case is built on an unverified and unscientific assumption: namely, that the published runs are representative of all possible runs.

What possible utility is there in modeling a relationship that cannot be physically measured, either directly or indirectly, e.g. the temperature response of the Earth as a function of CO2 concentration? If you don’t have the ability to measure a physical system at the level of detail you need to perform a task, why would someone think that this deficiency could be overcome by simply modeling the system at the desired level of detail and testing the model against the things you can measure?

The IPCC used to be up front about the fact that the models are wildly wrong on almost every climate metric except temperature. They then bizarrely concluded we can trust the models’ temperature predictions.

Anyone can see that the reason they get temperature right (when hindcasting known data) and all other values wrong is that the models are tuned to the temperature data.

Finally, given that a computer does nothing more than what the programmer instructed it to do, how can merely running the computer model tell you something that you did not already know or assume about the system modeled?

You are correct, a computer model doesn’t tell you anything you don’t already know or had assumed.

How the computer models work is that they iteratively process datasets, with the output of one iteration becoming the input of the next; each iteration represents a time period.
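
The iteration described above can be sketched in a few lines of Python. The one-variable “climate” below is purely illustrative (the `step` function, the feedback value, and all the numbers are made up for the demonstration); the point is only the structure: the state output by one step becomes the input to the next.

```python
# Minimal sketch of iterative time-stepping. All values are illustrative.
def step(temp, forcing, feedback=0.9):
    """One hypothetical time period: relax the state toward a forcing value."""
    return feedback * temp + (1 - feedback) * forcing

def run(initial_temp, forcing, n_steps):
    temp = initial_temp
    history = [temp]
    for _ in range(n_steps):
        temp = step(temp, forcing)   # output of one period feeds back in as input
        history.append(temp)
    return history

traj = run(initial_temp=14.0, forcing=15.0, n_steps=50)
```

Run long enough, the toy state simply relaxes toward whatever forcing was assumed, which is exactly the concern raised in the surrounding comments.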

What computer models allow you to do in theory is to produce predictions of much greater accuracy.

There is no empirical evidence that climate models do in fact produce more accurate predictions than, for example, drawing a line on a sheet of graph paper or the computer equivalent.

The IPCC’s belief in the temperature predictions of the climate models is purely faith without any scientific basis.

“During the 1980s and 1990s, the earth warmed at a faster rate than it did earlier in the century. ”

This is not correct. In Patrick Michaels’s newest book, he states on page 12:

“The IPCC history shows two distinct periods of warming, one roughly from 1910 through 1945, and then another that begins rather abruptly in about 1975. Their warming rates are statistically indistinguishable. In the past three decades ending in 2005, the warming rate was 0.178 Deg C +/- 0.021 per decade. In the period 1916-45 the rate was 0.151 Deg C +/- 0.014 per decade.”

I have an old (only goes to 2001) Hadcrut version from the Hadley Site. The results are

This is a pet peeve of mine. No one has been able to explain why the early and late 20th century warmings are essentially the same. Until someone does, colour me skeptical. (Michaels thinks it is the sun, but I’m pretty securely in the Leif Svalgaard camp.) You will have to get the book to see why Michaels thinks the way he does.

How can someone simultaneously say that natural variability is strong enough to suppress the climate signal from GHGs for possibly the next 30 years, while also maintaining that the climate signal from GHGs has been observably distinguished from natural climate variability in our temperature record? In the mid-to-late 70s, the earth was winding up a cooling trend, and the case for global warming was based primarily upon the large temperature increases in the 30 years since then. If, despite the increase in annual CO2 emissions, global warming is not yet strong enough to counteract natural climate cooling in the 30 years going forward, then how can anyone look at the past 30 years and conclude that any particular amount of that warming trend was or was not natural? Has somebody proven that the climate naturally cools more strongly than it warms?

In my mind, this is an admission that there is no proof of manmade global warming that can be taken from the Earth’s temperature record.

Fortran 77? Wasn’t that Fortran 4 I used to tutor? (Friends, don’t get old.)

Local SW Missouri ornithologists raised the alarm last month about the unmistakable thermal impact on some silly little purple birds which had been forced northward out of their usual winter haven here by relentless global warming. Rather than spontaneously combust, these crafty avians headed for cooler clime, a whole forty miles north of here. Please ignore the fact that these birds can migrate over three hundred miles a day, by recent measurement. A disreputable skeptic (oh, wait, that was me) pointed out that the local farmers had plowed their pastures and planted corn in order to cash in on the “alternative fuel” bonanza which was surely going to happen. Farms north of here are likely too rocky and steep to raise much good corn. Since little purple birds can’t crack a kernel of corn, then better vittles are probably where corn fields ain’t.

Backed by the latest satellite images, dire scientific pronouncements, and scary climate models, the little purple birds headed north. Right into a winter storm from our frozen friends in Canada. The little purple birds froze to death. More victims of global warming. Shame on capitalism. Bird-brains. Film at eleven. (/sarc off)

“Finally, given that a computer does nothing more than what the programmer instructed it to do, how can merely running the computer model tell you something that you did not already know or assume about the system modeled? “

We flogged away at this question a few days ago on another thread (see WUWT 2/19/09, https://wattsupwiththat.com/2009/02/19/when-you-can’t-believe-the-model/ ) with two camps emerging: One, computers cannot do any more than their human programmers tell them; and Two, computers can determine things no human ever could do. I am firmly in both camps, having written and tested both sorts.

The theory and practice of Artificial Intelligence holds that there exist several classes of problems in Camp Two for which there are no known solutions; some are so large that even a computer the size of a galaxy would require millions of years to solve.

A more tractable example of Camp One is where no direct solution exists to a given problem, hence trial and error must be used. Further, the number of iterations can continue for a very long time, with successive results improving only in the very small decimal places. Therefore, some sufficiently small difference is allowed between iterations, and the answer is declared at that point. Some problems have sufficiently numerous variables and complex calculations that they can only be solved iteratively. Humans can do this, but it takes a very long time and is prone to calculation errors.

A simple analogy from engineering is the heat loss from the surface of an insulated pipe carrying a hot fluid, calculated by determining the temperature gradient through the insulation and the wall of the hot pipe. One can establish the internal temperature of the pipe, for example, carrying hot steam at 900 F. The pipe wall thickness is known, and its thermal conductivity is also known based on the type of material. One also knows the temperature of the surrounding air, for example it may be 40 F. The thickness of insulation is known, and its thermal conductivity is known.

To solve this, one makes an initial guess for the temperature at the outer edge of the insulation, then solves the various heat-balance equations, including radiation heat losses. A calculated temperature at the outer edge of the insulation results, usually not the same as the initial estimate. One continues to input the new temperature, re-solve, and check to see if the old temperature is sufficiently close to the new temperature.
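
The pipe example above is a textbook fixed-point iteration, and can be sketched in Python. To keep the sketch short, radiation is omitted and a single combined conductance stands in for the pipe wall plus insulation; all numbers are illustrative, not real pipe data.

```python
# Hedged sketch of the insulated-pipe example: fixed-point iteration on the
# outer insulation surface temperature. Radiation omitted; numbers illustrative.
def solve_surface_temp(t_steam=900.0,   # fluid temperature, deg F
                       t_air=40.0,      # ambient air temperature, deg F
                       u_wall=2.0,      # combined wall+insulation conductance
                       h_air=10.0,      # outside convective coefficient
                       tol=1e-6, max_iter=1000):
    t_s = t_air  # initial guess for the outer surface temperature
    for _ in range(max_iter):
        q = u_wall * (t_steam - t_s)    # conduction through wall and insulation
        t_new = t_air + q / h_air       # surface temperature implied by convection
        if abs(t_new - t_s) < tol:      # the "sufficiently close" stopping rule
            return t_new
        t_s = t_new
    raise RuntimeError("did not converge")

t_surface = solve_surface_temp()
```

The loop is exactly the procedure described: guess, solve, compare the new temperature against the old, repeat until the difference is sufficiently small.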

Now, extrapolate this simple problem into an atmospheric model, where the atmospheric volume is divided into a grid of thousands of cells, each cell having three dimensions. The heat transferred from one cell to another, in all three dimensions, must match the heat received from other cells.

Temperature is the measurable proxy for heat content, so the model (had better) solve for temperature. Further, the properties of the air in a cell change with temperature, since volume increases as temperature increases. This creates a mass transfer that must be properly calculated and solved iteratively. Further, there are moisture issues, or humidity, with rising vapor from oceans entering the appropriate cells.

There is heat input from the sun, which varies not only by the time of day, but with the seasons as the earth orbits and the axial tilt changes with respect to the sun. The day-night changes in heat input also are important. Clouds act both to reflect daytime sunlight, and as a thermal blanket at night.

The iterations must continue for some time because we are interested in very small changes over very long time periods, so solving to 1 degree C probably is not going to pass muster.

Other considerations are water vapor condensation as rain releases heat in upper cells, and rain falling to the earth that may absorb CO2, or it may evaporate somewhat as the drops fall through less humid air, or it may fall to earth. Also, the water droplets may freeze and fall as sleet or hail, or they may initially form as snow. Albedo effects from snow and ice are of course important.

One can appreciate that this gets fairly complicated, thus the need for a computer to solve all the equations. It may be possible that humans could solve the equations, but the organization and time required would be immense.

From my readings on the subject, not all of what I just described is modeled, and there are other important things I have not included. One such thing is CO2 levels increasing with time. Also, methane levels increasing. Also, dust or smoke from desert winds or cooking fires. Also, aerosols from both natural sources, such as volcanoes, and man-made sources. One effect from warming is that frozen methane hydrates may thaw and enter the atmosphere. The recent stories on methane bubbles in the Arctic are on point.

I don’t know if this helps any, but it is how I view these models, and remind myself that these modeling guys may be making heroic efforts and good progress relative to where they were 10 years ago, but they are really kidding themselves, and especially elected officials, with any predictions of the future.

All fine responses. I especially enjoyed Henry Phipps’ commentary on the poor little purple birds. Poor buggers. Just like the spotted owls, they only needed to spread their wings, and fly to happier hunting grounds.

“That’s the thing. These models were written from the ground up to simulate climate response to CO2. It’s what they do; why they exist. That anyone is surprised when the answer to the question is CO2 still amazes me.”

I’m a patent attorney, so not only do I have an engineering degree, but I review a lot of client material when preparing a patent application. Usually, when someone constructs a model, mathematical or physical, that points to factor X as being a prime cause of effect Y in the model, they also try to quantify how robust that result is if values for other unknown variables in the model are changed. This at least gives you a subjective handle on how well you can count on the modeled results.

Part of me is just as cynical as your response exhibits – primarily because I’ve read through passages of the IPCC reports and see a lot of deceptive wording and hollow conclusions. Nonetheless, I have a day job and don’t have the time (probably not the mathematical expertise) to thoroughly review the published material on the mathematical models of the global warming crowd. I can’t say for certain that the creators of the models have not in fact spent considerable time doing assorted model runs using widely varying values for unknown parameters, including climate sensitivity to CO2, so as to test the robustness of the result that known climate cannot be simulated without a contribution from CO2. Certainly, this would be both a feasible and sensible thing to do.

Having said that, if, as Phillip B stated, only selected model runs are published, and no one documents the different values used for model parameters during the runs, I have a hard time viewing these models as part of any scientific endeavor: science is all about procedure, documenting what steps were made so that others can reproduce the experiments and see whether consistent results are obtained. In fact, the thoroughness of the procedures used to test the falsity of a hypothesis is what eventually gives credence to the hypothesis, if it survives those tests (think relativity and evolution). If no one can devise an effective test of a hypothesis, or alternatively if no one discloses the complete details of the procedures used to test the hypothesis, how can you possibly evaluate the accuracy of that hypothesis?

1. Had 2007 been the hottest year on record as the Met Office and Hansen predicted, it would have been cited as proof that the models were correct. Instead, 2007 saw temperatures drop by more than 0.5C and was discounted as “natural variability.”

I have no problem with the assertion that computers are practically necessary to solve many real-world iterative solutions. What I question is whether the solution you get from the computer can teach you more about the underlying relationships used to construct the model. To take your thermal pipe example, would completing that iterative process, either manually or using a computer, ever teach you more about the laws of convection, conduction, or radiation? Could the results be used to better tell you what the thermal conductivity of the pipe wall is, if it were not possible to measure that thermal conductivity in the real world?

One of the fundamental issues I have with global warming theory is a belief that the lack of the ability to measure most of the physical relationships that govern our climate system will always cripple our understanding of the climate. You never let theory outstrip observations. That’s what happened in the 40s and 50s when we modeled the atom like a miniature solar system with electrons orbiting the nucleus like little planets: it may have been a reasonable hypothesis, but when our observational abilities caught up with our imaginations, we found out that electrons didn’t move like that. The Ptolemaic version of the solar system, with everything orbiting the earth, wasn’t put to rest until our observational abilities improved to the point of being able to detect stellar parallax, the apparent back-and-forth shift of stars over the year as the earth moved from one side of the sun to the other.

The utility of modeling a system in more detail than our ability to observe that system, in my mind, is inherently suspect.

From my understanding all the models employ a “forcing,” which is a positive number that says how much T changes when [CO2] changes. The main function of individual models is just how minor perturbations about that trend appear. The prediction of rising T is simply an expression of the assumption that the “forcing” is positive, not zero. So, yes, the output is just the assumption in disguise.
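
This point can be made concrete with the standard logarithmic CO2 forcing approximation (Myhre et al., 1998), combined with a sensitivity parameter `lam` that converts forcing to temperature change. The concentrations and the `lam` values below are illustrative; the sketch only shows that whatever sign and size you assume for `lam` comes straight back out as the predicted trend.

```python
import math

# Standard logarithmic CO2 forcing approximation: dF = 5.35 * ln(C/C0) W/m^2,
# scaled by an ASSUMED sensitivity lam in K per (W/m^2).
def delta_t(c_new, c_old, lam):
    forcing = 5.35 * math.log(c_new / c_old)  # radiative forcing, W/m^2
    return lam * forcing                      # predicted temperature change, K

warming = delta_t(560.0, 280.0, lam=0.8)     # a doubling, with a positive lam
no_warming = delta_t(560.0, 280.0, lam=0.0)  # same doubling, zero sensitivity
```

With `lam=0.8` a doubling yields roughly 3 K; with `lam=0.0` it yields exactly nothing. The output tracks the sensitivity assumption, not the CO2 data.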

30 years was chosen as “climate” at the end of a 30-year warming period. So if the temperature then decreases at about the same rate for, say, 15 years, it is still a warming trend. Think employment security.

1) CO2 sensitivity is not an input parameter in the models. The CO2 sensitivity is determined from the RESULTS of the models.

2) The GISS model itself, along with the inputs and results of their model runs, are freely available at the GISS website. (I don’t know about how open other climate models are.)

3) The climate models were not “written from the ground up to simulate climate response to CO2”; they were written to simulate the climate change from ANY forcing- solar, albedo, aerosols, ozone, etc…, not just CO2.

4) Many models have been run both with and without anthropogenic CO2 (there’s a section about it in the IPCC report). Needless to say, the two scenarios yield very different results.

kurt — Part of me is just as cynical as your response exhibits – primarily because I’ve read through passages of the IPCC reports and see a lot of deceptive wording and hollow conclusions.

Cynical? No.

Unless you start off by trying to figure out how GHG’s are working then there’s no reason to posit GHG amplifiers.

If you consider that the language of the model outputs is always in the form of “forcings” (amplifiers) you could conclude that the purpose of the model isn’t to ascertain an admixture of varying parameters at all, but to determine how GHG’s are working. There’s no need for an invocation of forcings otherwise, and RC wouldn’t be posting that N% increase of CO2 doesn’t result in the theoretical increase, but more (due to these forcings.)

Henry Phipps, I have a question. Were the birds “purple” before or after they flew into the cold north?

Reed, the birds were purple, the science is settled, and we have no time for debate about this.

What IS distressing is that parts of the ornithologists who observed the birds in the blizzard have also turned persistently purple. Looks to be a rather tame Christmas party at the Department of Ornithology this next holiday season.

I came on this thread late, and few may see this, but I’m compelled to add a view on these math/physics models. Over more than 30 years, I wrote business and financial models using linear, reiterative math. Late in my systems career, I was asked to devise a linear math model to emulate a chaotic system. I declined. I can’t. I know no one who can mathematically emulate a chaotic system and produce a reliable result, and I know of no one who can, either.
The atmosphere is chaotic. The oceans are chaotic. The gaseous interface between them is chaotic. Chaos cubed will never be emulated with any accuracy by even hyper-reiterative linear math, even if the modeler could reasonably control the iteration time frames. Nothing more than a targeted estimate range could emerge from even the best linear math models.
If an AGW advocate can tell us of a genius modeler using chaotic math systems for climate projections, I’ll grovel and apologize. Until then, I regard all of the 21-22 IPCC models as tools producing exactly what they were designed to produce: propaganda data.

“Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming.”

It is called “moving the goal posts.” Or, “the pot of gold is always at the end of the rainbow.”

Several prominent figures have said that ancient religious memes are replaying themselves in a modern context. Al Gore is a modern prophet (and as usual the prophecies don’t come true), global warming is a Hell for punishing carbon sinners while global cooling/Greening is Heaven and a return to Eden, carbon credits are Indulgences, scientists in their lab coats are today’s robed priests, in the media figures like George Monbiot are the equivalent of raving mad preachers like Savonarola, and the UN is the new Vatican for the faithful who want their creed to be universal.

Nostradamus used to read the future by looking into a bowl of water. None of his prophecies have come true (he even gave one a specific date in 1999), but half a millennium on he still has believers.

That’s what computer models are like to AGW faithful – necromancy.

In the US it should be considered unconstitutional for government to be involved with a New Age religion.

“Could the results be used to better tell you what the thermal conductivity of the pipe wall is, if it were not possible to measure that thermal conductivity in the real world?”

Actually, I believe the answer to that is yes, but only if one is able to measure the outer wall temperature, that is, the edge of the insulation exposed to the atmosphere. If one is unable to measure that outer wall temperature, the problem becomes indeterminate. There would be too many unknown variables, and not enough equations.
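
As a sketch of this inverse problem: given a measured outer-surface temperature, one could back out the unknown wall conductance by bisection, because the surface temperature rises monotonically with the conductance. The steady-state balance below (conduction in, convection out, radiation omitted) and all numbers are deliberately simplified illustrations; the key point is that with no surface measurement there is nothing to bisect against, and the problem is indeterminate.

```python
# Forward model: steady state requires u_wall*(t_steam - t_s) = h_air*(t_s - t_air).
# Solving for the outer surface temperature t_s gives a closed form.
def surface_temp(u_wall, t_steam=900.0, t_air=40.0, h_air=10.0):
    return (h_air * t_air + u_wall * t_steam) / (h_air + u_wall)

# Inverse problem: recover the conductance from a measured surface temperature.
def infer_conductance(t_measured, lo=0.01, hi=100.0, tol=1e-9):
    # surface_temp is increasing in u_wall, so plain bisection converges.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if surface_temp(mid) < t_measured:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Round trip: simulate a measurement for u_wall = 2.0, then recover it.
u_est = infer_conductance(t_measured=surface_temp(2.0))
```

The round trip recovers the conductance only because the outer temperature is observable; remove that measurement and the same equations have more unknowns than constraints.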

Part of the GCMs’ problem is no one can truly test or verify the models, as we cannot easily add 100 ppm of CO2 to the atmosphere without waiting for a good many years, and we cannot cut 50 ppm CO2 from the atmosphere very easily, if at all. These changes in CO2 are the step-tests that should be conducted to determine the physical responses to a change in a control system, and thereby verify the models. Therefore, every modeler is guessing, based on some estimates of how laboratory science scales up in the large atmosphere. I refer, of course, to CO2 absorption of certain wavelengths of infra-red radiation from the earth’s surface, and subsequent re-radiation.

Similarly, it is difficult if not impossible to perform step-tests for albedo changes (zero polar ice), cloud cover (zero cloud cover and 50 percent of the earth), volcanic gaseous and particulate emissions (like babies, they keep their own schedules), and so on.

The skeptics have devastating claims; for example, Steven Goddard, among others, has presented evidence that none of the GCMs predicted the current cooling trend that is reported by the four major measurements. Such cooling should be impossible due to increasing levels of CO2, if the GCMs are correct. Cooling is predicted from major volcanoes, and decreases in CO2, neither of which occurred.

This fact alone should be hammered home to the elected officials who are drafting climate change legislation.

When the physics elementary particle community comes up with a number, for example “there are 3 neutrinos” from a fit to the data, it uses computer programming and modeling intensively. The difference with “there is a 3C rise in 100 years” coming from the GCM models is the error bars.

Errors are strictly propagated in all particle physics modeling, both statistical and systematic, and all numbers come out with an error: +/- statistical, +/- systematic.

No error propagation exists in the IPCC model outputs that I can find in the voluminous reports. This denotes that any numbers given are meaningless, sleight of hand, video game presentations.

What is done instead, to fool the scientific audience:

1) The model is fitted to the temperature data. There are so many parameters that this is not difficult to do (von Neumann and his elephant).

2) In order to simulate the chaotic dynamics of climate (they do acknowledge that climate is chaotic) they change the initial conditions as they feel like and create spaghetti lines around the optimum fit, believing they simulate chaos. The deviations from the fit they treat as errors. BUT the variations are not 1 sigma variations that would give a chi**2 per degree of freedom; they are just to please the eye of the beholder, because no such chi**2 is reported anywhere.
For example, a 1 sigma variation of the albedo alone would throw off the curves by 1C (try the toy model over at junkscience.com).

3) Then many models, which means different models of similar structure and assumptions, are all put together on a spaghetti graph, again claiming the width as representative of errors, even though the only “valid” argument is “chaotic spread,” including the chaotic brain waves of the modelers. What amazes me is that statisticians are treating these spaghetti projection widths as if they are true errors and discuss the number of angels on the point of the needle (in other blogs).
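
For readers unfamiliar with the chi**2-per-degree-of-freedom statistic the comment says is missing, here is a toy Python illustration: a handful of made-up data points, a hypothetical fitted line, and an assumed 1-sigma measurement error. None of these numbers come from any climate dataset.

```python
# Toy chi-squared per degree of freedom for a fit. All data are made up.
data = [(0, 1.1), (1, 2.9), (2, 5.2), (3, 6.8), (4, 9.1)]  # (x, y) points
sigma = 0.2  # assumed 1-sigma measurement error on each y value

def model(x, slope=2.0, intercept=1.0):  # hypothetical fitted line
    return slope * x + intercept

# chi**2 sums squared residuals in units of the measurement error;
# dividing by (points - fitted parameters) gives chi**2 per degree of freedom.
chi2 = sum(((y - model(x)) / sigma) ** 2 for x, y in data)
dof = len(data) - 2
chi2_per_dof = chi2 / dof
```

A value near 1 means the scatter of the fit matches the claimed errors; a spaghetti-plot width with no such statistic attached carries no comparable information.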

A second blow against the GCMs is what a lot of people with mathematical physics backgrounds have been trying to say, but cannot get through. The grid model of the earth cannot simulate the interdependencies of the numerous coupled nonlinear differential equations that enter the climate problem (a classical dynamic chaos problem). By construction, the grid with the average values assumed in these huge blocks assumes that the solutions to these differential equations are well behaved, so that the first order terms can be used for most of the variables entering (the average is the first order term in an expansion of any function). It is inevitable that this hypothesis will fail, because the solutions are highly nonlinear. That is why for weather and climate GCMs can only work for a limited number of time steps. After a while, the real solution diverges as the higher order terms kick in.
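
The averaging point can be seen in miniature: for any nonlinear response, applying the physics to a cell average is not the same as averaging the physics over the cell. The sketch below uses a T^4 radiative-style response with two made-up sub-cell temperatures purely as an illustration.

```python
# Miniature of the grid-averaging problem: average of the response vs.
# response of the average, for a nonlinear (T^4) function. Numbers illustrative.
temps = [250.0, 300.0]  # two sub-cell temperatures within one grid cell, kelvin

avg_of_response = sum(t ** 4 for t in temps) / len(temps)   # resolve, then average
response_of_avg = (sum(temps) / len(temps)) ** 4            # average, then resolve

gap = avg_of_response - response_of_avg  # information lost by coarse averaging
```

Because T^4 is convex, the gap is strictly positive whenever the sub-cell temperatures differ: the coarse cell systematically underestimates the emitted energy, and the error grows with the within-cell spread.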

The only useful modeling for the future has to be a model that incorporates these nonlinearities, as Tsonis et al have attempted, non linearly.

All I can say is AMEN, and as an engineer who has done incredibly simple mechanical models covering minutes of simulated time, observing the errors involved in even the simplest and shortest of runs, it strikes me as amazing and arrogant that climate modellers think they have an accurate climate model for 100 years out. The models can help you do many useful things, but a model is not a crystal ball, and it is not a soothsaying proxy.

It seems most commentators on this site have the mistaken impression that a model is only useful if it can provide accurate predictions. However, they are missing an important aspect of the modeling endeavor. In many cases the purpose of a model is not so much to make accurate predictions as to provide a means to test the basic assumptions upon which the model is based. In short, a model provides a means to relate one set of observables to another set of observables and to understand these relationships. The model also provides a means to test the limits of these relationships (or assumptions) and to determine when they break down.

In this sense much of the climate modeling work has been hugely successful in that scientists have been able to rule out some unimportant relationships and have greatly expanded our understanding of both climate and weather. They have also been able to test various scenarios and make predictions. Of course if some of the underlying assumptions are incorrect their predictions will be incorrect, but this is still a valuable exercise in that it provides a way to evaluate their assumptions and later analyze which of those assumptions broke down.

Unfortunately, most people don’t grasp the fundamental value of the modeling exercise in expanding our understanding of a phenomenon. Because of this, there is tremendous pressure on modelers to produce predictive models. This can often lead the less careful modelers to overestimate and overstate the abilities of their models and to overlook or even hide known flaws. Furthermore, the situation becomes exacerbated when the media or government officials take modeling results out of context and try to formulate policy based on a hypothetical prediction.

The bottom line is that models are extremely valuable tools even when they are not predictive. However, it is critical to understand their limits and to be very cautious when applying a model’s results outside of these limits.

I think many climate scientists are doing a great job in developing their models. However, it seems to me that some experts in the field are a bit too confident, especially when we all know there are severe limitations for modeling so many incredibly important phenomena, such as cloud formation, humidity, aerosols, etc…

Hi. First off I have to admit that I haven’t read all of the comments to your article, so if someone else has already stated this, well then I obviously agree with them!

‘Climate Change’ has been the international buzzword for what feels like a long time. As far back as I can remember, most of my geography lessons were taken up by the ‘man-made’ phenomena of acid rain, floods caused by the diversion of rivers, massive storms destroying thousands upon thousands of acres of land, and recently global warming. It seems to me that a majority of this is reported on a popularity basis. Since the world economy went into meltdown, I have only seen one report on global warming, and that was simply in response to the presence of a few centimetres of snow falling on a pitifully unprepared England. The headlines themselves are, in my opinion, little more than designed to shock and scare people, proclaiming in big scary letters precisely how many people will drown if (and apparently when) sea levels rise.

These climate models that are spewing out all these predictions about the doom of the earth at the hands of an over-industrialised population should not be taken as an absolute truth, as your article points out. The technology is not nearly accurate enough, and while it is no doubt very complicated, it seems to still be at least a few years away from being useful to the required extent. Not that I am discounting it completely! I just believe that, like all sciences (including my own branch of science, ethology), it needs to be trialled, analysed, reviewed, altered, trialled, analysed, reviewed and so on. I think this technology can be perfected; I just think that is a little way off yet.

My point in brief: just because the weather map says it should be raining on your house, and when you look there is indeed rain falling from the sky, doesn’t mean the weather map is accurate for everyone.

“The only way that the different models (with respect to their sensitivity to changes in greenhouse gasses) all can reproduce the 20th century temperature record is by assuming different 20th century data series for the unknown factors. In essence, the unknown factors in the 20th century used to drive the IPCC climate simulations were chosen to fit the observed temperature trend. This is a classical example of curve fitting or tuning. ”

Roger Sowell (18:58:49) :
The theory and practice of Artificial Intelligence holds that there exist several classes of problems in Camp Two for which there are no known solutions; some are so large that even a computer the size of a galaxy would require millions of years to solve them.

Reminds me of The Hitchhiker’s Guide to the Galaxy. The question posed was “What is the meaning of life?” :)

SemiChemE — Unfortunately, most people don’t grasp the fundamental value of the modeling exercise in expanding our understanding of a phenomenon. Because of this, there is tremendous pressure on modelers to produce predictive models.

Of course people understand this. On the other hand, a model created to understand air flow characteristics over wing shapes isn’t very useful unless the underlying intent is related to better wings. Maybe you’re designing a new wing. Maybe you supply aircraft paint and you want to see if the paint surface messes with airflow at different speeds. And so on. Models are created from the bottom up with the express purpose of understanding and prediction (implied or otherwise). It is therefore not unreasonable to suggest that a working model ought to be predictive.

Furthermore, the situation becomes exacerbated when the media or government officials take modeling results out of context and try to formulate policy based on a hypothetical prediction.

This raises the question of how politicians etc. become aware of models in the first place. Politicians become aware of these models only when they are used as a presentation to make a prediction which impacts policy.

If you have a computer model of X, I have no idea it exists. But one day your boss goes to Congress and says there’s a public problem (Z), and we know this because of our model of X. Now I know the model exists, and since the model was used to claim (Z), I think it’s not unreasonable to suggest that the working model ought to be predictive.

You’re correct to say that models are more complex than they are sometimes presented to be. Gore simplified the message to highlight the importance of carbon emissions, but the truth is that increasing anthropogenic CO2 and natural cyclicity of insolation each affect year-to-year variations in climate and the weather we experience.

That’s pretty simple to understand. The addition of a steadily rising curve with a lower amplitude sinusoidal signal produces a stepped but broadly rising profile — in other words, a general warming trend with broadly predictable (but lesser scale) oscillations around that.
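To make that concrete, here is a toy numerical sketch of a rising line plus a smaller sinusoid (illustrative values only, not a climate model; the trend, amplitude and period are invented for the demonstration):

```python
# Toy illustration: a linear trend plus a smaller sinusoidal
# oscillation still rises over the long run, even though it
# can dip over shorter intervals.
import math

def toy_signal(t, trend=0.02, amplitude=0.15, period=30.0):
    """Trend in degrees/year, oscillation amplitude in degrees (made-up numbers)."""
    return trend * t + amplitude * math.sin(2 * math.pi * t / period)

values = [toy_signal(t) for t in range(0, 61)]

# Over two full 30-year cycles the oscillation cancels out,
# leaving only the underlying trend:
print(round(values[60] - values[0], 2))  # 1.2

# But over a short window the dip can mask the trend entirely:
print(values[20] < values[10])  # True
```

The short-window comparison is the whole point: a downswing of the oscillation can temporarily outweigh the trend without contradicting it.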

The truth is that the cold weather which you highlight so assiduously and correctly on this site is entirely on schedule and as predicted by any scientific model which takes account of those two processes.

I’d warmly congratulate you for reminding us that insolation effects must also be considered in climate modelling, but I’d advise strongly against any kind of hubris that a few recent and impressive spells of cold weather contradict a general warming trend. Because they really don’t.

I think many climate scientists are doing a great job in developing their models. However, it seems to me that some experts in the field are a bit too confident, especially when we all know there are severe limitations for modeling so many incredibly important phenomena, such as cloud formation, humidity, aerosols, etc….

I think you are being too kind, in a situation that requires as much ruthlessness as that shown by the politicized AGW crowd regarding the repercussions of denying energy to billions of people, mainly in the third world.

Climate scientists who do not speak up about the misuse of modeling are just as guilty as the ones pushing modeling as modern-day prophecy, so why treat them with kid gloves? It is not the models that are at fault, but the use made of their outputs.

The real joke is that the climate scientists do not only try to model the climate so they can predict the climate of the future; they even try to “prove” AGW with their models. They always say their models can only simulate the strong warming of recent years when they include a strong CO2 warming term.

It’s easy to model. Just draw a line out 5–10 years that continues the current trend (and maybe nudge it up a little bit if the trend is flat or down) and then sharply veer the trend upwards. Voila! A climate model that shows runaway warming.

roads — That’s pretty simple to understand. The addition of a steadily rising curve with a lower amplitude sinusoidal signal produces a stepped but broadly rising profile — in other words, a general warming trend with broadly predictable (but lesser scale) oscillations around that.

Common M.O. of the AGW adherent is to show a graph just like this. I don’t think you understand what skeptics are saying.

The graph doesn’t show anything more than correlation. Chances are you can plot temps against land use and population growth and get the same curve. But does this imply that people are raising the temps by spewing gases, or are the temps a proxy for land-use changes, with CO2 not part of that equation?

Also, how much of that rise is natural vs. the work of man? Much is made of the ability to detect the fossil-fuel signal in that CO2 plot via analysis of isotopes. And yet this isn’t trumpeted far and wide. Why? Because it’s 3%.

Remember those scientists in Copenhagen saying that global warming has gotten much worse since the last IPCC meeting, as indicated by flat sea level, declining temperatures, increasing Arctic ice, and the end of glacial retreat in Greenland?

The truth is that the cold weather which you highlight so assiduously and correctly on this site is entirely on schedule and as predicted by any scientific model which takes account of those two processes.

I’d warmly congratulate you for reminding us that insolation effects must also be considered in climate modelling, but I’d advise strongly against any kind of hubris that a few recent and impressive spells of cold weather contradict a general warming trend. Because they really don’t.

There is no one contributing to this blog who would dispute the warming since 1850 or so. Nobody. What people are refuting is the Anthropogenic, the ‘A’ in AGW. Even if Gaia had not had a sense of humour, or irritation at the hubris of AGW, and had not sent a negative PDO, the ‘A’ in the IPCC models is nonsense.

1) The tropical troposphere did not warm more than the surface, even during the years of warming.

2) CO2 lags temperature by 800 years in the long term and six to nine months in the short term, again even during the warming years.

3) The humidity did not play ball as the models predicted, even during the years of warming.

In addition, during the years of stasis CO2 has been merrily climbing and is unable to budge the PDO, let alone stop the next ice age, as Hansen had the hubris to suggest a while ago.

So we are getting good PR from the PDO, just as AGW got it when the PDO was warm. So?

have a look at the real prophecy:

and look at the clockwork precision of ice ages: one every 100,000 years recently. We are at a flat top that is lasting much longer than previous ones, oscillating within 1C of a stable temperature. This will not last. That is the true prophecy: there will be another ice age. It may be a thousand years or more away, but that is the real danger for humanity, where thought and resources should be focused. The flat top shows that we will be playing with tolerable temperatures until the grand slide starts.

Steven, your post’s title, “If You Can’t Explain It, You Can’t Model It”, is illogical (blind modeling has led to many useful scientific insights), and your first sentence misidentifies “Global Circulation Models” as Global Climate Models. Your “one small example” of the failure of climate modeling is, unsurprisingly, a rehash of weather as isolated “counter-proof” of climate change.

I’m heartened to see you start referring to climate modelers as “competent and ethical people” though. Perhaps you’re coming to recognize how hard the modelers are working on the sophistication of their models and their predictive value? It’s one thing to say that a particular model is lousy, another thing altogether to offer an improvement…

P.S. Please provide a link to the “fundamental rules of computer modeling” you reference in your post.

It is true that weather and climate patterns, like patterns of movement in complex economies, have so many variables that computer models to predict future trends do not work for long periods of time.

On the other hand, from what I have read, we have some strong indications that global warming is real. Sea level is slowly rising, and the melting of Arctic ice is making life harder for polar bears. They may be in danger of imminent extinction. I have also read that whales are starting to starve to death because of the death of organisms at the bottom of their food chain.

Since I live in a coastal city (Long Beach, California), I am particularly concerned about the rise in mean sea level. From what I have read, glaciers in Switzerland and the Andes are receding, threatening hydroelectric-generation systems in Switzerland and fresh-water supplies in South America. (Changing patterns of rainfall are also threatening fresh-water supplies in Southern California.) And the rise in sea level also is threatening to flood small, low-lying island nations in the Pacific.

While we know that mean sea level is rising, we don’t know whether it will rise by one foot by 2100 or by two or three yards (this apparently depends on how fast the glaciers of Greenland and Antarctica melt).

Any rise at all threatens the continued viability of New Orleans, for example, parts of which lie below current sea level. Right now, it seems to me that it is foolish to try to rebuild New Orleans because it is highly likely that the city will be flooded before the beginning of the next century.

Scientists at the university used a math application known as synchronized chaos and applied it to climate data taken over the past 100 years. Now the question is: how has warming slowed, and how much influence does human activity have?

“But if we don’t understand what is natural, I don’t think we can say much about what the humans are doing. So our interest is to understand — first the natural variability of climate — and then take it from there.

Excellent post. I spent something like 20 years (before I retired) trying to model demand for penicillin-based antibiotics, with a similar lack of success except where it was possible to apply the central limit theorem, when I could get quite close to the next year’s requirements.

It seems to me that climate modelling is full of difficulties, such as not having a clear set of independent parameters that define what the climate’s state is at any one time, or what represents an appropriate time bucket for modelling what is a continuous, probably chaotic process. It seems that climate modellers place great store on temperature measurements, which, of course, are not an independent variable but depend on many other parameters which are themselves not independent, with relationships that are purely empirical. That’s not to say that empirical relationships cannot be useful, but in real science they usually represent a starting point for study, not the conclusion. Correlation is not causality! Without establishing causality, climate models are not worth the paper they are written on, nor the number of trees used in preparing them!

That’s the thing. These models were written from the ground up to simulate climate response to CO2. It’s what they do; why they exist. That anyone is surprised when the answer to the question is CO2 still amazes me.

Exactly! I have often posited that these models produce warming because that’s exactly what they are programmed to do. The models don’t test the AGW hypothesis, they just illustrate how it might work, given various scenarios.

“In terms of computing power, it’s proving totally inadequate. With climate models we know how to make them much better to provide much more information at the local level… we know how to do that, but we don’t have the computing power to deliver it.”

From this, it sounds like they’re saying that they understand more than they can calculate. That is, the explanatory grasp exceeds the computational reach. This is the opposite of the point you make, no?

It is of course true (not to mention trivial) that climate models have improved (and are improving with continuing observation) in providing a complete accounting of all forcings and feedbacks, anthropogenic and otherwise. We’ve known for a long time that there’s a net anthropogenic positive forcing, as predicted by basic physics. That isn’t in doubt at all, and the researchers you yourself cite in the “30 year pause” story have said that continuing to accelerate the anthropogenic forcing will finally result in “aggressive” and “explosive” warming. To repeat, it is one of the experts you’ve cited who has chosen the alarmist term “explosive warming”.

Here laid plain is your gambit: Cartesian skepticism. Your standard is this: if there is any doubt about anything then we know nothing. However, you fail or refuse to acknowledge that uncertainty cuts both ways. A good illustration of this is recent research indicating that Greenland and Antarctic ice sheets may increase sea level rise above the more conservative IPCC predictions. The IPCC refused to make alarmist assumptions about rates of melting, but some recent observations have been alarming.

So, there is uncertainty, but also some important knowledge. Furthermore, uncertainty goes both ways. Maybe ocean currents cause a periodic negative forcing that will pause things before explosive warming resumes, as the research you cite indicates. Or maybe a positive feedback like methane release will be worse than anticipated. Maybe the ice sheets will melt slower, maybe faster. I am not willing to subject billions of human beings to a dangerous experiment just to find out if things will be not-quite-as-bad, about-exactly, or somewhat-worse than the best predictions of the world’s top researchers.

No, it’s not. The additional flux of anthropogenic CO2 is 3% per year. That’s not at all the same thing.

Think of an account paying 3% interest. After 1 year, the proportion of interest to capital in the account is close to 3%. After 10 years, the additional flux of interest to capital is still 3%, but is the proportion of interest to capital in the account still 3%?
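Worked through with round numbers (a sketch of the analogy, nothing more; the starting balance is arbitrary):

```python
# Compound-interest sketch: the yearly flux of interest stays at 3%,
# but the accumulated share of interest in the account keeps growing.
capital = 1000.0  # arbitrary starting balance
initial = capital
for year in range(10):
    capital *= 1.03  # 3% interest added each year

accumulated_interest = capital - initial
share = accumulated_interest / capital
print(round(share, 3))  # 0.256: after a decade, far more than 3%
```

So a constant 3% annual flux accumulates into a much larger fraction of the total, which is the distinction being drawn.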

Yes, that is true — however it’s critical to note that the timescales of the changes in CO2 concentration implicit in the geological record and those which have been recorded since 1850 are of a completely different magnitude.

Under ‘normal’ (pre-anthropogenic) conditions, there is a significant lag between CO2 and temperature. The balance between organic productivity and changing temperature equilibrates over considerable time, all other things being equal.

As for your arguments about ‘the years of stasis’ — I’m not quite clear what you mean here, since CO2 has been climbing more or less continuously. Temperature is subject to superimposed and lower magnitude insolation effects, which can give the appearance of temporary stasis or even a modest and temporary fall in the global temperature curve, but the overall trend is upwards.

All other things are not presently equal. Anthropogenic carbon emissions are increasing by 3% per year. CO2 concentrations are rising by around 2 ppm per year. CO2 concentrations have risen from c.280 to c.390 ppm precisely because the additional flux is greater than the amount which can be accommodated by natural sinks.

If CO2 were falling consistently over time, we might predict a new ice age in a few tens of thousands of years. But on the timescale we are considering, it isn’t — quite the converse is currently true, as above — and so we most certainly can’t.

Obviously you can’t model a physical process which you can’t describe mathematically. Have you ever looked at a GCM to see how they work? Each physical process is broken down into a mathematical model of its behaviour, based on physics. Another fundamental rule is that magic is not permitted in modeling.

Perhaps someone needs to publish the fundamental rules of common sense.

If CO2 were falling consistently over time, we might predict a new ice age in a few tens of thousands of years. But on the timescale we are considering, it isn’t — quite the converse is currently true, as above — and so we most certainly can’t.

CO2 does not change the temperature. The temperatures change CO2 with a lag of 800 years, no matter what Gore told you in that movie.

If CO2 falls below 150 ppm, most plant life will wilt and disappear, so maybe Gaia let us dig up the coal and burn it so that the CO2 trapped there could be released and the green stuff can keep flourishing. Greenhouses, the real ones, flourish with 1000 ppm CO2. The alveoli in our lungs work with something like 8000 ppm, which says something about when lungs evolved: when CO2 was much more abundant (unless you do not believe in evolution).

In contrast to this, all estimates of CO2 doubling give a small rise in temperature, except for the IPCC’s scaremongering with fictitious water vapor feedbacks, which have been disproved by data. A temperature 1C warmer per century is within the flat top of the warm time we are enjoying before the next ice age, and it will come mostly at night, so what’s the problem with having -3C instead of -4C on a winter night?

Nothing will melt because of this small rise in night temperatures, any more than has been melting ever since the Little Ice Age.

I see that the confusion between climate and weather is reflected at the NOAA site, where the National Weather Service has a Climate Prediction Center that offers a 6 Day Outlook.
It is repeatedly said that the climate is an average of weather observations, so, presumably, a climate prediction is an average of weather predictions.
It’s not surprising that climate models are actually weather models.
In my view, climate is not described by an average of the weather but by a generalisation about the weather. It’s the same sort of distinction that eluded the very enthusiastic proponents of Artificial Intelligence and natural language generation.

My L.I.R. (Large Imaginary Rabbit) model shows, after thousands of runs, that on average a 10′ tall pink rabbit will destroy 10 acres of forest per year given the average deposit of 200 colored eggs per acre per year. I can project this with 95% confidence.

Greenhouses, the real ones, flourish with 1000 ppm CO2. The alveoli in our lungs work with something like 8000 ppm, which says something about when lungs evolved: when CO2 was much more abundant (unless you do not believe in evolution).

Dear Anna,

Yes, you’re right; and we blow out 53,000 ppmV of CO2 in a single breath.

If the most important phenomena are well understood and included in the model, then the model can be predictive, as in your “wing” example. However, if many of the phenomena are poorly understood, or are known to pertain only to a restricted regime of the phase space, the model is still useful even though it may not be predictive. Often, in such cases, the model will be predictive for some portion of the phase space, but not for all possible conditions. So, by studying where the model fails, it is possible to understand the limits of the assumptions and how those assumptions break down. It is this understanding of when basic assumptions break down, and how much error this introduces, that is the underlying goal of science.

As for your comment:

“This begs the question of how politicians etc become aware of models in the first place. Politicians become aware of these models only when they are used as a presentation to make a prediction which impacts policy.”

This might be true if science took place in a vacuum. However, it is usually desirable, especially in such a multi-disciplinary field as climate science, for scientists developing models to communicate. This means presenting models, even incomplete models, at conferences and publishing results and techniques in journals. The problem comes when policymakers who are providing funding expect to receive predictive models before the science is sufficiently mature to provide them.

In my experience (modeling, but not climate modeling), what usually ends up happening is that someone (a government official, somebody’s boss or supervisor, etc.) starts asking for predictions based on a model’s results. At first the scientists are pretty good about including caveats about the model’s validity and the accuracy (or lack thereof) of the prediction. However, once the predictions are made they take on a life of their own and get used for a variety of purposes for which they were not intended and for which their validity is extremely uncertain. Often the scientists’ own disclaimers are ignored by the decision makers and others who propagate their work. At this point the cat is out of the bag and there is little the scientist can do to prevent the misuse of his results.

If you can apply them to very well known causal patterns and capacities repeatedly, so that they can be tested again and again against real-world outcomes, you can reach a state where they are accurate predictors of real-world outcomes. (In other words, they and their constituent elements have been subjected to repeated, controlled, empirical tests.) Engineering models are in that situation.

Roads (08:19:26) :
“…
Anthropogenic carbon emissions are increasing by 3% per year. CO2 concentrations are rising by around 2 ppm per year. CO2 concentrations have risen from c.280 to c.390 ppm …”

Perhaps I don’t understand how you arrived at the 3% increase per year. If the annual increase in CO2 is 2 ppm, wouldn’t the increase as a percentage be 2/390 × 100 = 0.5%/year? Much less than the claimed 3%.

Or, maybe alternatively, you are saying that, of the annual 2 ppm, 3% is anthropogenic? If so, then 0.03 × 2 = 0.06 ppm (anthro), with the remaining 1.94 ppm natural?
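For what it’s worth, here is the arithmetic behind both readings, using the figures quoted in this exchange (noting that the 3% stated earlier in the thread referred to the growth rate of emissions, not of the concentration):

```python
# Arithmetic behind the two readings above (figures from the thread).
annual_rise_ppm = 2.0        # yearly rise in atmospheric CO2
concentration_ppm = 390.0    # current concentration

# Reading 1: the annual rise as a percentage of the concentration itself.
concentration_growth = annual_rise_ppm / concentration_ppm * 100
print(round(concentration_growth, 2))  # 0.51 (% per year)

# Reading 2: if 3% of the annual 2 ppm were anthropogenic.
anthro_share_ppm = 0.03 * annual_rise_ppm
print(anthro_share_ppm)  # 0.06 ppm, leaving 1.94 ppm
```

Either way, neither quantity is the 3% figure, which measures something different.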

Power plants and cars exist because people’s livelihoods depend on them. Our ancestors didn’t like freezing in the cold and dark, and didn’t have time for playing mental games about whether or not Gaia is happy about their decision not to freeze.

“In many cases the purpose of a model is not so much to make accurate predictions as to provide a means to test the basic assumptions upon which the model is based. In short, a model provides a means to relate one set of observables to another set of observables and to understand these relationships.”

How do you test the basic assumptions of the model without making some kind of prediction? It sounds like you are describing a process of predicting a value for one or more “observables” for the purpose of testing a presumed relationship between them. If you intend to say that the accuracy of that prediction is of no import, in that even an inaccurate prediction marginally improves your understanding of the modeled system by telling you that something must be wrong, I’d agree, but with the caveat that this argument assumes you can make sufficiently detailed measurements of the real system, along with matching predictions of those measurements, so as to determine which aspects of the model might be correct or incorrect. With existing climate models, I don’t believe that is the case.

Steven. I think that matters are a bit more scientific than “guessing Gaia’s feelings”. Also, I think the alternative to changing the climate is moving into caves. Nevertheless, I understand that your retreat to hyperbole is necessitated by the untenability of your position and that a clever turn of phrase in front of an approving audience can be a source of self esteem. That is, I see why you’d say all that.

However and indeed, a simple notion of “experiment” will suffice to make my point. As has been pointed out, we cannot know with absolute certainty what continuing to increase the amount of CO2 in the atmosphere will mean for sea levels, to focus on one concern. It could be not as bad as the IPCC predicts, or, as recent research indicating accelerated melt rates suggests, it could be much worse. To do the experiment in this case is just to continue to dump increasing amounts of CO2 into the atmosphere. As curious as I am to find out for-absolute-certain whether the [snip] who say sea level will stop rising, the conservative IPCC estimates, or the more alarmist predictions are correct, I am not so curious as to risk great harm to my fellow humans.

“Could the results be used to better tell you what the thermal conductivity of the pipe wall is, if it were not possible to measure that thermal conductivity in the real world?”

“Actually, I believe the answer to that is yes, but only if one is able to measure the outer wall temperature, that is, the edge of the insulation exposed to the atmosphere. If one is unable to measure that outer wall temperature, the problem becomes indeterminate. There would be too many unknown variables, and not enough equations.”

Sorry, I should have been more clear in the hypothetical. Indirect measurements count: if you are able to calculate the thermal conductivity using an already-proven equation along with measured values for other system elements, surely that counts as an ability to measure the thermal conductivity of the real-world pipe.

Steven Goddard (08:32:41): “The EdGCM Project develops and distributes a research-quality global climate model (GCM)” This is not what you referred to in your post, just an after-the-fact Google scrape. It was a GLOBAL CIRCULATION MODEL graphic that you showed in your post and linked to. It is the GLOBAL CIRCULATION MODELS that require massive computational horsepower.

“Obviously you can’t model a physical process which you can’t describe mathematically.” Your post talks about needing to EXPLAIN something before you can model it, now you are falling back to simply needing to be able to DESCRIBE it. This emergence of common sense deflates your whole assault, doesn’t it?

I’d still like to know what the “fundamental rules of computer modeling” you are upholding are. (“Not what those guys are doing” doesn’t count.)

As curious as I am to find out for-absolute-certain whether [snip] who say sea level will stop rising or the conservative IPCC estimates or the more alarmist predictions are correct, I am not so curious as to risk great harm to my fellow humans.

But you are willing, on the off chance that the IPCC is correct, to bring the western economy to its knees, send the populations back to the Middle Ages (life expectancy 35 years), and condemn billions in third-world countries to starvation.

Well, I was using the hot pipe with insulation merely as an example of why we must at times use trial-and-error calculations. For boundary-layer problems such as the cells and grids in GCMs, one presumably assigns a temperature to each wall of a cell (five for ground-level cells, six for all others: one per face of a cube) and solves the various heat- and mass-transfer equations. It may be that the measured ground surface temperature is input into the model as the bottom face of the appropriate cells. Where calculated temperature values do not match those assumed, iterations are required.

This assumes that the equations behave such that each iteration converges to a temperature, and the temperatures do not instead diverge. I believe this is where the chaos ideas enter the problem, but I do not yet know enough about that to discuss it. But I will.
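The convergence question can be illustrated with a bare-bones fixed-point iteration (the heat-balance relation and coefficients here are made up purely for illustration, not taken from any actual GCM):

```python
# Minimal sketch of the trial-and-error (fixed-point) iteration described
# above: guess a wall temperature, recompute it from a heat-balance
# relation, and repeat until the guess stops changing.
def heat_balance(t_wall, t_inside=500.0, t_ambient=300.0, k=0.3):
    # Updated wall-temperature estimate as a weighted balance between
    # the hot interior and the ambient air (hypothetical coefficients).
    return t_ambient + k * (t_inside - t_wall)

t = 400.0  # initial guess
for _ in range(100):
    t_new = heat_balance(t)
    if abs(t_new - t) < 1e-6:
        break
    t = t_new

# Converges because the update shrinks errors by a factor |k| = 0.3 < 1
# each pass; with |k| > 1 the same loop would diverge, which is loosely
# analogous to the stability issues mentioned above.
print(round(t, 2))  # 346.15
```

Whether the iteration settles or blows up depends entirely on that contraction factor, which is the sense in which convergence cannot be taken for granted.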

By the way, glad to see you on WUWT. I am a fellow attorney, chemical engineer, and may soon take the patent bar myself. My work involves climate change law.

A while back I watched an interesting YouTube video http://www.youtube.com/watch?v=xHWTLA8WecI&NR=1 and this post reminded me of a thought I had as I looked at it. Perhaps it would be a more productive use of all that supercomputer power now used for running GCMs, seemingly for the sole purpose of reinforcing the prejudices of the programmers, to feed a continuous stream of all the new technical data we produce every day, along with any data we have previously archived, into these machines and have them run pattern-recognition algorithms to identify possible cycles, periodicities and/or correlations.

The output of this first level of analysis would be fed through further programs designed to separate cycles that were merely coincident from those that had possible cause-and-effect relations. After a cascade of increasingly sophisticated levels of analysis, the output would hopefully help us focus on the most productive areas of inquiry.

The goal of the system would be to create synthetically that which has always marked the best scientists of the past, but which seems to be disappearing in the population of modern scientists, who endlessly proclaim to have the answers to all mankind’s problems. The best scientists of the past recognized that, as they pursued their questions, science would rarely, if ever, provide them with answers; but done properly, it would allow them to ask better and better questions. The ones who are now recognized as geniuses shared one quality: the ability to ask the best questions.

Sorry to be quoting myself, but I accidentally posted this on the wrong thread.

Lots of good points in that comment, and one I want to respond to is the idea of having a theory, and a model for that theory, without corresponding observational data.

The example I have in mind is Einstein’s theory of relativity, specifically the portion that holds that time dilates as velocity approaches the speed of light. There was no way we knew of to obtain data to test that until the Apollo program, when spacecraft reached high speeds just before re-entry into Earth’s atmosphere. As I remember reading at the time, NASA placed one of two identical, synchronized and highly accurate clocks in the spacecraft, with the other on the ground. The difference in time readings after the spacecraft landed was exactly what Einstein’s theory predicted for the velocities achieved.
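For scale, the predicted effect is tiny at spacecraft speeds. Using the standard special-relativity time-dilation factor and an assumed re-entry-class speed of 11 km/s (my number, chosen only for illustration):

```python
# Size of the time-dilation effect at spacecraft speeds, from the
# standard factor gamma = 1 / sqrt(1 - v^2 / c^2).
# The 11 km/s speed is an assumed value, for scale only.
import math

c = 299_792_458.0  # speed of light, m/s
v = 11_000.0       # assumed spacecraft speed, m/s
gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)

# Fractional slowing of the moving clock: about 6.7e-10,
# i.e. roughly a couple of microseconds per hour of flight.
print(f"{gamma - 1:.1e}")  # 6.7e-10
```

Detecting a shift that small is exactly why the test needed extremely accurate synchronized clocks.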

I am certainly no expert in theoretical physics, but from my readings I am led to believe that physicists made good use of Einstein’s theory before the time-dilation tests in the late ’60s.

Another, that you mentioned, is the Bohr theory of the atom. Chemists did good work with that theory, even though the later theories of shells (s, p, d, and f, if I recall correctly) gave better agreement with experimental data.

The opposite is true, of course, as you explained regarding the heliocentric theory of the solar system.

I am not a scientist nor a meteorologist by any stretch of the word. I claim no expertise on these matters beyond common sense and a deep love for my planet.

From what I have read on this blog (and I could be wrong here) it seems to this layman that Watts and his guest writers are attempting to explain away global warming as a bunch of liberal huey.

Perhaps this brouhaha is exaggerated; since when did the media in all its spectrums (right, left, conservative, liberal, counter-culture, etc.) ever turn down an opportunity to sensationalize a subject? That being said, however, I find it hard to accept that there will be no repercussions for our oil addiction.

I have read both sides of the debate, from natural variant models to human green-house emissions and while not claiming a text-book knowledge of the complex algorithms, feel I have an educated layperson’s understanding.

Either way one looks at the issue, the carbon emissions we pump into our atmosphere, along with the pollution from chemical plants, over-fishing of the oceans and deforestation of old growth forests cannot be beneficial to our quality of life or the balance of the ecosystem.

I admire that this blog is seeking to present the other side of the debate, but I cannot believe that our destructive habits will have no effect upon our Blessed Planet.

“From what I have read on this blog (and I could be wrong here) it seems to this layman that Watts and his guest writers are attempting to explain away global warming as a bunch of liberal huey.”

You are wrong here: it is not liberal huey we object to, but unscientific liberal huey. There are now so many of us living on the planet that we will cause damage to it at some point; so if you can hatch a clever plan to get rid of around 50% of the population, there might be a chance to return it to the pristine state you seek. The carbon emissions, which to this day have never been PROVEN to cause the dramatic warming claimed, are the least of our problems.

It is wonderful that you are concerned about the environment, but what does overfishing have to do with GCM’s? Do you think a cap and trade scheme will reduce pollution from chemical plants?

By lumping unrelated environmental issues together in one pot with CO2, unscrupulous people are preying on concerned citizens like yourself. Suppose the protesters had been successful in shutting down the Capitol Power Plant. How much immediate human misery would that have caused?

“…but I cannot believe that our destructive habits will have no effect upon our Blessed Planet.”

I agree that the planet is blessed, but who has it been blessed by? I believe that the planet has been blessed by the appearance of man. Just as we are blessed to have this home. Not one climate realist wants to harm our planet.

Mother Earth loves CO2. The paltry 400 ppm or so that exists now is a meager amount for a biosphere that thrives on 1,000 ppm and more. Open the floodgates of CO2 and watch our planet return to all her green glory!

Don’t let the politicians take us into a backwards dying earth. Move forward.

“Selfishness is the bedrock on which all moral behavior starts and it can be immoral only when it conflicts with a higher moral imperative. An animal so poor in spirit that he won’t even fight on his own behalf is already an evolutionary dead end; the best he can do for his breed is to crawl off and die, and not pass on his defective genes.”-Robert A. Heinlein

“the carbon emissions we pump into our atmosphere, along with the pollution from chemical plants, over-fishing of the oceans and deforestation of old growth forests cannot be beneficial to our quality of life or the balance of the ecosystem.”

Not all man’s activities are harmful to the earth and its inhabitants. See this earlier post on WUWT, specifically my comment at “Roger Sowell (19:06:50) :”

In addition, for example, man built dams to prevent rivers from flooding. A side-effect is slower running rivers, with less erosion and scouring of the river bed. It also provides much more water for aquatic species, such as fish. Is that a harm or benefit?

Niagara Falls has much less water flowing, because a good portion of it is diverted to produce hydroelectric power. The result is the soft rock of the falls is not crumbling nearly as fast as it would otherwise. Is this a harm, or a benefit?

Man built bridges over rivers and chasms, allowing no contact with the river below while crossing. Is that a harm or benefit to the environment, from the perspective of the river?

Man cultivated vast areas of land, in the case of the U.S. Great Plains, busting up thick sod root systems to gain access to fertile soil. Such soil became home to billions of micro-organisms while useful crops are obtained. Is that a harm or benefit?

In Southern California and other areas of the dry West, since this was a desert before being settled, there were essentially no trees before building cities. Now, millions of trees, shrubs, and countless acres of grass and crops grow; all were planted by man. Is this a harm, or a benefit?

Please don’t believe all the garbage the environmentalists produce. Man has done, and continues to do, much that is good.

nakedwoadwarrior (14:40:22) :
I cannot believe that our destructive habits will have no effect upon our Blessed Planet.

This is the basis of the Green argument, in a nutshell. The rest is details.

From what I have read on this blog (and I could be wrong here) it seems to this layman that Watts and his guest writers are attempting to explain away global warming as a bunch of liberal huey.

Sometimes it does, but not everybody is on the same page, not everybody is using the same book. Instead of sticking to the science, some comment on the motivation of the other side. The funny thing is neither side understands the true motivation of the other. liberal huey, socialist huey, tax grab huey, big oil huey, libertarian huey, whatever, it’s irrelevant. What matters is one side thinks the planet needs saving and the other side doesn’t. That should not be a religious or political difference, that should be a scientific question that we should get an honest scientific answer to.

Reply: Before this gets perpetuated any longer, the word is hooey ~ charles the moderator

Another much larger example is that the GCMs would be unable to explain the causes of ice ages. Clearly the models need more work, and more funding.

Someone needs to tell them that Milankovitch originally ran his model on paper, with an ink pen, by hand. Yes, it was pages and pages of calculations done by hand over several years that led him to his conclusion about what drove ice ages. By all reasonable standards it was a mathematical model, just lacking the computer part. You could calculate the same thing on any modern PC in darned near no time. They are just shilling for cash. IMHO, of course.
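As a rough illustration of how cheaply a Milankovitch-style calculation runs today: the toy sketch below simply superimposes the three textbook orbital periodicities (roughly 100, 41, and 23 kyr). It is only an indication of the scale of arithmetic involved, not Milankovitch's actual insolation computation:

```python
import numpy as np

# Toy Milankovitch-style index: superimpose the three textbook orbital
# periodicities.  Illustrative only, not a real insolation calculation.
kyr = np.arange(0.0, 800.0, 1.0)           # 800,000 years in 1 kyr steps
index = (np.cos(2 * np.pi * kyr / 100.0)   # eccentricity, ~100 kyr
         + np.cos(2 * np.pi * kyr / 41.0)  # obliquity, ~41 kyr
         + np.cos(2 * np.pi * kyr / 23.0)) # precession, ~23 kyr
# 800 points with three trig calls each: trivial work for any modern PC,
# versus years of hand calculation with an ink pen.
print(len(index))   # 800
```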

AdrianS (10:03:15) : Did any of the Governments / top financial institutions predict the global financial problems until they hit them? No.
The financial system should be a much easier system to model than climate, and there would be loads of money and brainpower swilling around the trough of banks/big finance to try to make such predictions if it were possible.
But no; with all these highly paid financial whizz kids and all the money put in, no one said “Houston, we have a problem.”

But in fact, it was predicted. I watched at least three guests interviewed on financial news shows predict it, and at least one fellow is known to have made a few hundred million to a few billion dollars from shorting financial companies during the meltdown.

You are correct, though, that it was not predicted by a computer model, nor by a Government Agency, nor a major financial institution.

It was predicted by bright individuals using their brains and looking doggedly at the facts as everyone around them said “What problem? Everyone agrees there is no problem. You must be wrong.” I think one of the predictors was named Peter Schiff; see:

And I remember “Dr. Doom” of the Gloom, Boom & Doom (or some such ;-) report doing the same.

But the wiz kids with the financial models, yeah, they all thought that whatever was happening now would continue into the future. “Edge Effects” are one of the major failure modes of programs. They ignored edge effects. Brittle failure due to non-linear behaviours is another they don’t handle well. Oh, and they don’t do well with stochastic resonance either…
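The "edge effect" failure mode described above can be shown in a few lines: fit a model inside the calm, near-linear regime, then ask it about conditions past the edge. The saturating curve here is purely illustrative, not any particular financial or climate quantity:

```python
import numpy as np

# The underlying process is nonlinear (it saturates), but the model
# is trained only on the calm, near-linear early interval.
x = np.linspace(0.0, 5.0, 50)
truth = np.tanh(x)                     # regime change past the "edge"
train = x < 2.0                        # the only part ever observed
slope, intercept = np.polyfit(x[train], truth[train], 1)

# In-sample the linear model looks fine; past the edge it fails badly.
in_sample_err = abs((slope * 1.0 + intercept) - np.tanh(1.0))
edge_err = abs((slope * 5.0 + intercept) - np.tanh(5.0))
print(in_sample_err < edge_err)        # True: failure at the edge
```

The model "thinks" whatever was happening in the training interval will continue forever; the error only becomes visible once reality leaves that interval.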

The present fiasco of the “Mark to Market” rule is a similar pile of brown doo. It works well to value a mortgage based on the price in the market most of the time when things are normal but it is incredibly stupid to require banks to say that all their mortgages must be valued at what they could be sold for today at 4 pm when the market is frozen. There is a stock trader saying “Markets can stay irrational longer than you can stay solvent” that the politicians are learning using your money… All that is needed to make the banks solvent again is to remove “Mark to Market” and go back to what the accounting rule was 20 years ago (or whenever some whiz kid got the bright idea of pretending to sell every home in America at the same time to get a value for mortgages…)

If the only homes that sold were foreclosures at 25% of loan value, does that really mean that all the other performing loans are now worth only 25% of loan value? That is what “Mark to Market” says…

Why would climate modellers be any better?

I would expect them to do worse, given their lousy input data and broken models (clouds, what clouds? GCR? Solar variation in TSI & Mag field and?)

I would also expect an experienced meteorologist using his brain and looking at the data to see the mistakes long before any computer model would “get it”.

To anyone who doesn’t know me: This is not computer model phobia on my part. I managed a supercomputer site doing plastic flow modeling and had a PhD run his cloud models on my machine for a few dozen runs. It is familiarity with computer models that leads me to this belief. They inform your ignorance more than they provide truth. They can let you vary one or two variables in a very, very well understood simple system and get an idea what will happen; but they cannot tell you a thing about very complex systems that are poorly understood. The ultimate test of a new wing design is the airplane, and just before that, the wind tunnel… There is a reason they still use wind tunnels at NASA… And wing air flow is vastly more clearly understood than climate.

Computers do a really good job of repetitively doing very well characterized things over and over again. They are fairly lousy at doing vague things with any hope of any answer at all, and if you do get an answer, it is almost guaranteed to be wrong. Weather and Climate are only vaguely understood and have very non-linear non-deterministic behaviours. Models of that kind of system will fail. Often spectacularly, and most often at inflection points or other “edge effect” points.

Luckily, people handle those modestly well. That’s what experts with 40 years of experience are for…

Maurice Garoutte (11:04:05) : Just use a neural net and the model can get very good results on the training data set. Even without a neural net it takes very little skill to tweak sensitivities and relationships to obtain good correlations with the training data set.

An example I used in class was a military project to spot Russian tanks. The neural net was trained and showed 100% reliability in spotting Russian tanks and discriminating them from U.S. tanks. Someone was either suspicious or curious, so they started chopping up the pictures to see what part of the tank the neural net was using to make its decision. Eventually they chopped so small that the only thing left was some of the shrubs the tank had been near, and it STILL had 100% accuracy at spotting the pictures that had had Russian tanks in them (but did not any more…)

Eventually they figured out that the neural net had learned to spot grainy vs. non-grainy pictures. The Russian tanks were on high-speed spy film while the U.S. tanks were on fine-grained pro stock…

IMHO, the Climate models have learned to see the CO2 grain… They have been tuned to “model the data”…
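The tank anecdote is easy to reproduce in miniature. In this sketch (entirely synthetic data, not the actual military study), the "pictures" of the two classes differ only in film grain, and a trivially "trained" discriminator scores almost perfectly without ever seeing anything tank-shaped:

```python
import numpy as np

rng = np.random.default_rng(1)

def picture(is_russian, n_pixels=100):
    # The "tank" is identical in every picture; only the film grain
    # (pixel noise level) differs between the two classes, as in the
    # anecdote above.  Purely synthetic, illustrative data.
    grain = 1.0 if is_russian else 0.2
    return rng.normal(0.0, grain, n_pixels)

labels = np.array([0, 1] * 200)
pics = np.array([picture(c) for c in labels])

# "Training" finds the single most discriminating statistic, and it is
# graininess, not anything tank-shaped.
graininess = pics.std(axis=1)
threshold = 0.5 * (graininess[labels == 0].mean()
                   + graininess[labels == 1].mean())
accuracy = ((graininess > threshold) == labels).mean()
print(accuracy)   # ~1.0: near-perfect scores without ever seeing a tank
```

That is the sense in which a model "tuned to the data" can validate beautifully against its training set while learning nothing about the physics.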

John Philip (12:35:05) : The midrange scenario A2 projects a temperature rise from 1990-2010 of 0.35C, equivalent to a linear increase of 0.175C per decade. How are they doing? Well the trends in the 4 main indices since 1990 are:

UAH 0.168
Hadley 0.171
GISS 0.183
RSS 0.185

Please explain what mathematical principle allows the accurate calculation of a temperature or anomaly to 1/1000 C from raw data originally recorded in whole degree F precision (and thus at most 1 F accuracy in the record) ?

All of these numbers are just playing in the error bands of calculated fantasies. With original data in whole degrees F, that is the most accuracy you can compute from it.

Do these observations increase or decrease our confidence in the models?

They cause my confidence to plunge to great depths. These folks can’t even master the point of “Never let your precision exceed your accuracy!” – Mr. McGuire. So I’m certain they will be oblivious to the subtleties of accuracy erosion in repeated data-type conversion (as seen in GIStemp), mixed data-type math (as seen in GIStemp), repeated chain calculations with floating-point number types, and a dozen other ways that computer calculations slowly lose the low-order bits of precision.

When they don’t even know that they have no accuracy to the right of the decimal point, I can find no reason to trust that they have a clue what they are doing.
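The precision-versus-accuracy point is simple to demonstrate. In this sketch (synthetic temperatures, illustrative only), readings quantized to whole degrees F carry an irreducible uncertainty of up to roughly 0.28 C each, so quoting derived anomalies to 0.001 C is precision the underlying record cannot support:

```python
import numpy as np

rng = np.random.default_rng(2)

# A year of "true" temperatures, recorded only to whole degrees F.
true_f = rng.uniform(30.0, 70.0, 365)
recorded_f = np.round(true_f)                # +/- 0.5 F quantization

true_c = (true_f - 32.0) * 5.0 / 9.0
recorded_c = (recorded_f - 32.0) * 5.0 / 9.0

# Each converted reading can be off by up to ~0.28 C, so quoting a
# derived mean or trend to 0.001 C exceeds the data's accuracy.
worst_reading_error_c = np.max(np.abs(recorded_c - true_c))
print(round(worst_reading_error_c, 2))       # close to 0.28 C
```

Averaging many readings shrinks the random part of this error, but only under assumptions (independent, unbiased rounding) that a real station record does not guarantee.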

This thread is titled ‘If You Can’t Explain It, You Can’t Model It’, so I am surprised that nobody in the discussion has raised an issue I have repeatedly stated; viz. the lack of evidence for the basic assumption used in the climate models.

The climate models assume that change to climate is driven by change to radiative forcing. And it is very important to recognise that this assumption has not been demonstrated to be correct. Indeed, it is quite possible that there is no force or process causing climate to vary. I again explain this as follows.

The climate system is seeking an equilibrium that it never achieves. The Earth obtains radiant energy from the Sun and radiates that energy back to space. The energy input to the system (from the Sun) may be constant (although some doubt that), but the rotation of the Earth and its orbit around the Sun ensure that the energy input/output is never in perfect equilibrium.

The climate system is an intermediary in the process of returning (most of) the energy to space (some energy is radiated from the Earth’s surface back to space). And the Northern and Southern hemispheres have different coverage by oceans. Therefore, as the year progresses the modulation of the energy input/output of the system varies. Hence, the system is always seeking equilibrium but never achieves it.

Such a varying system could be expected to exhibit oscillatory behaviour. And, importantly, the length of the oscillations could be harmonic effects which, therefore, have periodicity of several years. Of course, such harmonic oscillation would be a process that – at least in principle – is capable of evaluation.

However, there may be no process because the climate is a chaotic system. Therefore, the observed oscillations (ENSO, NAO, etc.) could be observation of the system seeking its chaotic attractor(s) in response to its seeking equilibrium in a changing situation.
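The idea that a deterministic system with constant forcing can still wander irregularly is easy to illustrate. The logistic map below is not a climate model, just the standard textbook example of a chaotic attractor:

```python
# The logistic map: fully deterministic, constant parameter r, yet it
# wanders irregularly around its chaotic attractor with no external
# forcing at all.  (A textbook toy, not a climate model.)
r = 3.9            # fixed "forcing": nothing external ever changes
x = 0.4
trajectory = []
for _ in range(200):
    x = r * x * (1.0 - x)
    trajectory.append(x)

# Discard a spin-up, then measure how much the series still varies.
spread = max(trajectory[100:]) - min(trajectory[100:])
print(spread > 0.5)   # True: large internal variability, zero forcing change
```

An observer of such a series could easily attribute its swings to some external cause when none exists, which is exactly the attribution problem raised above.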

Very importantly, there is an apparent ~900 year oscillation that caused the Roman Warm Period (RWP), then the Dark Age Cool Period (DACP), then the Medieval Warm Period (MWP), then the Little Ice Age (LIA), and the present warm period (PWP). All the observed rise of global temperature in the twentieth century could be recovery from the LIA that is similar to the recovery from the DACP to the MWP. And the ~900 year oscillation could be the chaotic climate system seeking its attractor(s). If so, then all global climate models and ‘attribution studies’ (utilized by James Hansen, the IPCC, the CCSP, etc.) are based on the false premise that there is a force or process causing climate to change when no such force or process exists.

But the assumption that climate change is driven by radiative forcing may be correct. If so, then it is still extremely improbable that – within the foreseeable future – the climate models could be developed to a state whereby they could provide reliable predictions. This is because the climate system is extremely complex. Indeed, the climate system is more complex than the human brain (the climate system has more interacting components – e.g. biological organisms – than the human brain has interacting components – e.g. neurones), and nobody claims to be able to construct a reliable predictive model of the human brain. It is pure hubris to assume that the climate models are sufficient emulations for them to be used as reliable predictors of future climate when they have no demonstrated forecasting skill.

TonyB, you asked for a clever scheme to solve the inevitable problems caused by overpopulation? Well, there is one idea, as my little brother would tell you: “Have the hungry eat the homeless.” I hope you will forgive me for trying to inject a little humour into an otherwise serious discussion.

For a long time now there have been too many humans on the planet, which has caused a lot more problems than just added CO2 emissions. For instance, the destruction of natural habitat to make way for villages leads to animals such as tigers hunting closer to human populations, thus increasing the chances of a hungry tiger catching an unsuspecting human child. This results in the tiger being blamed for being an evil man-eater that went out with the intention of hunting humans. Not the case, I’m afraid; it was just hungry and there were no natural prey animals left.

To bring it back to the topic at hand, the human blessing of sentience has resulted in humans constantly worrying about their current situation. A ‘caveman’ worrying about being cold eventually led to the ‘invention’ of fire in a useful format, which ultimately led to the creation of central heating systems in the modern day. Ancient nomadic populations got tired of following migratory animals to ensure a constant food supply and managed to learn the craft of farming, which led to populations getting larger, villages turning into towns, and ultimately cities, as humans learned to be self-sufficient. Humans worried about not being able to get around fast enough on horseback and eventually managed to perfect the internal combustion engine.

And now humans are worrying about climate change, so they have invented (or are trying to invent) ways of predicting precisely what will happen in the near future. I wonder what will be invented to ‘cure’ the problems we ourselves have created? Will we eventually be going down the ‘Lost in Space’ route and simply looking for another planet to mess up in the name of self-preservation?

Roads (08:19:26) :
“…Anthropogenic carbon emissions are increasing by 3% per year. CO2 concentrations are rising by around 2 ppm per year. CO2 concentrations have risen from c.280 to c.390 ppm …”

Perhaps I don’t understand how you arrived at the 3% increase per year. If the annual increase in CO2 is 2 ppm, wouldn’t the increase as a percentage be 2/390*100 = 0.5%/year? Much less than the claimed 3%.

Or, alternatively, are you saying that, of the annual 2 ppm, 3% is anthropogenic? If so, then 0.03*2 = 0.06 ppm (anthro) and the remaining 1.94 natural? What?

Ross
Just to clarify — you need to read the text.

Anthropogenic CO2 emissions are increasing at 3% per year.
Note that’s a statement about emissions and how they are changing over time.

CO2 concentrations are increasing at around 2ppm per year.
That statement records how CO2 concentrations are progressively rising.
The rise in CO2 concentrations which has occurred in a geologically instantaneous timescale since 1850 has resulted because a proportion of the additional CO2 added to the atmosphere by anthropogenic CO2 emissions (which themselves are rising, as above) can not be absorbed by natural sinks.
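The arithmetic distinction between the two figures can be checked in a few lines, using the round numbers quoted in the comments above:

```python
# Round numbers from the comments above (both are approximations).
concentration = 390.0    # ppm CO2 in the atmosphere
annual_rise = 2.0        # ppm added per year

# Growth rate of the CONCENTRATION, as a percentage:
concentration_growth = annual_rise / concentration * 100.0
print(round(concentration_growth, 2))   # ~0.51 % per year

# The 3%/yr figure describes EMISSIONS: if this year's emissions are
# E, next year's are about 1.03 * E.  Compounded over a decade:
decade_factor = 1.03 ** 10
print(round(decade_factor, 2))          # emissions ~1.34x after 10 years
```

So a 0.5%/yr rise in concentration and a 3%/yr rise in emissions are not in conflict; they describe different quantities.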

anna v (08:57:30) : Nothing will melt because of this small rise in night temperatures more than it has been melting ever since the little ice age.

All of the available evidence from Arctic ice cover and glacial retreat indicates that’s clearly not true, and a cold northern hemisphere winter in 2008-9 does not change this reality. The present conditions result from a small and entirely predictable step on the overall warming curve, as clearly explained above.

Anthony is correct to remind us that solar effects have the power to influence global climate — indeed they have been doing so for over 4 billion years. Likewise CO2 concentrations have changed across the geological past in response to changes in temperature and the evolution and development of the plant ecosystem on this planet.

However, the geologically instantaneous rise in CO2 concentrations which has been observed since 1850 is unprecedented throughout the whole of that long Earth history. CO2 concentrations are rising progressively and rapidly at present since the scale and rapidity of anthropogenic CO2 input is beyond the capacity of natural processes to mitigate — and the ongoing destruction of equatorial forests further reduces that capacity.

The thermal effects resulting from those rapidly escalating CO2 concentrations are both predictable and manifestly observable in the rapid changes we have seen on this planet. No one is denying that the past winter has been cold, but that does not mean that global warming is not happening — far from it — and in fact the current conditions are exactly what we would look for to demonstrate that our combined CO2 / insolation models are correct.

Roads said: “…CO2 added to the atmosphere by anthropogenic CO2 emissions (which themselves are rising, as above) can not be absorbed by natural sinks.”

Unfounded opinion. Prof. Freeman Dyson explains that insufficient study has been done on this question, and that in all probability the biosphere is in fact adjusting by taking advantage of the additional availability of plant food.

Observed global warming, as minor as it is, is primarily the result of the planet emerging from the last Ice Age. The effect of CO2 on temperature is much, much smaller than the IPCC claims, and it is clearly overwhelmed by other factors.

Finally, the ‘rapid changes’ referred to are primarily the result of more attention being focused on the [repeatedly falsified] AGW/CO2 hypothesis, and an increase in the number of recordings being taken. If these ‘rapid changes’ were due to rapidly escalating CO2 concentrations, then with all this putative rapidity, the planet’s temperature should be rising fast. Instead it is declining, therefore CO2 can not be the culprit. QED.

I’m not clear why people keep bringing up scenarios B and C. They were based on much lower CO2 levels and are not valid. (By plugging in carefully chosen inputs, you could make the models do anything you want.)

The only scenario which uses values close to current CO2 levels is A, and that one forecast much more warming than has been observed.

i answered your question: what model shows cooling. so have you calculated some of the TREND lines i gave you?

the models show cooling periods, that are similar to this one. that is a FACT.

the Hansen models are from 1988. they aren’t perfect, but reasonably good. that is another fact…

The model can show anything you want if you use irrelevant input data. Your argument makes no sense, because the assumptions for scenario B are not based on what has actually happened to atmospheric CO2. Plug the right numbers in and you can get an ice age if you like.

With current CO2 levels, Hansen’s model shows nearly 1C of warming since 1990. (That is what scenario A is.) Even irrelevant scenario B shows nearly 1C warming over the last decade since 1998.

There is a reason why Hansen keeps forecasting the hottest year ever nearly every year. It is not because he expects cooling.

However, the geologically instantaneous rise in CO2 concentrations which has been observed since 1850 is unprecedented throughout the whole of that long Earth history.

So, if I understand correctly, you are saying that the Earth’s geologic history of billions of years is so well known that the change in CO2 in the last 158 years is unprecedented for any other given 158 year period?
Somehow I doubt that assertion.

nakedwoadwarrior (14:40:22) : Either way one looks at the issue, the carbon emissions we pump into our atmosphere, along with the pollution from chemical plants, over-fishing of the oceans and deforestation of old growth forests cannot be beneficial to our quality of life or the balance of the ecosystem.

I admire that this blog is seeking to present the other side of the debate, but I cannot believe that our destructive habits will have no effect upon our Blessed Planet.

First, it is great that you are attempting to look at both sides of “the issue”. But what, exactly, do you think the issue is? Because it appears to me you have allowed yourself to be hornswoggled by the Alarmist side into believing that we Climate Realists do not care about the other issues you mention, such as pollution, deforestation, overfishing, etc. That is a complete lie, and just one of many fibs they tell.
It is all a big ruse, in an effort to steer your attention away from the ACTUAL issue, which is whether or not CO2, and specifically man’s contribution to CO2, is driving, or indeed has ever driven, climate change, and if so, to what extent? The answer, you will find, is that CO2 on the whole has only a minor effect on climate, and man’s contribution of CO2 has even less effect. Its biggest effect, in fact, is the extremely beneficial one of increased plant growth.
The political stuff is basically a sideshow, yet important because of the harm already being caused, and the ENORMOUS HARM which will be perpetrated on already-weakened economies FOR NO REASON. This is not a Liberal vs. Conservative issue; indeed, many of us come from Liberal backgrounds, and only investigated because we were curious and interested in the science. I urge you to do the same. For starters, if you haven’t already, check out Lucy Skywalker’s site, which explains things in layman’s terms, with some further good sites linked.

The model can show anything you want if you use irrelevant input data. Your argument makes no sense, because the assumptions for scenario B are not based on what has actually happened to atmospheric CO2. Plug the right numbers in and you can get an ice age if you like.

Steven you asked a clear question:

Which model predicted that?

i gave you a clear answer: the models DO show such time spans. it is a fact.

your answer:

The only scenario in Hansen’s graph you linked which showed a seven year cooling trend was Scenario C, which is based on low CO2 increase. The actual CO2 increase has been higher than scenario A.

and this is false. scenario B shows long cooling periods as well.

now you are trying to shift the discussion to the differences between models A, B and C.

but those are irrelevant to your question and to your claim. i answered your question and i contradicted your claim. fact.

Since there appears to be a bit of down-time in the urgent exchange of cutting-edge science information, I thought a follow-up report about the little purple birds was in order. After the sad report of the freezing death of numerous little purple birds (LPBs) in a blizzard while heading toward Milwaukee, localities in SW Missouri have reported “scads of them” left over. Indeed, the Conservation Status of this species is (IUCN 3.1) Least Concern, or, for those unfamiliar with this scale, three whole levels above American taxpayers.

Jubilant carwash owners have taken to leaving bowls of highly colored berries for the birds to eat, thus preserving this species and maximizing carwash profits. LPBs have also been called purple finches, the state bird of New Hampshire. Shortly after being named the NH state bird, the birds’ numbers plummeted, causing alarmist birders to blame the less flamboyant and harder-working House Finch for competitive endangerment. This was accompanied by a mysterious change in the coloration of the LPB females’ plumage toward distinctly greenish hues. Traditionalist birders noted that as the females became more green, they became shriller and much more irritable, thus preventing successful mating. Gender-confused males, stumped by the color changes, have been known to wander helplessly in urban areas, voicing their mournful cry “Baaar-ney Fraaank!”

Knowledgeable zoologists have assured the anxious populace that the LPBs are successfully migrating toward Milwaukee in time for St Patrick’s Day, and should return to Missouri and the safety of their Bible Belt winter habitat very soon, “to sober up.” Another global warming crisis has been narrowly averted.

Hey thanks for the welcome! I do find this blog interesting and shall have to continue reading it.

Forgive my ignorance on this issue, as I said I am not a scientist by degree but rather by praxis in that I am always “seeking to know”.

I apologize if my lack of knowledge on the issue has caused me to come off as espousing “Alarmist” ideologies. Of course, the “Alarmist” side has been the only one I have been exposed to as of yet, so I am interested to learn the science behind the “Realist” “side”.

While I am interested in science, my Earth-centric religious views also provide impetus for gleaning further knowledge of the issues at hand.

I look forward to reading more, and once again, thank you all for welcoming me to the discussion!

Hey, WoadWarrior!
I’m glad you found us, too. Don’t be concerned about not having the same decision-making information as some others here. It will come by osmosis, just reading here. The bottom line is, doubt everything Authority tells you until you can verify it. Pretty simple, really. So, welcome.
One more thing, nakedwoadwarrior. When one of your crazy buddies tries to talk you into a Mel-Gibson-total-blue-body-paint-outfit for Halloween, smack him in the head. Woad is quite astringent. It shrivels things, nakedwoadwarrior. Things you don’t want shriveled. Words to the wise.
Regards, Henry Phipps, MD

“The climate system is seeking an equilibrium that it never achieves. The Earth obtains radiant energy from the Sun and radiates that energy back to space. The energy input to the system (from the Sun) may be constant (although some doubt that), but the rotation of the Earth and its orbit around the Sun ensure that the energy input/output is never in perfect equilibrium.

“The climate system is an intermediary in the process of returning (most of) the energy to space (some energy is radiated from the Earth’s surface back to space). And the Northern and Southern hemispheres have different coverage by oceans. Therefore, as the year progresses the modulation of the energy input/output of the system varies. Hence, the system is always seeking equilibrium but never achieves it.

“Such a varying system could be expected to exhibit oscillatory behaviour. And, importantly, the length of the oscillations could be harmonic effects which, therefore, have periodicity of several years. Of course, such harmonic oscillation would be a process that – at least in principle – is capable of evaluation.

“However, there may be no process because the climate is a chaotic system. Therefore, the observed oscillations (ENSO, NAO, etc.) could be observation of the system seeking its chaotic attractor(s) in response to its seeking equilibrium in a changing situation.”

I’m thankful to read that, because it provides thoughtful and well-informed backing for the hand-waving guess I posted yesterday:

“Indeed, it seems to me that there needn’t be any external cause for the variation–variation might be generated internally as the means by which a system’s stability is attained. A tight-rope walker maintains his balance because he constantly counterbalances himself back-and-forth against his balance-pole. If he were forced to eliminate his wobbling and walk steadily, he would fall. (I owe this insight to an essay by a Hungarian author in a book of his odd-ball essays published within the last ten years whose title I’ve forgotten.) Similarly, in a dynamical system with long-term feedback loops and natural counter-balances and counter-counter-balances, multi-decadal zigs and zags are probably (IMO) part of an overall equilibrium.”

Given the seven year drop in temperatures, I take it that you are predicting a rapid rise over the next year or two to get back on the scenario A curve?

For new readers, and ‘dealing with reality’: in 1988 Dr Hansen testified to Congress on global warming and, as part of the presentation, submitted projections of the change in global temperatures from an early version of the NASA climate model under three scenarios, A, B and C, which can be simplified to ‘High’, ‘Medium’ and ‘Low’ [this is a gross simplification – see the original paper for details]. In practice the temperature projections under Scenario B have proven to be extraordinarily accurate – the difference between the projected trend and the actual trend is actually less than the observational uncertainty, or to put it another way, a ‘perfect’ model would not have done better.
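A trend comparison of the sort described here is straightforward to reproduce. The sketch below uses ordinary least squares on entirely synthetic anomaly series (the numbers are made up for illustration – they are not real GISTEMP or scenario values); it just shows how a projected-versus-observed trend comparison is computed.

```python
# Illustrative only: how a projected-vs-observed temperature trend comparison
# is done with ordinary least squares. All data below are synthetic.
def ols_slope(years, values):
    """Least-squares slope of values against years (deg C per year)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

years = list(range(1984, 2009))
# Hypothetical observed anomalies: ~0.02 C/yr trend plus year-to-year wiggle.
observed = [0.02 * (y - 1984) + 0.05 * ((-1) ** y) for y in years]
# Hypothetical model projection running slightly warm at 0.022 C/yr.
projected = [0.022 * (y - 1984) for y in years]

print(ols_slope(years, observed))   # recovers the underlying 0.02 C/yr
print(ols_slope(years, projected))  # recovers 0.022 C/yr
```

Whether such a trend difference is meaningful then depends on comparing it to the observational uncertainty, as the comment above notes.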

Now in some quarters this evidence of the predictive skill of a climate model has not gone down well, and there have been some attempts to rewrite history, notably by Michael Crichton and also over at Climate Audit, to the effect that Hansen actually believed Scenario A was most likely, or Mr Goddard’s claim that A most accurately tracks the actual forcings. Fortunately we have Hansen’s testimony …

These scenarios are designed to yield sensitivity experiments for a broad range of future greenhouse forcings. Scenario A, since it is exponential, must eventually be on the high side of reality in view of finite resource constraints and environmental concerns even though the growth of emissions in scenario A (~1.5%/yr) is less than the rate typical of the past century (~4%/yr). Scenario C is a more drastic curtailment of emissions than has generally been imagined. It represents elimination of chlorofluorocarbon (CFC) emissions by 2000, and reduction of CO2 and other trace gas emissions to a level such that the annual growth rates are zero (i.e. the sources just balance the sinks by the year 2000). Scenario B is perhaps the most plausible of the three.

Pat Michaels also testified to Congress and in doing so simply erased scenarios B and C, misrepresenting Hansen as an exaggerator. Hansen was not impressed and responded:

One of the skeptics, Pat Michaels, has taken the graph from our 1988 paper with simulated global temperatures for scenarios A, B and C, erased the results for scenarios B and C, and shown only the curve for scenario A in public presentations, pretending that it was my prediction for climate change. Is this treading close to scientific fraud?

I know my answer to that particular question. Now, NASA have published estimated radiative forcings up until 2003, and comparing these to the scenarios shows that Scenario B was easily the closest match to reality, tracking about 10% higher than observed – while A was 25% higher and C about 25% lower. CO2 has increased slightly faster than in scenario B, but this is offset by a reduction in CFCs thanks to the Montreal Protocol and a plateauing of methane concentrations, for reasons largely unknown.

So, to answer the question, no I am not expecting a reversion to Scenario A anytime soon.

Footnotes: You can download the scenario and forcing data from this page.

The 1988 projections are probably reaching the end of their useful lifetime – a key parameter, climate sensitivity, was set to a value about a third higher than our current best estimate. Over the first few decades this has little impact, but over time it will cause the model to overestimate temperatures, if the modern value is more accurate. Seven-year and longer flat or cooling trends are not unusual in Scenario B; still, the planet warms.

You wrote : Scenario A, since it is exponential, must eventually be on the high side of reality

To date, Scenario A is on the low side of CO2 emissions, as Hansen et al constantly remind us. CO2 has risen faster than Scenario A, and temperatures have risen much slower. The fact that 30 years ago he considered scenario B CO2 to be the most plausible is irrelevant, because atmospheric CO2 has risen much faster than he expected and his prediction was incorrect.

Alarmists can’t have it both ways – claiming that lower scenarios are valid while simultaneously forecasting a 6+C rise in temperatures and “global warming much worse than predicted.” That is exactly the marketing scam which people in Copenhagen are trying to pull off.

The model forcings are for all GHGs combined, not just CO2. Please provide a single reference to support your assertion that Hansen, or anyone else, has claimed that the actual forcing trajectory is higher than Scenario A.

In 2006 Hansen wrote:

Real-world GHG climate forcing so far has followed a course closest to scenario B. The real world even had one large volcanic eruption in the 1990s, Mount Pinatubo in 1991, whereas scenario B placed a volcano in 1995.

Having predicted three highly diverging scenarios in the 1980s does not confer credibility on the prognosticator in 2009, when one of the three scenarios is still inaccurate, merely less inaccurate than the other two.

Face it, Hansen was wrong then and he is wrong now. And 250 more words in response won’t change that fact.

Global emissions of carbon dioxide are increasing three times faster than scientists previously thought, with the bulk of the rise coming from developing countries, an authoritative study has found.

The increase in emissions of the gases responsible for global warming suggests that the effects of climate change to come in this century could be even worse than United Nations scientists have predicted.

The report, by leading universities and institutes on both sides of the Atlantic, will create renewed pressure on G8 leaders who are meeting this week in Heiligendamm, on Germany’s Baltic coast.

Top of the agenda are proposals by Angela Merkel, the German Chancellor, to halve global emissions by 2050.

There were violent clashes at the weekend in the nearby city of Rostock between police and protesters during a march by tens of thousands demonstrating about the summit.

The latest study was written by scientists from the Oak Ridge National Laboratory in the United States, the University of East Anglia and the British Antarctic Survey, as well as institutes in France and Australia.

It shows that carbon dioxide emissions have been increasing by three per cent a year this decade, compared to a 1.1 per cent a year rise in the 1990s. Three quarters of this rise came from developing countries, with a particularly rapid increase in China.

The rise is much faster than even the most fossil-fuel intensive scenario developed by the Intergovernmental Panel on Climate Change (IPCC) during the 1990s.

That report found emissions rose by 1.1% per annum during the 1990s, rising to 3% this decade. Hansen’s scenario A used a 1.5% per annum increment, so taking into account the ‘compound interest’ effect, overall scenario A is still way higher than observed. Singling out CO2 for a moment, scenario A projected a concentration in 2008 of 464 ppm; the actual value was 386 ppm, so for this GHG scenario A is a 20% overestimate.
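The ‘compound interest’ point is simple arithmetic, sketched below using only the figures quoted in this comment (a 1.5%/yr increment versus the 1.1%/yr observed in the 1990s, and 464 ppm projected versus 386 ppm actual CO2). This is illustrative only, not a reconstruction of the scenario data.

```python
# Illustrative arithmetic only, with figures quoted in the comment above.
# Small differences in annual growth rates compound into large gaps over decades.
years = 50
idx_scenario_a = 1.015 ** years    # Scenario A's ~1.5%/yr emissions increment
idx_observed_90s = 1.011 ** years  # the ~1.1%/yr rate observed in the 1990s
print(idx_scenario_a / idx_observed_90s)  # ~1.22 after 50 years

# CO2 concentration check from the comment: 464 ppm projected vs 386 ppm actual.
overestimate = (464 - 386) / 386
print(overestimate)  # ~0.20, i.e. a ~20% overestimate
```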

So what exactly is the origin of this idea:

“To date, Scenario A is on the low side of CO2 emissions, as Hansen et al constantly remind us. CO2 has risen faster than Scenario A, and temperatures have risen much slower”?

Bill – I assume you are aware that the GISS and scenario anomalies use different baselines? It makes a slight difference. Also, rather than using a single (La Nina) year for your comparison, try looking at the trends.

According to the UAH data, the lower troposphere is now about 0.2 deg C warmer than it was 30 years ago. How does this increase match with Hansen’s predictions? (I am assuming that the most recent 30 years qualifies the observations as climate rather than weather).

Bill – Both your statements are wrong, but the difference between right and wrong has little effect on the outcome (<0.1C), which is why I said it only makes a slight difference above.
Hansen’s scenario baseline for the anomalies (or ‘Control Year’ as it is called in the paper) is actually 1958:

The zero point for observations is the 1951-1980 mean and the zero point for the model is the control run mean (Caption to Fig 3). Different baseline, in other words.

And the observational dataset used by the paper was an early iteration of GISTEMP, which likewise uses the mean of the years 1951-1980 as its baseline.

“I used to think GCMs were Global Climate Models; but then I kept reading papers from people who presumably know better; and they were calling them Global Circulation Models; NOT Climate models.

Well on planet earth it seems that at reasonable (non-geological) time scales, we have basically Ocean Waters, Atmosphere, and perhaps energy that are capable of Circulating. At longer geological time scales, the land itself is circulating; so let’s not go there.

Well it seems to me that in order to get circulation of either water, or atmosphere or energy, you MUST have differences in Temperature both in time and in space.

But climate, we are told, is by definition the long term average of weather. Therefore it is ideally a single number; not a graph, which implies changes over time or space; which would be weather rather than climate.

So climate and circulation seem to be mutually incompatible concepts. You can’t get circulations or model them, without having temperature (and other) differences in time and space, and that means averaging is verboten, because it eliminates differences.

A model that depicts the earth as an isothermal body at about 15 deg C, that radiates 390 Watts per square meter from every point on it 24 hours a day, doesn’t have any temperature differences in time or space to use with the laws of Physics to compute circulations.”

You are presenting a straw man argument here. Climate models do not assume that the earth is at a uniform temperature. The method is to use an ensemble of models that have different initial conditions close to the actual initial conditions.
Because of the system’s chaotic nature, a distribution of initial conditions must be used.
Each simulation proceeds to calculate the trajectory of conditions which certainly do vary around the world. The results are summarized as a global average versus time for some purposes. If you believe a different sort of result should be reported, please say what that should be.
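The ensemble idea can be illustrated with a toy chaotic system – the logistic map, which is emphatically not a climate model. A single trajectory is sensitive to its initial condition, but statistics computed over an ensemble of slightly perturbed starting points are stable, which is the sense in which averaged model output is reported.

```python
import random

# Toy illustration (not a climate model): the logistic map is chaotic, so any
# single trajectory is unpredictable in detail, but an ensemble of runs started
# from slightly perturbed initial conditions yields a stable average statistic.
def logistic_trajectory(x0, r=3.9, steps=500):
    """Iterate the logistic map x -> r*x*(1-x) and return the final state."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

random.seed(0)  # fixed seed so the sketch is reproducible
ensemble = [logistic_trajectory(0.5 + random.uniform(-1e-6, 1e-6))
            for _ in range(200)]

# Individual end states scatter across the attractor; the ensemble mean is
# the kind of summary quantity that gets reported.
mean_state = sum(ensemble) / len(ensemble)
print(f"ensemble mean after 500 steps: {mean_state:.3f}")
```

The design point is that tiny (here, one-part-in-a-million) perturbations of the starting state produce widely scattered individual outcomes, yet the ensemble average remains a meaningful statistic.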

“Yet an untoward focus of climate science, and certainly climate politics, rests on what happens to a single number that gets updated each year, or maybe monthly, or maybe every five years; namely the GISStemp anomaly from Dr James Hansen at NASA.

Do any of these vaunted GCMs, whatever they are, predict both the mediaeval warm period and the little ice age; the Maunder and Dalton minima; or any other widely recognized instance of a climate change that presumably had a natural reason?”
In order to do that they would have to have an accurate idea of what the climate forcings were in that time frame and a good set of global initial conditions.
The solar output is not well known, and volcanic action can only be guessed at.
I don’t see how that question is relevant to the utility or validity of the models in modern times.

“As for the good Met Office lady’s request for more powerful computers (and any other form of research (taxpayer) dollar expenditure): Nonsense; those more powerful computers will simply spit out garbage at a much faster clip; but it will still be garbage, because clearly the models are NOT adequate to explain what is happening, because there isn’t even scientific agreement on what the key Physical processes even are; let alone any experimental verification of the extent to which any such processes are operating.

The only climate or circulation model that works; and it works very well, is the model that is being run on the data, by planet earth with its oceans, and atmospheres, and the physical geography of its land masses; not to mention all the biological processes going on at the same time. Well then there’s the sun of course; which ranges from the cause of everything; to having nothing to do with earth’s climate.

When the GCMs make even a token effort to emulate the model that planet earth is running; they might start to get some meaningful results.”

They do emulate the model that the planet earth is running. They use a mixture of empirically derived and physical models. The modelers do not claim complete accuracy. They provide a range of values and average trends as the final output.

“But I can assure you that faster computers are neither the problem nor the solution. Computers can be made to do the stupidest things at lightning speed; the trick is to have them not doing stupid things in the first place.

The old astronomy professor was addressing his brand new freshman astronomy class for their first lecture.

“When I was an undergraduate like you all,” he said, “we could write down the universal equations of motion on the back of a postage stamp; but it took us six months with a battery of students working with log tables to solve those equations, and determine the motions of the universe.”

“Now that has all changed; we have modern high speed computers that can solve those dynamical equations in microseconds; but it now takes a battery of students six months to program those computers with meaningful code to even start working on those problems of the universe.”

I know a little bit about computer modelling. I spend hours each day modelling optical systems with a quite powerful desktop computer (eight Intel processors). Well, there are a number of other people in the company who have the same lens design software, and maybe not quite so powerful computers, who presumably could do the same things I do with my computers; but they can’t; and they don’t.

You see, they are all mechanical packaging engineers who have been told that optics is just packaging with transparent materials. They can’t do what I do, because they don’t know a darn thing about the Physics of Optical systems. What’s more, they don’t know that that is why they can’t do what I do; it isn’t my eight core computers.”
Are you claiming that scientists who have PhDs and study this problem for a living are not as smart as you are? That they don’t know a darn thing about the earth’s climate, and don’t know how to program a computer? Where is the evidence of that?
Your old astronomy professor obviously was a fossil who was given undergraduate courses to teach because they didn’t trust him to mentor graduate students.