Quote of the week: a howler from the World Meteorological Organization – what warming?

Gosh, you’d think they’d check the data before issuing a statement like this (press release follows).

It [CO2] was responsible for 85% of the increase in radiative forcing – the warming effect on our climate – over the decade 2002-2012. Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration (NOAA).

But the temperature data tell an entirely different story. Look at this plot of all global temperature metrics and trends from 2002-2012 – there’s no warming to be seen!

In fact, with the exception of UAH, which is essentially flat for the period, the other metrics all show a slight cooling trend.

CO2 concentrations top 400 parts per million throughout northern hemisphere

Geneva, 26 May 2014 (WMO) – For the first time, monthly concentrations of carbon dioxide (CO2) in the atmosphere topped 400 parts per million (ppm) in April throughout the northern hemisphere. This threshold is of symbolic and scientific significance and reinforces evidence that the burning of fossil fuels and other human activities are responsible for the continuing increase in heat-trapping greenhouse gases warming our planet.

All the northern hemisphere monitoring stations forming the World Meteorological Organization (WMO) Global Atmosphere Watch network reported record atmospheric CO2 concentrations during the seasonal maximum. This occurs early in the northern hemisphere spring before vegetation growth absorbs CO2.

Whilst the spring maximum values in the northern hemisphere have already crossed the 400 ppm level, the global annual average CO2 concentration is set to cross this threshold in 2015 or 2016.

“This should serve as yet another wakeup call about the constantly rising levels of greenhouse gases which are driving climate change. If we are to preserve our planet for future generations, we need urgent action to curb new emissions of these heat trapping gases,” said WMO Secretary-General Michel Jarraud. “Time is running out.”

CO2 remains in the atmosphere for hundreds of years. Its lifespan in the oceans is even longer. It is the single most important greenhouse gas emitted by human activities. It was responsible for 85% of the increase in radiative forcing – the warming effect on our climate – over the decade 2002-2012.

Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration (NOAA).

According to WMO’s Greenhouse Gas Bulletin, the amount of CO2 in the atmosphere reached 393.1 parts per million in 2012, or 141% of the pre-industrial level of 278 parts per million. The amount of CO2 in the atmosphere has increased on average by 2 parts per million per year for the past 10 years.

Since 2012, all monitoring stations in the Arctic have recorded average monthly CO2 concentrations in spring above 400 ppm, according to data received from Global Atmosphere Watch stations in Canada, the United States of America, Norway and Finland.

This trend has now spread to observing stations at lower latitudes. WMO’s global observing stations in Cape Verde, Germany, Ireland, Japan, Spain (Tenerife) and Switzerland all reported monthly mean concentrations above 400 ppm in both March and April.

In April, the monthly mean concentration of carbon dioxide in the atmosphere passed 401.3 ppm at Mauna Loa, Hawaii, according to NOAA. In 2013 this threshold was only passed on a couple of days. Mauna Loa is the oldest continuous CO2 atmospheric measurement station in the world (since 1958) and so is widely regarded as a benchmark site in the Global Atmosphere Watch.

The northern hemisphere has more anthropogenic sources of CO2 than the southern hemisphere. The biosphere also controls the seasonal cycle. The seasonal minimum of CO2 is in summer, when substantial uptake by plants takes place. The winter-spring peak is due to the lack of biospheric uptake, and increased sources related to decomposition of organic material, as well as anthropogenic emissions. The most pronounced seasonal cycle is therefore in the far north.

The WMO Global Atmosphere Watch coordinates observations of CO2 and other heat-trapping gases like methane and nitrous oxide in the atmosphere to ensure that measurements around the world are standardized and can be compared to each other. The network spans more than 50 countries including stations high in the Alps, Andes and Himalayas, as well as in the Arctic, Antarctic and in the far South Pacific. All stations are situated in unpolluted locations, although some are more influenced by the biosphere and anthropogenic sources (linked to human activities) than others.

The monthly mean concentrations are calculated on the basis of continuous measurements. There are about 130 stations that measure CO2 worldwide.

A summary of current climate change findings and figures is available here

135 thoughts on “Quote of the week: a howler from the World Meteorological Organization – what warming?”

During the 61-year period, in correspondence with the rise in CO2 concentration, the global average absolute humidity diminished about 1 per cent. This decrease in absolute humidity has exactly countered all of the warming effect that our CO2 emissions have had since 1948.

Similar computer simulations show that a hypothetical doubling of the carbon dioxide concentration in the air would cause a 3% decrease in the absolute humidity, keeping the total effective atmospheric greenhouse gas content constant, so that the greenhouse effect would merely continue to fluctuate around its equilibrium value. Therefore, a doubling of CO2 concentration would cause no net “global warming” at all.

REPLY: Yeah, Nick either can’t help himself, as his many years of working for CSIRO have produced an institutionalized reaction to anything contrary to the monthly newsletter, or he’s simply paid to come here and sow obfuscation. Given he’s often one of the earliest commenters on anything contrary to his world view, I expect he has a trigger mechanism set up to alert him so he can derail threads early on with his particular brand of diversion.

Bottom line: increased CO2 forcing with no resulting increase in temperature means no warming, and the WMO believes there was warming. Bad science, just PR – no cookie. – Anthony

I comprehend Mr. Stokes’ comment — but if an increase in a forcing is NOT producing an increase in temperature, then somewhere there exists a cooling effect that is balancing the forcing such that the “net” is mostly unchanging. It’s not just that the heat is hiding in oceans; something must be exerting a cooling effect, and it may well be the very same CO2 at high altitude, whose increased abundance means more radiation into space.

Or something like that. I’ll let the scientists figure it out BUT the claim was global WARMING, not global FORCING.

Poor, poor, AGW (sad head shake)…. dead on arrival. But, …… if little children everywhere …..will only BELIEVE,….. just believe in CO2 being a significant global climate forcing fairy of great power, AGW will live!

Hi, Wayne, hopefully, just gradual warming (from all the hot air folks like Stokes puff out — I don’t care if he’s in Australia (have no idea where that man is coming from, lol) that much heat’ll get to Canada by the end of the month, heh.

a) Technically they have their rear ends covered, since the wording of the press release is an “increase in radiative forcing” from “greenhouse gases” over that time period, which can occur even during a period of cooling if such is overwhelmed by other factors. If I shine a flashlight on a block of wood while starting to pour liquid nitrogen on the wood, technically the flashlight has a warming effect in itself (just overwhelmed by the latter).

b) They are misleading the public by helping more people falsely believe there was warming from 2002 to 2012, furthering their activist goals, but in a manner such that they are extra certain to avoid getting in any trouble for it.

Besides, it is probably only a matter of time before some of those temperature datasets get rewritten enough to make 2002->2012 show warming. As implied in an article here yesterday, HADCRUT4, for example, has already hit about 0.3+ degrees of divergence from satellite temperature data in how warm it makes 2014 compared to 1998, which suggests they may be planning to adjust the data until they can trumpet 2014 as a record warm year. This is the endgame, the final stretch before the Solar Grand Minimum hits, and some of the more cunning environmentalists know it; this is their last chance to go all out. As illustrated in my usual http://tinyurl.com/nbnh7hq set of illustrations (which need little adjusted data and little mutilation of the data through opaque computational operations, just rawer data), claims that relevant climate history is unrelated to the sun depend upon presenting the data only in a messed-up manner.

[NOTE: I recommend to readers that you do NOT click the link above, the free hosted website he uses pops up all sorts of parasitic nastiness, and won’t let you back out of it. You have to close the browser – Anthony]

Also, while the much-revised NODC ocean heat content data for 0-2000 meters might show warming globally, it shows very little warming for the Northern Hemisphere oceans since 2005. See Figure 1. Only about 7% of the warming of ocean heat content for the depths of 0-2000 meters occurred in the Northern Hemisphere from 2005 to 2012, yet the surface area of the Northern Hemisphere oceans represents about 43% of the surface of the global oceans.

Can well-mixed human-created greenhouse gases pick and choose between the hemispheres, warming one but not the other? … unlikely.

26 May: WSJ: The Myth of the Climate Change ‘97%’
What is the origin of the false belief—constantly repeated—that almost all scientists agree about global warming?
By Joseph Bast And Roy Spencer
The “97 percent” figure in the Zimmerman/Doran survey represents the views of only 79 respondents who listed climate science as an area of expertise and said they published more than half of their recent peer-reviewed papers on climate change. Seventy-nine scientists—of the 3,146 who responded to the survey—does not a consensus make…
In 2013, John Cook, an Australia-based blogger, and some of his friends reviewed abstracts of peer-reviewed papers published from 1991 to 2011. Mr. Cook reported that 97% of those who stated a position explicitly or implicitly suggest that human activity is responsible for some warming. His findings were published in Environmental Research Letters.
Mr. Cook’s work was quickly debunked…
Of the various petitions on global warming circulated for signatures by scientists, the one by the Petition Project, a group of physicists and physical chemists based in La Jolla, Calif., has by far the most signatures—more than 31,000 (more than 9,000 with a Ph.D.). It was most recently published in 2009, and most signers were added or reaffirmed since 2007. The petition states that “there is no convincing scientific evidence that human release of . . . carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate.”
We could go on, but the larger point is plain. There is no basis for the claim that 97% of scientists believe that man-made climate change is a dangerous problem.
http://online.wsj.com/news/articles/SB10001424052702303480304579578462813553136?mg=reno64-wsj&url=http%3A%2F%2Fonline.wsj.com%2Farticle%2FSB10001424052702303480304579578462813553136.html

“Oh, bullshit, you left out this part: ‘the warming effect on our climate – over the decade 2002-2012’. Nick, go obfuscate the truth someplace else. Now I’m SURE you are a paid troll. – Anthony”

You mean like how, in a previous posting, he quoted Dr Roy Spencer to imply skeptics were cherry-picking? Then, when I followed his link, I noticed he left out the part of Dr Roy Spencer’s text that directly contradicted him. And when I pointed this out and included the full (relevant) text, Nick didn’t admit to the error but obfuscated further in a rather sleazy way. When you go down that path it stops being an honest mistake and starts being blatant lying.

How is that possible? Water vapor is the primary ‘greenhouse’ gas and I don’t think its atmospheric concentration changed much. And CO2 levels only increased about 10%.

So where does the 34% number come from?

And remember, the formulas the warmists use to convert the increase in CO2 concentration into radiative forcing are logarithmic. So the radiative forcing increases much slower than the CO2 concentration.
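The logarithmic relationship usually cited here is the simplified expression from Myhre et al. (1998); whether NOAA used exactly this form is an assumption on my part. A quick sketch:

```python
import math

def co2_forcing(c_ppm, c0_ppm=278.0):
    """Simplified CO2 radiative forcing (W/m^2) relative to a reference
    concentration, after Myhre et al. (1998): dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing relative to the ~278 ppm pre-industrial level:
f_360 = co2_forcing(360.0)   # roughly 1.38 W/m^2
f_396 = co2_forcing(396.0)   # roughly 1.89 W/m^2
```

Note one subtlety: each additional ppm adds less forcing in absolute terms, but the *percentage* change in forcing measured against a pre-industrial baseline can still outpace the percentage change in concentration, which is one candidate explanation for how a ~10% concentration rise maps to a much larger percentage rise in the anthropogenic forcing increment.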

Yes… fully understand. It created the radiative forcing… which forced nothing at all. Which means that the quantity of forcing is utterly minuscule, or the natural forcing is AT LEAST as much. Either way, anybody with common sense will see that alarming language is alarming language.

They are desperate to drum up fear. They must be in a near state of panic that it isn’t happening. Our fear, that is. Oh, and the warming. And the gravy-train slowing to a halt. And – for some at least – their plan for world domination spiraling down the drain. I’ve never known a bunch to sooo want worldwide destruction and unrest.

“I recommend to readers that you do NOT click the link above, the free hosted website he uses pops up all sorts of parasitic nastiness, and won’t let you back out of it. You have to close the browser”

If that is so on your computer and whatever browser you are using, it would be curious, as my Firefox just gets one readily-closed ad from the imagevenue hosting site referenced in my tinyurl link. (If you get popup ads on many new website visits, you might need to google and use Malwarebytes, not for imagevenue in particular but for a pre-existing problem in general.)

However, I also posted the image now at the well-known ad-free WebCitation website as an alternative, with the image appearing small but enlarging on further click:

Re David says:
How is that possible? Water vapor is the primary ‘greenhouse’ gas and I don’t think its atmospheric concentration changed much. And CO2 levels only increased about 10%.

So where does the 34% number come from?
==================================
David, I think this…”Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration”…may mean anthropogenic forcing, but it certainly cannot mean 34% of the estimated effect of all Earth’s GHGs. But I do not know. I do know those climate scientists can get very “Alice in Wonderland”-like.

Once again it’s worth remembering that the ‘value’ of the message is not in its facts, but in the impact it has despite the facts. To be fair, the one area where climate ‘science’ is a leading force is ‘science by press release’, which, given that it requires neither good data, honesty, nor good scientific practice, is ideal for the norms and standards of this area.

Native sinks don’t balance native sources; native sinks have always been larger than native sources over the past 50+ years:
where 1 ppmv = 2.13 GtC.
Thus the net contribution of native sources and sinks is negative over the past 50 years.
Human emissions are now over 9 GtC/year and still increasing, while the increase in the atmosphere tracks at 50-55% of human emissions. The huge variability in the rate of change (in fact, of uptake) of CO2 is caused by the huge variability in year-by-year temperature, but the trend is not caused by temperature. It is caused by the human emissions.
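A back-of-the-envelope check of the figures in this comment, using the conversion 1 ppmv ≈ 2.13 GtC quoted above (the emission and growth rates are the approximate values stated in the comment, not precise data):

```python
# Rough carbon-budget check of the 50-55% "airborne fraction" figure.
PPM_TO_GTC = 2.13            # 1 ppmv of atmospheric CO2 ~ 2.13 GtC

emissions_gtc = 9.0          # approx. annual human emissions (GtC/yr)
atm_rise_ppm = 2.0           # approx. annual rise in atmospheric CO2 (ppm/yr)

airborne_fraction = atm_rise_ppm * PPM_TO_GTC / emissions_gtc
# ~0.47: roughly half of emitted CO2 stays in the air; the remainder
# implies net natural uptake, i.e. sinks exceeding natural sources.
```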

The alternative theories like these of Salby fail one or more observations and thus are proven wrong.

It seems that the scientific method is dead. We have a hypothesis that CO2 causes warming on the planet earth. So mother earth runs an experiment for us. We have a great increase in CO2 while at the same time we have no warming or even cooling on the planet — that according to the cheaters “adjusting” the data upwards. In the old days, we would say that we have a falsification of the hypothesis. But in these modern times of “climate science” (post-modern anti-science to me) we have faith in warming even when it gets colder.

This WUWT post is just one more data point among all the others.

(note: CO2 always follows warming in the distant past which should have told us all something as well)

[NOTE: I recommend to readers that you do NOT click the link above, the free hosted website he uses pops up all sorts of parasitic nastiness, and won’t let you back out of it. You have to close the browser – Anthony]

FWIW:
I opened it in a new tab and had absolutely no problem. Firefox in maximum secure browsing mode, no cookies, and with AdBlock active, as I always browse. (The only sensible way?)

The figure titled: “Preliminary CO2 mole fractions at the GAW global stations (March 2014; April 2014)” shows numbers for CO2 around 400. A MOLE FRACTION is the ratio of the moles of a certain species to the total number of moles. The sum of all mole fractions =1. At least that’s what it means in chemistry. So, what do they mean in this chart?
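A short answer to the question (this unit convention is standard for GAW/NOAA CO2 reporting): the chart’s numbers are mole fractions expressed in parts per million, so the chemistry definition still holds.

```python
# On GAW plots, "mole fraction" is reported in ppm: micromoles of CO2
# per mole of dry air. A chart value of 400 is therefore the
# dimensionless mole fraction 400e-6, and all species still sum to 1.
co2_ppm = 400.0
co2_mole_fraction = co2_ppm * 1e-6   # 0.0004
```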

Salby’s hypothesis and the Humlum et al. paper on the CO2 phase relationship: the majority of the increase in atmospheric CO2 in the last 70 years was due to an unknown mechanism rather than anthropogenic emission.

If there is unequivocal cooling, we will have a chance to see from observations whether Salby’s hypothesis holds: that a significant portion of the CO2 rise in the last 70 years was due to a change in natural emissions rather than anthropogenic CO2 emissions.

https://wattsupwiththat.com/2013/06/10/dr-murray-salby-on-model-world-vs-real-world/
If I understand the mechanisms, and based on cycles of similar warming and cooling that correlate with solar magnetic cycle changes, roughly 90% of the warming (90% of 0.8 C, which is about 0.7 C) in the last 150 years was due to solar magnetic cycle modulation of planetary cloud cover.
The following is Humlum et al’s paper that supports Salby’s hypothesis.

http://www.tech-know-group.com/papers/Carbon_dioxide_Humlum_et_al.pdf
The phase relation between atmospheric carbon dioxide and global temperature
Summing up, our analysis suggests that changes in atmospheric CO2 appear to occur largely independently of changes in anthropogenic emissions. A similar conclusion was reached by Bacastow (1976), suggesting a coupling between atmospheric CO2 and the Southern Oscillation. However, by this we have not demonstrated that CO2 released by burning fossil fuels is without influence on the amount of atmospheric CO2, but merely that the effect is small compared to the effect of other processes. Our previous analyses suggest that such other, more important effects are related to temperature, with ocean surface temperature near or south of the Equator standing out as being of special importance for changes in the global amount of atmospheric CO2.

Salby’s hypothesis and the Humlum et al. paper on the CO2 phase relationship: the majority of the increase in atmospheric CO2 in the last 70 years was due to an unknown mechanism rather than anthropogenic emission.

Well, not really. The increase in atmospheric CO2 over the last 70 years was due to any one, or a combination, of many mechanisms, one of which may have been anthropogenic emission.

Salby says little which was not in one of our 2005 papers.
I refer you to the thread on a recent paper because it discusses these matters. And I especially point to this comment in it.

William, Salby (and others) make the fundamental mistake of assuming that the year-by-year variability in the CO2 increase rate (in fact, the variability in the natural sink rate) and the longer-term trend are both caused by the same mechanism.
That is proven wrong: the short-term variability is the direct effect of temperature changes on (land) vegetation, while the long-term increase is not caused by temperature. Vegetation in general grows faster (and over a larger area) with higher temperatures and more CO2, and thus is not the cause of the increase in the atmosphere. For the short term:
which shows that the short-term influence of temperature gives more CO2 but less δ13C, which is the case if less CO2 is taken up by vegetation and/or more vegetation decayed.
For the long term: the uptake of CO2 by the biosphere increased over time:
http://www.bowdoin.edu/~mbattle/papers_posters_and_talks/BenderGBC2005.pdf

Oceans are not the cause of the increase in the atmosphere either: any substantial increase of CO2 emissions or throughput from the oceans would increase the δ13C level in the atmosphere, but all we see is a firm decrease…

What the WMO are actually claiming is that a decade of heat has gone into the system, but not in the form of atmospheric warming, or sea ice melt (global sea ice has been largely above average for a year and a half now), nor in the top 700 metres of the ocean. That means a decade of heat mysteriously ended up in the only place left that has never been measured accurately – 700m-3000m deep ocean. Or, they have to admit they were wrong. Tough call.

This annoys me for all kinds of reasons, but the lack of intellectual and general integrity actually begins right in the first paragraph of the release.

The fact that CO2 concentrations have passed 400ppm is of absolutely NO “scientific significance” and it does not reinforce anything at all, any more than 399ppm or 398ppm or 400.7ppm or, in due course, 406ppm.

“400” ppm is only interesting if you are the PR executive tasked with spinning the message, and your aim is to create a press release that falsely gives the impression of some threshold having been passed, and the preservation of your own integrity isn’t really an issue…

“Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration (NOAA).”

Wait one moment, I think they have some bad assumptions in the math here. Isn’t the total greenhouse effect for the planet estimated to be about 33C? A 34% increase in greenhouse forcing would mean more than 11C of warming if both of those are correct.

A) Gawds, does anyone think that perhaps we can leave Nick (and for that matter, Mosher) alone and cease the pointless ad hom? Nick is well educated, generally polite, and has a point of view. Some of his contributions and corrections on list are topical and dead on the money — he’s corrected me on several occasions because I was wrong and if anything I appreciate that. Right or wrong transcend point of view. I’m hoping that in return, I might have made him at least think hard about the correctness of some of his statements. We don’t always agree, but we don’t always disagree either, and he is clearly not incompetent a la the Slayer contingent.

B) Curiously, everybody including Nick seems to be missing the real point of this rather incredible statement. Forget the bit about “causing warming” in an era of neutral to falling temperatures — what they “meant” is obvious enough within the prevailing worldview of missing heat going into oceans or being cancelled by soot or aerosols or the screening of solar radiation by dark matter. The concession that this statement makes is worth the price of the gaffe.

The WMO, ex officio just publicly stated, backed by data, that during the indicated decade, water vapor feedback was at most a paltry 15%.

There, see it now? Let’s do the math:

Nominal direct CO_2 driven warming at 600 ppm is supposedly a sloppy 1 to 1.5 C (some 0.5C of which we have supposedly already realized, as this is relative to the baseline/arbitrary “starting concentration/temperature” at 300 ppm).

Feedback from all other sources is 15% of this. That is, the WMO has just publicly acknowledged that — using the larger of the figures for CO_2 driven warming — we have 1.15 degrees Centigrade of warming to fear as CO_2 goes from the current 400 ppm to a presumed 600 ppm by 2100.
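The arithmetic sketched above, restated with the commenter’s own illustrative figures (none of these numbers appear in the WMO release itself, and the 15% is his inference from the 85% quote):

```python
# The commenter's back-of-envelope math, with his assumed figures.
direct_2x = 1.5          # assumed direct warming for 300 -> 600 ppm CO2 (C)
already_realized = 0.5   # portion assumed already realized (C)
feedback_share = 0.15    # "at most 15%" feedback inferred from the WMO's 85%

remaining = direct_2x - already_realized          # 1.0 C still to come
with_feedback = remaining * (1 + feedback_share)  # 1.15 C by 2100, as stated
```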

Wow. That’s worth a misstatement about “warming”, isn’t it? And note well, they are in fact hoist on their own petard if they attempt to retract or explain the statement. If they say that there has been only 15% feedback because there has been no warming, and indeed a bit of actual cooling, they have to explain why there is any feedback at all when temperatures are neutral to cooling. They also have to explain how temperatures remain neutral to cooling when CO_2 forcing has increased and water vapor feedback has supposedly increased right along with it, all in spite of general cooling. They are caught in a double whammy of inconsistency, giving the kiss of death to the prevailing paradigm of strong positive feedback leading to a total climate sensitivity over 2 C.

In a way, this isn’t surprising; it is only jumping out a little bit ahead of where many current papers are going. Total climate sensitivity is in free fall and will remain that way unless/until the climate resumes aggressive warming, increasingly constrained by Bayesian reasoning that recomputes posterior probabilities given data. At the moment the extreme warmist position is trying hard to hold onto 2.3 C TCS, which would likely be only “weakly” catastrophic in agreement with the majority of the beliefs of climate scientists (who, when polled by George Mason some years ago indicated that they thought that there would be non-catastrophic warming by 2100).

Hell, it’s in agreement with my own POV — I think that there will “probably” be non-catastrophic warming by 2100, although my beliefs are not strongly held at the moment because I’m studying the (quantum) details of the greenhouse effect in my copious free time and am not impressed with the quantitative aspects of the argument even before feedback is taken into account.

Pressure broadening, for example, appears to be a negative feedback in the early (circa-1950s) treatments of the GHE: it increases absorptivity in the wings, and the atmosphere overhead (at lower pressure) is transparent in the wings, so increasing ground-level pressure should actually increase cooling by squeezing heat out around the edges of the opaque overhead layers. Pressure broadening as a heating or cooling modulation of the GHE also appears to be a huge, dominant contribution to the dynamic process — air pressure varies by as much as 5%, with 1% variations possible over hours. Clear, sunny high-pressure weather centers should cool faster, both because of generally lower humidity and because leakage from the wings is substantially augmented, minimizing the effect of direct insolation. Cloudy low-pressure centers, by contrast, have increased greenhouse trapping from both water vapor and a drawing-in of the wings (with a consequent increase in opacity), but this is balanced by the formation of high-albedo clouds, which are an even stronger negative feedback everywhere but the poles. Only in the intermediate zone of medium-to-low pressure and high humidity (but no clouds) is augmentation of warming likely.

Of course, I’m still working on the quantitative aspects of this — the papers I am reading sadly tend to lack numbers if they are on climate science (probably because they always treat whole bands, and we cannot explicitly compute the absorptivity of the atmosphere by the bands of any species; we can only do an approximate computation that is little better than an estimate). Interestingly, pressure broadening is very important to the telecommunications industry, and they have papers with explicit numbers for the modulation of atmospheric opacity in the microwave spectrum right next to the LWIR, which suggest that this effect is very important indeed. Since they are interested in what happens to comparatively sharp lines (carrier frequencies), their work is of greater use and the computations are a bit more sensible.

I notice they say “141% of” rather than “a 41% increase”. “100% of” is the same as 1X, i.e. no increase at all, so “141% of” means 1.41 times the original value — a 41% increase. A few years ago the CBC mentioned the increase of greenhouse gases but ignored the ~95% that is due to water vapor, giving their viewership wrong information. The CBC also mentioned renewable energy but somehow left out hydro (dams), which as far as I remember is renewable.

Ferdinand Engelbeen says:
May 27, 2014 at 12:17 am
…
The alternative theories like these of Salby fail one or more observations and thus are proven wrong.

Ferdinand, I appreciate your comments and have learned much about CO2 from you. In your view, is there a climate theory that stands up to the evidence? Clearly, CO2 driven climate models fail one or more observations and thus are proven wrong.

If Mr Stokes is hoping to derail threads, he is failing. Personally, when he states or disputes something, I look for his agenda, and generally disagree on the face of it.
I do believe that this information weakens his POV and strengthens mine. If observations do not back your position, your hypothesis is wrong… unless, of course, you are stuck in dogmatic thought.

Wait one moment, I think they have some bad assumptions in the math here. Isn’t the total greenhouse effect for the planet estimated to be about 33C? A 34% increase in greenhouse forcing would mean more than 11C of warming if both of those are correct.
===========
Bingo. The 34% number fails the most basic smell test. So either the WMO is wrong, or the NOAA figures are wrong, or GHG theory is wrong, or any combination of the three.
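One plausible resolution (an interpretation, not something stated in the release): the NOAA figure likely comes from its Annual Greenhouse Gas Index, which tracks the man-made forcing increment in W/m², indexed to 1990, so the 34% is growth of that small increment, not of the total greenhouse effect. With illustrative round numbers (assumed here, not taken from the press release):

```python
# Illustrative figures only, to show the difference in baselines.
forcing_1990 = 2.2                    # approx. anthropogenic forcing in 1990 (W/m^2), assumed
forcing_2013 = forcing_1990 * 1.34    # the quoted 34% rise -> ~2.9 W/m^2

total_greenhouse = 150.0              # very rough total greenhouse forcing (W/m^2), assumed
pct_of_total = (forcing_2013 - forcing_1990) / total_greenhouse * 100
# The 23-year rise works out to roughly half a percent of the total
# greenhouse effect, nowhere near 34% of the ~33 C total.
```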

What is largely ignored in GHG theory is that GHG is a moderator of climate: it reduces the spread between minimum and maximum temperatures. Look at Venus for confirmation. Venus rotates very slowly; its days and nights are each longer than 100 Earth days. If we had this on Earth, the equator would roast in daytime and freeze at night. Yet on Venus there is virtually no difference between day and nighttime temperatures.

We see the same thing on earth. Desert regions have very high swings in temperature between day and night, while rain forests have very little fluctuation in temperature between day and night.

So if anything, increasing GHG should lead to a more moderate climate, which in general terms should be good. There are always exceptions, of course, and these exceptions can be cherry-picked to show that increasing GHG is negative, but the world as a whole proves this to be wrong.

The past 60 years, a time in which we are told GHG has been increasing the fastest, have been the time of the single greatest advance in human history. How is this possible if GHG is bad? It makes no sense.

If GHG is bad, then the past 60 years should have been the worst in human history. We should have seen starvation on a massive scale as population increased. Certainly it was predicted, by no less a figure than the current US Chief Science Adviser. So, if he can’t get it right, why should we expect the WMO or NOAA to get it right?

can we leave Nick alone
========
A lie of omission is no less a lie. However, in this case it appears more a case of misrepresentation than simple omission. The purpose in calling someone out is to discourage such behavior in the future.

Clearly, CO2 driven climate models fail one or more observations and thus are proven wrong.

That is clear for 95% of all climate models… The only few that follow observations have a (very) low sensitivity for 2xCO2.

In the case of models, the prerequisite is that the model reflects past reality. But even if they follow reality, that can be for the wrong reasons, as is clear for all current models, which could retrofit the past century with a wide range of climate sensitivities yet almost all fail over the last decade and a half…

I’ve never understood the water vapor feedback mechanism. If carbon dioxide can excite the feedback, why can’t water vapor, which is a much more powerful IR gas, excite it and either cause runaway warming or at least saturate the feedback? In fact it might be slightly better at it, since some of the water vapor spectral lines are at shorter wavelengths and can (marginally) penetrate water better, increasing the heating and evaporation of the oceans.

The only few that follow observations have a (very) low sensitivity for 2xCO2.
================
I’ve not seen this information previously and had assumed that the few that follow observations do so simply by chance.

If indeed there is correlation between model sensitivity and observation (ie: low sensitivity models are most consistent with current observations) I would very much like to see the data. It would make a great article for WUWT.

An increase in CO2 from 300 to 400 ppm is a 33% gain in the concentration of that GHG, but its effect won’t increase by the same amount, since the effect is logarithmic. However, that observation is trivial when compared to the fact that there is on the order of 100 times more of the GHG H2O in the air than CO2. If the global average of H2O is 30,000 ppm (probably higher), then the extra 100 ppm of CO2 means that total GHG concentration has risen by only about 0.33%, not 33%.
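To put numbers on “the effect is logarithmic”: a quick sketch using the widely used simplified forcing expression ΔF ≈ 5.35·ln(C/C0) W/m2. The formula is an assumption on my part, since the comment cites none; it is the common simplified fit, not an exact radiative calculation.

```python
import math

# Concentration change: 300 ppm -> 400 ppm
c0, c1 = 300.0, 400.0
pct_concentration = (c1 - c0) / c0 * 100      # ~33.3% rise in CO2 itself

# Simplified logarithmic forcing fit: dF = 5.35 * ln(C/C0), in W/m^2
dF = 5.35 * math.log(c1 / c0)                 # ~1.54 W/m^2

# Forcing from a full doubling of CO2, for comparison
dF_doubling = 5.35 * math.log(2)              # ~3.71 W/m^2

print(f"{pct_concentration:.1f}% more CO2 -> {dF:.2f} W/m^2 "
      f"({dF / dF_doubling * 100:.0f}% of a doubling's {dF_doubling:.2f} W/m^2)")
```

So a 33% concentration rise yields well under half the forcing of a doubling, which is the commenter’s point about the logarithm.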

I see on the news President Obama announced a new assistance program to train science teachers. Will they train them in real science, like “if the data don’t support you, then you’re wrong,” or will they train them to “lie and exaggerate”? Gee, I wonder. /sarc

Consider this: Those of us old enough to remember Geology in the 1970’s may also recall Imbrie’s assertions that the Earth was trying to go back into a glacial period, and it was only CO2 that was keeping us from doing that. So maybe it really is getting warmer, and just enough to keep us from getting colder. Neat, eh?

Then, if the CO2 reductions really work, we should soon see a vast wall of ice looming on the Northern horizon.

Just in case you did not pick up on it at University, the Pleistocene ice ages really sucked.

If correct, this would be strong evidence that the oceans are not warming.

Unfortunately, the contribution of the ocean surface caused by temperature is too small to be noticed in the CO2 increase: a full 1°C increase over the full ocean surface gives only 17 ppmv extra CO2 at dynamic equilibrium (including less uptake near the poles and more release near the equator until equilibrium). But because CO2 increased over 100 ppmv, the average flux is the other way out: the oceans are a net absorber of ~3.5 GtC/year…

I have no direct information about which model does what, but I have used a simple EBM (energy balance model) which can be manipulated by giving different effect factors to different forcings. You can halve the sensitivity for CO2 if you also reduce the sensitivity for human aerosols (which is overblown anyway); in both cases you can fit the past century, but the result in the 21st century is better with a low sensitivity for CO2/aerosols. See: http://www.ferdinand-engelbeen.be/klimaat/oxford.html

I need to update the graphs beyond 2000, but here what happens for the two sensitivities if you plot them to the year 2100:
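For readers who want to play with the same idea, here is a minimal one-box energy-balance sketch. All parameter values (growth rate, heat capacity, aerosol fraction) are hypothetical stand-ins, not Ferdinand’s actual spreadsheet; the point is only to show how a high-sensitivity/strong-aerosol run and a low-sensitivity/weak-aerosol run can produce a near-identical past but diverging futures.

```python
import math

def run_ebm(net_forcing, lam, heat_cap=8.0):
    """One-box energy balance: C * dT/dt = F(t) - T/lam (explicit Euler, 1-yr steps).
    lam is sensitivity in K per W/m^2; heat_cap in W*yr/m^2/K."""
    T, out = 0.0, []
    for F in net_forcing:
        T += (F - T / lam) / heat_cap
        out.append(T)
    return out

years = list(range(1900, 2101))
# Hypothetical CO2 forcing: concentration grows 0.5%/yr from 300 ppm
f_co2 = [5.35 * math.log(1.005 ** (y - 1900)) for y in years]

def net(aer_frac):
    # Hypothetical aerosol forcing: offsets a fraction of CO2 forcing,
    # frozen at its 2010 value thereafter (emission controls)
    f2010 = f_co2[2010 - 1900]
    return [f - aer_frac * min(f, f2010) for f in f_co2]

high = run_ebm(net(0.5), lam=0.8)   # ~3.0 C per 2xCO2, strong aerosol offset
low  = run_ebm(net(0.0), lam=0.4)   # ~1.5 C per 2xCO2, no aerosol offset

i1950, i2100 = 1950 - 1900, 2100 - 1900
print(f"1950: {high[i1950]:.2f} C vs {low[i1950]:.2f} C (hindcasts nearly match)")
print(f"2100: {high[i2100]:.2f} C vs {low[i2100]:.2f} C (forecasts diverge)")
```

This is exactly the hindcast/forecast ambiguity discussed further down the thread: two parameter combinations that fit the same past diverge by the better part of a degree by 2100.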

Robert Brown is correct to point out the problem is not that the increase in CO2 is failing to warm the planet (that is limited by physics) but that CO2 is failing to invoke the feedbacks that do the heavy lifting. I’ve no doubt that additional CO2 is creating additional downwelling radiation, but it just isn’t enough to warm the planet without the knock-on feedbacks. Maybe Nick has an answer for that, or maybe the incoming radiation is simply diluted below the level of detection in the oceans. Something unexpected is happening with the energy that CO2 is capable of trapping (and skeptics agree CO2 is a greenhouse gas). I’m going with Willis’ emergent phenomena until something better comes along.

The models upon which CACA is based simply assume way more water vapor feedback than is in evidence, & they undervalue the non-radiative effects of more water in the air. If, as seems the case, positive & negative feedbacks roughly cancel out, then climate sensitivity would be just about what doubling CO2 should produce on its own. However there could be effects in the wild that negate even the laboratory measurement of the warming effect, at least in some environments under some conditions.

The feedback-dependent models don’t work well & most of them fail miserably.

This annoys me for all kinds of reasons, but the lack of intellectual and general integrity actually begins right in the first paragraph of the release.

The fact that CO2 concentrations have passed 400ppm is of absolutely NO “scientific significance” and it does not reinforce anything at all, any more than 399ppm or 398ppm or 400.7ppm or, in due course, 406ppm.

“400” ppm is only interesting if you are the PR executive tasked with spinning the message, and your aim is to create a press release that falsely gives the impression of some threshold having been passed, and the preservation of your own integrity isn’t really an issue…
—————————————-
Very well said. That’s where my BS alarm went off too – right at the start.

Of course I didn’t expect the WMO to have any other position – they are firmly part of the establishment, and as mother nature continues to not cooperate with their AGW conjecture (the establishment’s) the Orwellian language will just continue to get, well, more Orwellian.

The establishment is conducting the conditioning and indoctrination of the masses on a scale not previously possible or imaginable. Things have come a long way since Bernays’ “achievements” with the marketing of tobacco and deadly toxic waste in drinking water. Now they can get people to willingly “do with less” to reduce their “carbon footprint”. They have actually managed to get people to willingly hand over their money to them, to fix an imaginary problem. Genius.

In the UK the BBC keeps running articles about how insects can replace protein in our diet. These articles appear just now and again, though. I think it’s just to get you used to the idea. I think they call it “nudge theory” (a PR technique).

As other commenters have pointed out, once the stuff is printed/shown on the telly/broadcast on the radio/put out on the website and the masses have consumed it, the damage is done. It’s precisely because it’s done by the establishment that it is so successful – people trust the World Meteorological Organisation because it sure sounds official. People trust the BBC (Auntie Beeb), etc.. People trust the establishment, generally. I don’t, personally.

And as for NS – can you not get yourself a proper job, something that might allow you to gain a sense of, you know, self respect, purpose in life, instead of being a tool of the establishment?

Given that CO2 absorbs only at certain wavelengths and the amount of energy available at those wave lengths is finite, how much solar energy is available to be absorbed by additional atmospheric CO2? Isn’t the absorbed-energy-vs-CO2-concentration curve getting pretty flat?

So if we take one stere of STP air, that is 1,000 litres of air; which contains 1,000 / 22.4 mols of air, or 44.643 mols of air.

So this is 44.643 x 6.022 E+23 air molecules, or 2.6884 E+25 air molecules.

So a sample of 2500 air molecules would be about 9.3 E-23 steres.

Cube root of that gives me 45.3 nm for the side of a cube of air with 2500 molecules in it.

Now just remember that the semiconductor industry passed the 50 nm critical dimension point in their chip technology a long time ago. I can’t recall if they are at 25 nm yet, or not, but that would be around 400 molecules of air.
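For anyone who wants to check the cube-of-air arithmetic above, here it is as a few lines of Python (using the modern Avogadro value, 6.022×10^23):

```python
# Check the 2500-molecule air-cube arithmetic
N_A = 6.022e23          # Avogadro's number, molecules per mol
molar_volume = 0.0224   # m^3 per mol for an ideal gas at STP (22.4 L)

molecules_per_m3 = N_A / molar_volume        # ~2.69e25 per stere (m^3)
sample_volume = 2500 / molecules_per_m3      # m^3 containing 2500 molecules
side_nm = sample_volume ** (1 / 3) * 1e9     # cube edge, in nanometres

print(f"{molecules_per_m3:.3e} molecules/m^3; cube edge = {side_nm:.1f} nm")
# At 400 ppm, 2500 molecules of air contain on average exactly one CO2 molecule
```

The result reproduces the ~45 nm figure quoted in the comment.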

So I have 1974 + 500 + 25 + 1 molecules in my 45 nm sample. I shaved it a bit so as to leave the CO2 molecule out of it, just outside my sample (but still there).

No I don’t have any kind of microscope that I can see my sample with, but Mother Gaia; my super Maxwell’s demon, sure does, and she can see 2499 molecules dancing around crashing into each other at thermal energies. 2499 is a big enough number to declare that my sample has a recognizable Temperature; with a roughly Maxwell-Boltzmann kinetic energy distribution, and an equi-partition energy of about (3/2)kT Joules per molecule. Well maybe the N2 and O2 molecules need about (5/2)kT energy, if you give them two rotating dumbbell moments of inertia. IANACh, so this is all Alchemy to me. But it is something like that.

And since this is PCh, my alchemybrew sample does NOT radiate any electro-magnetic radiation, and certainly not in the LWIR region.

Well, those rotating dumbbells may broadcast at microwave frequencies. I think Dr. Roy and Prof. Christy tune to that station.

Every once in a while, that absent CO2 molecule gets kicked into the field of view, and MG gets a shot in the eye, of 15 micron LWIR photons; but for most of the time, her entire field of view, really has no photons coming from it, from my air sample. The CO2, out of the field of view, is of course singing like a canary, at 15 microns, but that isotropic broadcast never gets into MG’s microscope with its narrow depth of field “macro” lens.

So compared to the one CO2 molecule, my air sample is huge, and is emitting no LWIR EM radiation; nor is it absorbing any.

Well, we non Ch’s know full well, that those quiet molecules, in collision, do in fact have an off and on radiative antenna, that is active during those interminably long times, it takes for the molecules to crash into each other, and then get out of each other’s way. Those collision induced radiative antennas really do emit LWIR EM radiation, but it is in a broad continuum thermal spectrum, peaking at about 10 microns wavelength; but relatively weak on a per molecule basis.

So if you view the atmosphere as a large collection of samples of 2500 air molecules, each having just one CO2 molecule just outside it, you can appreciate that most of the atmosphere is quite transparent to LWIR radiation, and only a small part is causing all of the ruckus.

Of course, I played the same trick, that the Ch’s do, and decreed, that my air samples are of dry air, because as we know, H2O is not a permanent part of the atmosphere, but just comes and goes .

One important concept to grasp.

The CO2 in absentia, is absorbing and emitting narrow line spectra EM radiant energy.

What my MG is looking at in her microscope, is 100% ……””””” HEAT ENERGY “”””…..consisting entirely of the mechanical kinetic energy of colliding molecules. Essentially NO electro-magnetic processes are involved (well apart from the collision debris which Ch’s don’t admit to.)

Antarctica ice which is, and has been for some time, at record extent levels is a thorn in the side of the high priests of Thermageddon. Somewhat like the Medieval Warm Period that they wanted to exterminate at one time. So it’s not surprising there is currently an all-out attack on this inconvenient intrusion in the “climate change/global warming” dialogue. Their approach is simply to ignore the temperature graphs (what person in the general public reads them or even understands them) and to issue from a recognised “authority” a proclamation of doom.
Yep, Foxy Loxy at work again. Watch the attached video that Disney was asked to make during the dark days of WWII. Its relevance now is surprisingly accurate.

I leave it to you to decide who might fill the character of Cocky Locky and the place called the “Cave”.
If you don’t have time to view the video here are Foxy Loxy’s strategies from his Psychology Book:
1. To influence the masses aim at the least intelligent (i.e. the Chicken Littles)
2. If you’re gonna tell a lie, don’t tell a little one – tell a BIG one
3. Undermine the faith of the masses in their leaders (i.e. Cocky Locky)
4. By the use of flattery (I hope I spelled that right!) insignificant people can be made to look upon themselves as born leaders

@ David Ball, the “34% increase in radiative forcing” stated at the beginning of the article uses what as a baseline? And how was the 85% of global warming calculated? I am no mathematician, but something about calculating and quantifying the % of radiative forcing attributed to CO2 increases just feels fuzzy. Also, through the months of reading articles and comments here I have seen an evolution of thought, largely due to the sharing of vast amounts of information, which often makes my head spin. I think you posed a valid question in the other thread (sorry, I am new to both this site and my new computer, and cut-and-paste challenged), which as I remember it was: how much of an increase is 300 ppm to 400 ppm CO2 in the atmosphere? I kinda would like to know that one… curious

George, while a lot of water is formed from fossil fuels (and cooling towers, including nuclear), that is not important: the average decay rate of any excess water vapor is only a few days. For CO2 it is ~50 years.

Further, I suppose that the 34% increase is only the extra increase of the absorption in the CO2 band, where water is not active. Thus it is extra, but not 34% of the total IR absorption. But even this tiny amount of CO2 does absorb a measurable amount of energy in its 15 micron band if you make the sample a little larger than 45 nm, some 70 km or more to be sure. That is what satellites measure: http://www.sp.ph.ic.ac.uk/news/newsmar01.html
There is a more recent difference plot, but I haven’t been able to find it again…

And how transparent is a glass window if you replace 1 in 2500 glass atoms/molecules with silver atoms?

Thank you for your response. Milodonharlani has a great post in that thread and this thread. When CO2 is given as a percentage of the atmosphere, it is 0.04%, or 4 ten-thousandths. So an increase from 3 ten-thousandths to 4 ten-thousandths would accurately be described as 1 ten-thousandth, or 0.01%.

The 34% claim is using 300 ppm as a base. An increase of 300 to 400 would be ~33.3%. This is what I describe as unadulterated bulls**t. I am not sure how many “Hiroshimas” this is.

Since CO2 global warming theory includes CO2 causing an increasing % of water vapor in the atmosphere (and I can’t speak right now as to whether that is relative or absolute, since I just got home from work), it shouldn’t be the case that CO2 alone is calculated. It should be calculated along with whatever the increase or decrease in water vapor is. If there has been a net decrease in total GHG, that could explain the pause and wipe out the CO2 theory in one fell swoop. Why? Because the increasing CO2 MUST cause water vapor, a very potent greenhouse gas, to also increase.

“Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration (NOAA).”
This has nothing to do with CO2 increasing 34%. This is the claimed increase in greenhouse radiative forcing, the amount of energy the greenhouse effect is pushing back down to warm the Earth. And it is talking about a 23-year period, not from preindustrial times, so it has no connection to the increase in CO2 from 300 ppm to 400 ppm.

So this article is claiming the greenhouse gas radiative forcing started at 333 W/m2 and increased by 113.22 W/m2 since 1990. Now in theory a doubling of CO2 would increase radiative forcing by 3.7 W/m2. So where did NOAA’s extra 109.5 W/m2 come from? That is about the same as 30.6 doublings of CO2, so with a climate sensitivity of 3.4 C per doubling from the IPCC model mean, the Earth should have warmed 104.04 C since 1990. Looks like it is worse than we thought!
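A quick check of both readings of the 34% figure. The reductio above applies 34% to the full ~333 W/m2 downwelling greenhouse flux; the press release more plausibly refers to NOAA’s Annual Greenhouse Gas Index, which tracks the anthropogenic GHG forcing alone. The 2.2 W/m2 baseline below is an assumed round number for 1990, used only for illustration.

```python
# The comment's reductio: apply 34% to the full ~333 W/m2 greenhouse flux
total_flux = 333.0
increase = 0.34 * total_flux               # 113.22 W/m2
doublings = increase / 3.7                 # ~30.6 "doublings of CO2"
implied_warming = doublings * 3.4          # ~104 C at 3.4 C per doubling

# The likelier intended reading: 34% growth of anthropogenic GHG forcing alone
f1990 = 2.2                                # assumed round number, W/m2 in 1990
f2013 = f1990 * 1.34                       # ~2.9 W/m2 in 2013

print(f"Reductio: +{increase:.2f} W/m2 -> {implied_warming:.1f} C of warming")
print(f"AGGI reading: {f1990} -> {f2013:.2f} W/m2 (+{f2013 - f1990:.2f} W/m2)")
```

Under the second reading the absolute change is well under 1 W/m2, which is why no 104 C of warming is implied.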

@David Ball and Theodore… You both answered my query! I just could not see CO2 increasing in the atmosphere by 33.3% when the metric being measured is 300 ppm to 400 ppm. Thank you, David! And my unwritten thought was exactly about how radiative forcing is calculated relative to increased CO2 measured in ppm. Thank you, Theodore!

George, while a lot of water is formed from fossil fuels (and cooling towers, including nuclear), that is not important: the average decay rate of any excess water vapor is only a few days. For CO2 it is ~50 years. …..”””””

Well Ferdinand, I deliberately left out the water, and other polar molecules to point out that a lot of the atmosphere is quite transparent to the CO2 molecular spectral lines. And your 70 km of air, consists of a large number of 45 nm samples, each of which has a CO2 molecule just outside it. The total scale doesn’t matter.

And NO, I am not one of those “how can 400 ppm of anything have an effect?” people. I never deny the CO2 or other GHG influences. But I’m glad you mentioned that “excess water” in the atmosphere can decay quickly.

Just how much water vapor is “excess”, Ferdinand ??

You just explained how rapid variations in water vapor quickly counteract the much slower changes in CO2.

In a nutshell, one can simply say that the range of temperatures and other climate conditions that Earth enjoys, and has enjoyed for eons, is simply a consequence of the various properties of the H2O molecule. Notable among those properties is the roughly 104 degree angle of the H2O molecule. Life wouldn’t exist without it.

And your 50 year lifetime of CO2 excess is nonsense.

Every year the atmospheric CO2 “excess” at the north pole and surrounds drops 18-20 ppm in just five months, due to the natural removal processes. At that rate of decline, the claimed 120 ppm excess over the “pre-industrial” comfort value would all be removed in 25 months, about two years. So following an exponential decay, 99% of the excess would be gone in 125 months (ten years and five months), not 50 years.

If our oceans were methyl alcohol, instead of water, the earth’s comfort range would be different. I have no idea whether life could evolve in such a Goldilocks state; or what sort of life it might become.

The statisticians can masticate their R values all they like; but unless they can change the water molecule angle to some value different from 104 degrees, they won’t ever change earth’s comfort range.

I’m not questioning that there has been no net warming, as your data clearly shows, Anthony, but could there still have been an increase in radiative forcing, offset elsewhere, that they were talking about?

And your 70 km of air, consists of a large number of 45 nm samples, each of which has a CO2 molecule just outside it. The total scale doesn’t matter.

Sorry, but the scale does matter: the CO2 molecules are not lined up against each other, but randomly distributed together with the 45 nm samples. Thus the probability of a 15 nm IR wave to hit a CO2 molecule is very high near ground, but less and less with reducing air density.

Just how much water vapor is “excess”

Every molecule above the maximum humidity at the temperature and pressure of any point in the atmosphere… Where it is emitted it may get into the atmosphere, but when it reaches colder places it drops out as condensation/rain; a few hundred meters from a cooling tower may be sufficient.

And your 50 year lifetime of CO2 excess is nonsense.

You make the classic mistake of looking at the residence time of CO2: that is how much CO2 of the total amount in the atmosphere is exchanged with CO2 from other reservoirs. That is ~150 GtC of the current 800 GtC or a residence time of ~5.3 years. But that is only exchange and doesn’t remove any CO2 from the atmosphere as long as ins and outs are equal over the full (seasonal) cycle.
But the ins and outs aren’t equal: there is a net uptake of ~4.5 GtC/year of CO2. That is the result of the extra pressure from 110 ppmv (234 GtC) above equilibrium which pushes more CO2 into the oceans and vegetation. The e-fold decay rate of a linear dynamic equilibrium process can be calculated:
234 GtC / 4.5 GtC/year = ~52 years. That is a half-life of ~36 years.
Far longer than the residence time, far shorter than the IPCC’s Bern model, which is based on the saturation of the deep oceans, for which there is currently not the slightest sign.
See also a similar calculation by Peter Dietze at the late John Daly’s website: http://www.john-daly.com/carbon.htm
That is already from 1997, but he found a decay rate similar to today’s…
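Ferdinand’s linear-sink arithmetic can be checked and projected forward in a few lines (his figures, straightforward arithmetic; the linear-sink model itself is the assumption under debate):

```python
import math

excess_gtc = 234.0   # ~110 ppmv above equilibrium, expressed in GtC
net_sink = 4.5       # GtC/year net uptake by oceans and vegetation

tau = excess_gtc / net_sink        # e-fold time of a linear sink: 52 years
half_life = tau * math.log(2)      # about 36 years

# Project the excess decaying under the linear-sink assumption
for t in (0, 50, 100, 150):
    print(f"after {t:3d} yr: {excess_gtc * math.exp(-t / tau):6.1f} GtC excess")
print(f"e-fold time ~{tau:.0f} yr, half-life ~{half_life:.0f} yr")
```

Note the distinction drawn above: this ~52-year figure describes net removal of the excess, not the ~5-year residence time of any individual molecule.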

Wow! Does anyone else see the really, really interesting feature in this data?

without the knock-on feedbacks. Maybe Nick has an answer for that, or maybe the incoming radiation is simply diluted below the level of detection in the oceans.

The heat content data look like a slam dunk for this, actually. Note well — the ocean warmed in the first third (but the atmosphere did not; if anything it slightly cooled). Then, while the temperatures of the atmosphere were “rocketing” up (if increasing by 0.2 to 0.3 C over 15 years can be called rocketing), the ocean nearly stopped warming! Note that the heat increase is at the level of noise, at the error/resolution limit. Then atmospheric warming paused (or “hiatused” a la AR5 :-) and the ocean resumed warming. If it weren’t for the reason below that makes me find the data set itself deeply suspect, I’d say that this is very strong evidence that a) the ocean buffers atmospheric temperatures; b) the rate at which the ocean warms countervaries with atmospheric temperature. This is highly counterintuitive — one rather expects a warmer atmosphere to produce warming oceans, but in fact an atmosphere that is not warming is well correlated with ocean warming; c) it isn’t clear if a) and b) should refer to the rate at which the atmosphere warms or to its temperature. Curiously, the data suggest that it should be a correlation of rates, but that makes very, very little sense. The atmosphere has almost no heat capacity relative to the ocean, remember, and it is difficult to understand how it can retain heat for times that are long compared to its thermal relaxation times (all on the order of hours to days — without sunlight the atmosphere would almost immediately cool to very cold indeed in a matter of a week or two) without the help of the ocean.

This is really a substantial puzzle in the non-Markovian dynamic evolution of the heat content of the Earth.

[As an aside, note that one unbelievable thing about this data set is the error column and its explicit assertion that we were measuring oceanic heat content in the 1960s at the same precision that we measure it today with ARGO. But that’s another story; as always in climate science, when considering statistical error, AFAICT the error bars are all of the “and then a miracle happened, and ARGO buoys — or thermometers, or proxies, or tide gauges — were transported back to the 1960s (or, what the heck, the 1860s), so that the error estimate then is about the same as it is today” variety. I don’t know that it makes me doubt the whole thing, but it makes me wonder how the hell they compute both the numbers presented and their error.]

The saddest part about things like this piece and those that defend it with their posts is what do they REALLY think is in it for them? Do they REALLY think that they will be part of the 10% of the population the elite will show favor to by keeping alive when the preparations for “global warming” have been completed – tear-down of the energy system and reduction in arable land dedicated to food production – and we are faced with the reality of a little ice age or worse? Useful idiots they may be at this time, but are they the constructive, productive portion that will be retained to serve as the servants to the privileged? I doubt it. And I truly have a difficult time believing that they can actually BELIEVE what they are saying, either. They would have to be “blind with eyes wide open.”

No, we’re not. I’m perfectly happy to attack “you” for misusing standard, well-defined mathematical terminology on this thread as well. And the silly thing is, you could avoid making the error a second time so very easily. You are absolutely welcome to call the increase in atmospheric concentration of CO_2 since maybe 1950 “0.01%” (added to 0.03%). Just do not say “the percent increase in atmospheric CO_2 is 0.01%”, because that is not the standard meaning of percent increase, as I pointed out with a round dozen or more examples drawn from math, science, medicine, and information websites, most associated with things like Universities and math departments.

High school math. A change in concentration is not referred to as “percent increase”. The latter is a relative term. The former is an absolute term. Everybody knows this and understands this. The only one inventing a political language here is you, and you don’t need to to make your point.

I’ve got to agree with Nick Stokes. There is nothing wrong with the NOAA statement. The fact that we haven’t seen warming doesn’t mean there hasn’t been a “warming effect”. As sceptics, we might be able to argue that this provides evidence that sensitivity to ghg forcing is not as high as that claimed by the CAGWers but we can’t claim that ghg forcing has not increased.

What this shows is that different combinations of parameters yield a (near) identical past but very different futures. Thus, one cannot rely on hind-casting as a measure of forecasting skill.

What seems most remarkable to me is that the climate science community seems blissfully unaware of this problem. It is well known in other disciplines. Certainly it is well known in computer science.

Your average 5 year old would not be fooled by this. But somehow university educated Climate PhD’s don’t get it.

What I find interesting is that the models are telling us that the same past can yield a range of very different futures. This tells me that natural variability is not “noise” as it is commonly assumed in climate science, and thus cannot be treated as such mathematically.

Interesting result. What happens if you reduce the CO2 and aerosol sensitivity even further?

Below 1.5°C/2xCO2, the curves of the past diverge more and more. But still, the main trends are the same, with of course less and less future warming. The whole exercise is with only the four main “forcings,” without any feedbacks. With the help of feedbacks like water vapor/clouds, the whole thing on a spreadsheet probably may show even less sensitivity for CO2. Or by introducing the remarkable PDO-temperature correlation and other natural variability…
One of the ironic points is that such a simple spreadsheet has the same or even better results for the retrofit (and thus for the future?) than the multi-million-dollar GCMs running on supercomputers.
See: http://www.economics.rpi.edu/workingpapers/rpi0411.pdf

the climate science community seems blissfully unaware of this problem.

And your 70 km of air, consists of a large number of 45 nm samples, each of which has a CO2 molecule just outside it. The total scale doesn’t matter.

Sorry, but the scale does matter: the CO2 molecules are not lined up against each other, but randomly distributed together with the 45 nm samples. Thus the probability of a 15 nm IR wave to hit a CO2 molecule is very high near ground, but less and less with reducing air density. …..”””””

Well Ferdinand, I have never argued against what you just said; well, I’ll give you your 15 nm typo; we all know you meant micrometers.

All I have simply argued, is that each CO2 molecule, which gladly intercepts LWIR photons, in narrow spectral lines generally around 15 micron wavelength, whether it’s at low or high altitude, is accompanied (on average) by 2499 other quite benign molecules, which don’t, and which are quite happy to transmit, both those 15 micron lines, as well as virtually all of the other wavelengths of radiation, running around in the atmosphere.

The absorption of those 15 micron photons, apparently sets the CO2 molecule off in a bending mode of (purely internal) oscillation; which it would seem does not change the global kinetic energy of the CO2 molecule, and hence doesn’t change its Temperature. Ultimately, that internal energy should be re-emitted, possibly prematurely as a result of a collision with some other molecule, and I would presume that some frequency shift occurs as a result.

So how big do you think a 15 micron wavelength photon is; or does it have a size at all ??

If you are looking for some claim from me, that CO2 or other GHG does not, or cannot heat the atmosphere, you might as well forget it. I’ve never held or expressed any such view, and have strongly warned against such claims.

Lots of things affect the temperatures and energy transport in the atmosphere; the GHG effect being just one of those.

But ultimately, it is the set of processes that result from the physical properties of the H2O molecule, that establish, and regulate, the range of temperatures found on the earth.

When a government raises taxes, to get more revenue; taxpayers stop doing, or reduce their taxable activities, to counteract the effect on them.

Planet earth’s environment acts on the exact same principles, and it will set its own agenda, based on the properties of the materials in its environment; H2O being, by far the most influential material.

The sun’s energy powers the system; the physical properties of water establish the comfortable range (of the system), and all these other extraneous factors have very little effect.

I deliberately asked you how much water in the atmosphere is excess. There IS no excess; there’s a vast supply of water available in the system, to be moved around in whatever way is needed to establish stable conditions. If you introduce a perturbation, such as burning a hydrocarbon to add CO2 and H2O to the system (not just the atmosphere), the feedbacks take over and readjust whatever needs adjusting to counteract the perturbation, and that applies to the injected CO2 as well as the injected water, or anything else.

Mercury and Mars have no water oceans, and so they are at the mercy of the sun’s fluctuations.

The absorption of those 15 micron photons, apparently sets the CO2 molecule off in a bending mode of (purely internal) oscillation; which it would seem does not change the global kinetic energy of the CO2 molecule, and hence doesn’t change its Temperature. Ultimately, that internal energy should be re-emitted, possibly prematurely as a result of a collision with some other molecule, and I would presume that some frequency shift occurs as a result.

So how big do you think a 15 micron wavelength photon is; or does it have a size at all ??

This isn’t the right question. The question is, “What is the absorption cross-section for a 15 micron photon”. That’s the effective surface area intercepted by each CO_2 molecule. It is large enough that the mean free path of LWIR photons in the pressure-broadened absorption bands of CO_2 in the lower atmosphere is order of a meter. That means that LWIR photons — whatever their “size” — with frequencies in the band go no more than a meter or few before they are absorbed by a CO_2 molecule.

The lifetime of the excited state(s) is much longer than the mean free time between molecular collisions between the CO_2 molecule and the (usually nitrogen or oxygen or argon) other molecules in the surrounding gas. That means that the radiative energy absorbed by the molecule is almost never resonantly re-emitted, it is transferred to the surrounding gas, warming not just the CO_2 but the oxygen, nitrogen, water vapor, argon as well as the other CO_2 molecules around. Periodically CO_2 is thermally excited in-band by just such a collision and radiates energy away, but it is not like an elastic scattering process such as occurs in specular reflection within clouds. In band/thermal radiative energy gradually diffuses upwards, with the mean free path of the photons increasing the higher one goes, until it starts to equal the remaining depth of the atmosphere and photons emitted “up” have a good chance of escaping, cooling the molecules (on average) that emit them. It takes order of 100s of absorptions and emissions for radiation to diffuse upward to escape, and there is an almost equal probability that radiation will diffuse downward (especially from the lower levels) where we observe it as back-radiation/greenhouse radiative forcing of the surface.

Even this is oversimplified. Because of pressure broadening, molecules close to the ground emit photons “in the wings” at frequencies that less broadened molecules at higher altitudes/lower pressures are nearly transparent to. That means that there is a steady CO_2-mediated “leakage” even from lower altitudes directly to space from the edges of the monotonically decreasing-with-height absorptive bandwidth. It also means that there is a MAJOR change in atmospheric absorptivity/emissivity with simple high and low pressure centers as they move around, as well as a modulation of the size of the emission-wing “hole”.
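The upward-diffusion picture above can be caricatured as a toy one-dimensional random walk between the surface and the top of the band-opaque column. The 20-layer depth and the starting layer are illustrative assumptions, not measured values, but they reproduce the qualitative claims: only a small fraction of photons emitted near the surface ultimately escape, and those that do take on the order of a hundred absorption/re-emission steps:

```python
# Toy model: a photon random-walks between the surface (layer 0) and
# space (layer N_LAYERS), being absorbed and isotropically re-emitted
# (50/50 up or down) at each layer. Layer count is an assumed number.
import random

random.seed(42)
N_LAYERS = 20
N_PHOTONS = 20000

def walk(start=1):
    """Return (#absorptions, escaped_top) for one photon."""
    pos, steps = start, 0
    while 0 < pos < N_LAYERS:
        pos += random.choice((-1, 1))
        steps += 1
    return steps, pos == N_LAYERS

results = [walk() for _ in range(N_PHOTONS)]
escaped = sum(1 for _, up in results if up)
mean_steps = sum(s for s, up in results if up) / escaped

print(f"Fraction escaping to space: {escaped / N_PHOTONS:.3f}")
print(f"Mean absorptions before escape: {mean_steps:.0f}")
```

The escape fraction for this symmetric walk is just start/N_LAYERS (gambler’s ruin), and the escapees average roughly (N² − 1)/3 ≈ 133 absorptions — “order of 100s”, as the comment says.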

Janice Moore says on May 26, 2014 at 9:00 pm: “[CO2] was responsible for 85% of {nothing}.”

You are absolutely correct. The explanation comes from the fact that the Arrhenius greenhouse theory they use is invalid. It should be clear to anyone who pays attention that if a theory predicts warming and nothing happens for 17 years, that theory is wrong and must be discarded.

The only greenhouse theory that correctly explains the “pause” is the Miskolczi greenhouse theory (MGT). That is because it is capable of handling the general case where more than one greenhouse gas simultaneously absorbs in the IR. In such a case the gases present jointly establish an optimal absorption window that they control. For Earth’s atmosphere the gases that count are carbon dioxide and water vapor. Their joint optimal absorption window has an optical thickness in the IR of 1.87, determined by Miskolczi from first principles. If you now add carbon dioxide to the atmosphere it will start to absorb, just as Arrhenius says. But this will increase the optical thickness of the absorption window. And as soon as this happens, water vapor will start to diminish and rain out, and the optical thickness is restored to its original value. That is exactly why we have the pause, where carbon dioxide increases but does not cause warming.

This is actually not the first time this has happened. The last time was in the eighties and nineties, just before the super El Nino of 1998 arrived. It lasted for 18 years and might have continued if the super El Nino had not changed everything. The problem with that particular standstill is falsified temperature curves that show its temperature sloping up to create false warming. You can easily tell that they are falsified by comparing them to satellite data, which they still do not control. An older example comes from the NOAA weather balloon database that goes back to 1948. Miskolczi used it to study the absorption of IR by the atmosphere over time and found that absorption was constant for 61 years while carbon dioxide increased by 21.6 percent over the same period.

This is an exact parallel to what is happening with the pause today. And what this means is that sensitivity is effectively zero, because doubling carbon dioxide will not cause any warming. And that poor AGW that you were concerned with? It simply does not exist, sorry about that. It is nothing more than a pseudo-scientific fantasy, built up by the IPCC on the assumption that Hansen observed the greenhouse effect in 1988. It turns out that Hansen did not observe any such thing in 1988. Checking the Congressional Record, I find that he claims to have observed a hundred-year warming that can be nothing but greenhouse warming. But all is not well with his century of warming. It turns out that he uses ground-only temperatures that show warming between 1880 and 1910. Had he used the Land-Ocean version of GISTEMP available to him, it would have shown cooling, not warming, in this time slot. But it gets worse. He also includes a non-greenhouse warming that starts in 1910 and stops in 1940 as part of his 100-year greenhouse warming. It appears that we must remove everything before 1940 from his greenhouse century if we want a greenhouse curve. And what is left of it is a wiggly segment: 25 years of cooling followed by 23 years of warming. No way can this remnant curve be used to prove the existence of the greenhouse effect. But nobody checked his work, and he has gotten away with it for the last 26 years. And a massive IPCC organization, built upon the assumption that the greenhouse effect exists, is now taking over the world.

I made that simple arithmetic calculation out loud while watching Bill Nye the Anti-Science Guy, who cited a 30% increase in CO2, & Marsha Blackburn on TV with friends. I was disappointed that Rep. Blackburn didn’t make the same point. I then added the somewhat harder to explain factors of logarithmic response & overlapping absorption bands to show how negligible the gain from 280 to 400 ppm is in effect.
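The logarithmic response mentioned above is easy to put numbers on with the commonly used simplified forcing expression ΔF = 5.35·ln(C/C₀) W/m² (Myhre et al., 1998). This is a standard approximation, not a full line-by-line calculation; whether the result counts as negligible is left to the reader:

```python
# Simplified CO2 radiative forcing: dF = 5.35 * ln(C/C0) W/m^2
import math

ALPHA = 5.35  # W/m^2, coefficient of the Myhre et al. approximation

def co2_forcing(c_new, c_ref):
    """Forcing change for a CO2 concentration change, in W/m^2."""
    return ALPHA * math.log(c_new / c_ref)

rise = co2_forcing(400, 280)      # pre-industrial 280 ppm to today's 400 ppm
doubling = co2_forcing(560, 280)  # a full doubling, for comparison

print(f"280 -> 400 ppm: {rise:.2f} W/m^2")
print(f"280 -> 560 ppm (doubling): {doubling:.2f} W/m^2")
print(f"Fraction of a doubling so far: {rise / doubling:.0%}")
```

Because of the logarithm, the 43% concentration rise from 280 to 400 ppm delivers only about half the forcing of a full doubling.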

You were rightly criticized there for your inability to do high school math, namely your assertion that an increase in CO2 from 300ppm to 400ppm was not an increase in CO2 of 33%. Either that or you could do the math and were trying to mislead others?

Yet here we are on this thread all agreeing that it is Bulls**t.

No, what is being criticized by Anthony here is the statement that “Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases,” because it hasn’t led to an increase in global average temperature. If that was what you were trying to say you expressed it extremely poorly.

Dang it, late to the game again as I was in training all day but thought I might throw this potential gem at y’all. On the dry erase board adjacent to my cubicle here at work, I (at least monthly) put up a “Thought of the Month”, usually themed around a quotation or other sciencey thingamabob I found interesting at the time, such as one of my favorites from H. L. Mencken; “The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary.”
Friggin’ classic, what? This quote sat in my “Thought of the Month” box on the dry erase board for about two months or more over the Christian holidays, just because I liked it so much and hey – I control the board. And, as I am considered an “Old Sage” here at the job, many people come by to sit and consult (read: Shoot the $%!+) – and they can’t help but see the board (among other choice Science Realist stuff I have taped up there). Quite the conversation starter it was.
But I digress! Right now here is what fills the space on the board:
“Science Thought of the Month”
Total Earth Atmosphere = 100.00% (All Gases)
CO2 = 000.04% (now +/-400 ppm)
Delta or change in CO2 since onset of “Industrial Age” = 000.012% (+/-120 ppm)
Or, a change in total “average” atmospheric CO2 from around 280 ppm in the mid-to-late 1800s up to the current 400 ppm.
Such a small number as to be difficult for the average Joe (or Jane) to fathom. As it should be. Can such a small total change matter? Yes, but on a small scale. Can this small change lead to catastrophic, runaway, never-to-return-to-normalcy “GLOBAL WARMING”? (shout intended, sorry, should have warned you so you could cover your ears) I’ll let you decide.
Regards,
Michael C. Roberts

“This should serve as yet another wakeup call about the constantly rising levels of greenhouse gases which are driving climate change. If we are to preserve our planet for future generations, we need urgent action to curb new emissions of these heat trapping gases,” said WMO Secretary-General Michel Jarraud. “Time is running out.”

Preserve our planet for future generations?!? Oh, come on now, get a bloody grip. Conserve, I’ll give you, but that would make no sense in this context, nor is it a simple slip of the tongue. Total scare-mongering, trough-snouting, agenda-driving horse manure.

rgb, thanks for the help and education. I know some basics of absorption etc., but not that many details. The only direct confrontation I had with absorption rates was for Cl2 monitoring in the production end of the HCl plant where H2 and Cl2 were burned together to form HCl with an oversupply of H2…

Dr Brown says: “The lifetime of the excited state(s) is much longer than the mean free time between molecular collisions between the CO_2 molecule and the (usually nitrogen or oxygen or argon) other molecules in the surrounding gas. That means that the radiative energy absorbed by the molecule is almost never resonantly re-emitted, it is transferred to the surrounding gas, warming not just the CO_2 but the oxygen, nitrogen, water vapor, argon as well as the other CO_2 molecules around. Periodically CO_2 is thermally excited in-band by just such a collision and radiates energy away”.

There seems to be a contradiction here: first you say the lifetime of the excited state is so long that a CO2 molecule passes its energy to surrounding molecules and it’s practically never re-emitted. Then at the end of the quote you say that when a CO2 molecule is excited it “radiates energy away” and this happens “periodically”. So which is correct, “almost never” or “periodically”? If a CO2 molecule is excited by whatever means (radiation directly from the surface or through contact with another GHG molecule), surely the long period of excitation means it’s mostly not re-radiated and instead passed on to other molecules?

The absorption of those 15 micron photons apparently sets the CO2 molecule off in a bending mode of (purely internal) oscillation, which it would seem does not change the overall kinetic energy of the CO2 molecule, and hence doesn’t change its temperature. Ultimately, that internal energy should be re-emitted, possibly prematurely as a result of a collision with some other molecule, and I would presume that some frequency shift occurs as a result.

So how big do you think a 15 micron wavelength photon is, or does it have a size at all?

This isn’t the right question. The question is, “What is the absorption cross-section for a 15 micron photon?” That’s the effective surface area intercepted by each CO_2 molecule. It is large enough that the mean free path of LWIR photons in the pressure-broadened absorption bands of CO_2 in the lower atmosphere is on the order of a meter. That means that LWIR photons — whatever their “size” — with frequencies in the band go no more than a meter or few before they are absorbed by a CO_2 molecule……”””””

Well, I am not in any disagreement with the explanation that RGB gives here or elsewhere. He in fact makes a point that I could not make as he does, in that I can’t get my mind around the quantum mechanical picture of this situation, since my own knowledge of QM stopped well short of such understanding. I have always assumed that Phil too is fluent in this field.

The whole point of my 45 nm sample of “air” was in fact to present a “classical” picture of the atmospheric gas at a small scale: a single CO2 molecule as a tiny part of a gas sample that is clearly big enough (at 2500 molecules) to demonstrate the basic properties of a real gas in terms of MB statistics, yet a total sample that is itself tiny compared to the wavelength of the EM wave associated with the “quantum” that the even smaller CO2 molecule is supposed to swallow.

So my question, “How big is a 15 micron photon anyway?” was really tongue in cheek, but asked to point out the weird QM picture of reality: the supposed “wavelength” of the EM wave associated with an approximately 83 meV photon is huge compared to the 45 nm air sample, which itself is huge compared to that single CO2 molecule, which we all believe can and will gobble up that photon.

Usually, in “particle” physics, we think of sub-atomic-sized “things” hitting a similarly small target (cross-section) and interacting in the event that the target is hit. That’s much closer to our common experience of “target shooting”, but the photon hitting the molecule seems completely backwards, and not at all similar to our street experience.

A similar “absurdity” image would be an alternative description of meteor showers; those flashes and streaks of light in the sky.

We could describe these common events as simply the earth “landing on” other heavenly bodies; they just happen to be the size of dust particles, and get “squished” in the process. Yes, a weird view, but actually a quite accurate one. Who is to say which is the lander, and which the landee?

Or we could say, that you can weigh the entire earth on any ordinary bathroom scale, by simply putting the earth “on” the bathroom scale. Well the earth is “down there”, so of course you have to turn the bathroom scale, upside down, and then put the earth on it. When I do it, I get about two pounds for the “weight” of the earth. Gravity on my bathroom scale is rather small. I can increase it, by putting the scale on top of my shoe soles, before putting the earth on it, and with that much greater mass and gravity, the weight of the earth increases up to about 180 pounds.

Again, a weird, but quite real view of reality.

RGB says the photon travels a meter before getting captured. I take that as true since I have no idea how one computes it, but it suggests that the photon must pass through a vast number of 45 nm air samples and CO2 molecules before it scores a hit on the target.

That’s a picture quite foreign to my daily experience, but if you can see it as just a ho-hum ordinary event, then your imagination is much wilder than mine. To me that is weird. What QM tells us is reality, is also weird. That is not disputing it; just suggesting that truth really IS stranger than fiction.

I did notice that Robert mentioned pressure broadening, but not temperature (Doppler) broadening. I don’t have a good mental picture of the relative importance of the collision versus Doppler broadening components at these energy levels.

The wave / particle duality of EM radiation, forces us to confront seemingly absurd pictures of (evidently “large”) photons “landing on” a tiny molecule, and being absorbed by it.

The fact that much of reality is “weird”, does not dispute the accuracy of it.

Even Einstein thought QM was weird, even as he was constructing a solid case for its reality.

“””””…..from RGB…..Even this is oversimplified. Because of pressure broadening, molecules close to the ground emit photons “in the wings” at frequencies that less broadened molecules at higher altitudes/lower pressures are nearly transparent to. That means that there is a steady CO_2-mediated “leakage” even from lower altitudes directly to space from the edges of the monotonically decreasing-with-height absorptive bandwidth. It also means that there is a MAJOR change in atmospheric absorptivity/emissivity with simple high and low pressure centers as they move around, as well as a modulation of the size of the emission-wing “hole”……”””””

I have many times described this exact process, as it relates to the isotropic emission of LWIR radiation at any layer in the atmosphere.

We expect that half of such radiation goes up, and half goes down. BUT, the upward radiation encounters a COLDER and LESS DENSE higher atmosphere layer, with NARROWER spectral absorption lines; while the downward radiation encounters a HOTTER and DENSER lower atmosphere with BROADER spectral absorption lines.

So this favors the escape route to space, over the downward return to the surface, where re-absorption becomes increasingly probable. NO it doesn’t stop the downward radiation from reaching the surface; but each absorption and re-emission event launches a new isotropic 50-50 split.

This argues that atmospheric warming, whether by escaping LWIR radiation, or incident solar spectrum shorter wave radiation absorbed by GHG such as H2O, O3 or CO2, must result in more than 50% escaping to space, and less than 50% eventually returning to the surface (as “downward” or “back” radiation), which of course some people try to argue “doesn’t exist” or is “impossible”. It happens.
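The 50-50 argument above can be made quantitative with the classic gambler’s-ruin formula for a biased random walk. The layer count, starting layer, and the size of the upward tilt below are all illustrative assumptions; the point is only that even a slight per-step asymmetry between the “narrower lines above, broader lines below” directions shifts the escape fraction well past 50%:

```python
def escape_probability(p_up, start, n_layers):
    """Gambler's-ruin probability that a photon random-walking between the
    surface (layer 0) and space (layer n_layers) exits at the top, given a
    per-step probability p_up of moving upward."""
    if p_up == 0.5:
        return start / n_layers
    r = (1 - p_up) / p_up
    return (1 - r**start) / (1 - r**n_layers)

# Toy numbers: a photon emitted mid-column in a 20-layer atmosphere.
fair = escape_probability(0.50, 10, 20)
tilted = escape_probability(0.52, 10, 20)  # assumed 2% upward tilt per step

print(f"Symmetric 50/50 split: {fair:.2f} escapes to space")
print(f"2% upward tilt per step: {tilted:.2f} escapes to space")
```

A mere 52/48 tilt per emission turns an even split into roughly a 69/31 split in favor of escape, because the bias compounds over many absorption/re-emission steps.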

There seems to be a contradiction here: first you say the lifetime of the excited state is so long that a CO2 molecule passes its energy to surrounding molecules and it’s practically never re-emitted. Then at the end of the quote you say that when a CO2 molecule is excited it “radiates energy away” and this happens “periodically”.

As far as I remember, the absorbing and emitting CO2 molecules are in general not the same molecule. When a 15 micron photon is absorbed, most CO2 molecules hit another molecule (of any kind) and redistribute their energy before they can emit the 15 micron photon again. What we call temperature reflects the average kinetic energy of all the molecules present: some carry more energy than others. If one of the high-energy O2 or N2 or… molecules hits a CO2 molecule, the redistributed energy can be high enough for the CO2 molecule to emit a 15 micron photon…

“If one of the high energy O2 or N2 or… hits a CO2 molecule, the redistributed energy can be high enough to get the CO2 molecule to emit a 15 micron photon…”

Thanks for your response. However, if:

“The lifetime of the excited state(s) is much longer than the mean free time between molecular collisions between the CO_2 molecule and the (usually nitrogen or oxygen or argon) other molecules in the surrounding gas”

Then surely, when the high-energy O2 or N2 molecule hits the CO2 molecule in your example, the energy goes into inducing an excited state in the CO2 molecule. Because this state lasts longer than the mean free time between collisions, once more the CO2 molecule does not emit a 15 micron photon; the energy just gets passed on again to another molecule it collides with.

So CO2 will almost never emit a photon unless the atmospheric pressure is such that the mean free time between collisions is longer than the lifetime of the excited state (i.e. higher up in the atmosphere).
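One way to see how “almost never” (per excitation) and “periodically” (across the vast population of molecules) coexist is to compare the two competing rates directly. Both lifetimes below are order-of-magnitude assumptions, not measured spectroscopic values:

```python
# Competition between radiative decay and collisional quenching for an
# excited CO2 bending mode. Both timescales are order-of-magnitude
# assumptions for the lower atmosphere.

TAU_RAD = 1.0      # assumed radiative lifetime of the excited state, ~1 s
TAU_COLL = 1.0e-9  # assumed mean time between collisions at sea level, ~1 ns

# Rates are inverse lifetimes; the branching ratio is the chance that a
# given excitation radiates before a collision carries the energy away.
k_rad = 1.0 / TAU_RAD
k_coll = 1.0 / TAU_COLL
p_emit = k_rad / (k_rad + k_coll)

print(f"Probability of re-emission per excitation: {p_emit:.1e}")
```

A per-excitation emission probability around one in a billion is “almost never” for any single excitation event, yet with enormous numbers of CO2 molecules being collisionally excited every second, the gas as a whole still radiates continuously, which is the “periodically” in rgb’s description.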

The lack of warming in the cited time period has nothing to do with a change of atmospheric CO2 being ineffective at changing global temperature. It has to do with the Atlantic Multidecadal Oscillation peaking out early in the cited time period, and mostly declining during this period.

This is related to most advocates of the existence of manmade global warming exaggerating it. They like to blame CO2 increase alone for the increase in global temperature from the early 1970s to the 2004-2005 peak of smoothed global temperature, while a significant part of that increase was from the Atlantic Multidecadal Oscillation and the Pacific Decadal Oscillation being in their warming phases.

“This indicates that the radiative forcing variables used to simulate the GCM have explanatory information about observed surface temperature that is not present in the GCM simulation for global surface temperature.”

“None of the GCM’s have explanatory power for observed temperature additional to that provided by the radiative forcing variables that are used to simulate the GCM.”

In other words, the variables themselves have better predictive power than the GCMs. None of the GCMs add any information that is not already in the variables.

In other words, a simple model will outperform the GCMs, because the GCMs are removing information. The GCMs add no information.

Thus the billions of dollars spent on supercomputer simulations are simply a waste of money as far as predicting the future climate goes. One could simply make a simple model of the input parameters and arrive at a more accurate prediction of future climate than that provided by the climate models.

This confirms what Willis posted a while back: that the climate models can all be reduced to a simple “black box” equation.

You are another theoretician who ignores observations and is stuck on theory. The record shows no warming over the last seventeen years, and you vaguely refer to the “warming effects”. Your claim that GHG forcing has increased is presumably based on theory, because you can give no observational support for this assertion.

The statement that there has been warming these past seventeen years is false and is apparently made for propaganda reasons, not for scientific discussion.

Anthony called out Nick for his support of this blatant propaganda. If you call yourself a skeptic, then please do not provide support to alarmist propaganda.

I think that it is a matter of probability (but RGB may give a much better answer…). Even if there is only a 1% chance of re-emitting instead of redistribution of the absorbed energy, that is enough to have the energy lost to space at the end of the many collisions that occur in the atmosphere.

Ferdinand Engelbeen says:
May 29, 2014 at 9:15 am
Not a surprise that Kaufmann (not a skeptic at all) didn’t find an editor willing to publish his work…
=================
yet, from a quick look the mathematics is sound in its approach. and it is certainly significant in its contribution to new knowledge.

the primary suspicion is that the results would threaten the livelihood of a number of well funded and well established scientists and laboratories. people tend to react violently when their livelihoods are threatened.

ferdberple says: May 29, 2014 at 4:13 pm
“Ferdinand Engelbeen says:
May 29, 2014 at 9:15 am
Not a surprise that Kaufmann (not a skeptic at all) didn’t find an editor willing to publish his work…
=================
yet, from a quick look the mathematics is sound in its approach. and it is certainly significant in its contribution to new knowledge.”

Well, it’s not new knowledge. It’s an unpublished 2004 paper, and the GCM runs it describes are from last century.

And I think FE is thinking of a different Kaufman. These guys are economists.

For 14 years, between 1996 and 2010, they measured the amount of radiative forcing at ground zero for Mann Maid Glow Bull Warming: the North American Great Plains.

They found, using instruments and a timeframe selected by themselves, less radiative forcing than when they started checking.

Three quarters of a million measurements (actually 800,000) over fourteen years: starting two years before 1998 and running all through this recent “hottest decade”.

“A trend analysis was applied to a 14-yr time series of downwelling spectral infrared radiance observations from the Atmospheric Emitted Radiance Interferometer (AERI) located at the Atmospheric Radiation Measurement Program (ARM) site in the U.S. Southern Great Plains. The highly accurate calibration of the AERI instrument, performed every 10 min, ensures that any statistically significant trend in the observed data over this time can be attributed to changes in the atmospheric properties and composition, and not to changes in the sensitivity or responsivity of the instrument. The measured infrared spectra, numbering more than 800 000, were classified as …

The AERI data record demonstrates that the downwelling infrared radiance is decreasing over this 14-yr period in the winter, summer, and autumn seasons but it is increasing in the spring; these trends are statistically significant and are primarily due to long-term change in the cloudiness above the site.”

Take a look at the organizations that secured funding for and did this study. It was done by NOAA scientists, not skeptics of the story.

“CO2 remains in the atmosphere for hundreds of years. Its lifespan in the oceans is even longer. It is the single most important greenhouse gas emitted by human activities. It was responsible for 85% of the increase in radiative forcing – the warming effect on our climate – over the decade 2002-2012.

Between 1990 and 2013 there was a 34% increase in radiative forcing because of greenhouse gases, according to the latest figures from the U.S. National Oceanic and Atmospheric Administration (NOAA).”

Nick Stokes says:
May 29, 2014 at 4:20 pm
These guys are economists.
=============
Yes, most of the mathematics of analyzing forecasts has come from economics. Climate Science has yet to develop anything comparable.

Which is unfortunate because this would allow Climate Science to eliminate those models that are faulty. The problem for the IPCC is that they have no way to judge which models are better than others, so they need to include them all. By failing to learn and apply the techniques of other disciplines they are doomed to repeat their mistakes.

This isn’t only an IPCC problem, it is a Climate Science problem. Yet economics already knows how to evaluate models. If a model has no more skill than its inputs it is a worthless model and should be rejected, because it delivers no value. Except of course to bamboozle those dishing out the grant monies.

Nick Stokes says:
May 29, 2014 at 4:20 pm
Well, it’s not new knowledge.
==============
Has there been a more recent study of climate model forecasts? Have the models been categorized by the nature of their error? Have the model skill levels been analyzed to confirm that they exceed those of the inputs? Please provide the links, thanks.

While an economist, he did see the same problems as occurred with the large multivariable programs once used in economics, which all failed, while the simple programs with fewer variables came closer to what really happened…

About the skills of the models: here is a comparison between the temperature trend (RSS MSU lower troposphere) since 1996.8, the longest “pause” of all trends, and the temperature increase caused by CO2 over the same time frame, assuming no feedbacks (0.9 C for 2xCO2 according to Modtran). And here is the same for the average of the IPCC range of 1.5-4.5 C, thus 3 C for 2xCO2.
It may be clear that all models assuming a high sensitivity beyond 3 C for 2xCO2 are out of touch with reality.

There is a small error in the plots above, as the trend line for the T-CO2 relationship should be logarithmic but is here linear, as no other ratio is possible in Wood for Trees. But the deviation is small.
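For anyone who wants to reproduce the arithmetic, here is a sketch of the logarithmic relation. The start and end concentrations are assumed round numbers for the window in question, not the actual station values:

```python
# Warming implied by a CO2 change under a logarithmic relation:
# dT = S * log2(C_new / C_ref), where S is the sensitivity per doubling.
import math

def delta_t(sensitivity_per_doubling, c_new, c_ref):
    return sensitivity_per_doubling * math.log2(c_new / c_ref)

# Illustrative concentrations for the "pause" window (~1997 to ~2014).
C_START, C_END = 362.0, 398.0  # ppm, assumed round numbers

no_feedback = delta_t(0.9, C_END, C_START)  # 0.9 C/doubling, no feedbacks
ipcc_mid = delta_t(3.0, C_END, C_START)     # IPCC mid-range, 3 C/doubling

print(f"No-feedback warming over the window: {no_feedback:.2f} C")
print(f"3 C/doubling warming over the window: {ipcc_mid:.2f} C")
```

Under these assumptions the no-feedback case implies roughly a tenth of a degree over the window, while the 3 C mid-range case implies around four tenths, which is the gap the comparison with the flat trend is meant to expose.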

Yes, my mistake. I had forgotten that he was also the author of the 2011 paper.

That is actually relevant to this thread, as he says that, indeed, CO2 forcing increased during the period, but was balanced by a rise in negative sulfate forcing.

“It may be clear that all models assuming a high sensitivity…”

My old complaint – models do not assume a sensitivity. But the logic of Kaufmann’s paper says that it isn’t a sensitivity matter. Total forcing did not increase, so no temperature rise is expected from forcing, whatever the sensitivity.

Except that the increase in SO2 doesn’t exist: the increase in SE Asia is mostly balanced with the decrease in Western Europe and North America… And the brown aerosols over India are warming, not cooling… Moreover, the alleged influence of SO2 is far overblown in a lot of models, which leads to the huge differences in sensitivity, which is mainly the result of the CO2 – SO2 tandem. But if SO2 is overblown, then there is an increase in forcing without any result in temperature…

While an economist, he did see the same problems as occurred with the large multivariable programs once used in economics, which all failed, while the simple programs with fewer variables came closer to what really happened…
=================
Anyone with an applied mathematics background in computers knows why this happens. Even the most rudimentary of linear techniques suffers from this problem.

Remember solving systems of linear equations in high school? You tried to solve for (x,y,z) using 3 linear equations. The object was to get all 1’s on the diagonal of the matrix, and 0’s elsewhere. The values for your unknowns would then be to the right of the “=” sign.

But guess what happens when you try this on a computer. Even when programmed perfectly, computers are not exact; they have a small round-off error. And even with only 3 linear parameters this round-off error produces errors in the results.

And this is a trivial linear problem. As you add more parameters, the errors increase, typically in a non-linear fashion. This problem not only applies to the round-off in the computer, it also applies to measurement errors in the input parameters. As you add parameters, the errors overwhelm the results.
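The amplification described above is easy to demonstrate with a nearly singular system; here Cramer’s rule stands in for elimination (the amplification is a property of the system, not the solver). A one-part-in-twenty-thousand perturbation of a single input, far below most measurement error, flips the answer completely:

```python
def solve2(a, b, c, d, e, f):
    """Solve the 2x2 system  a*x + b*y = e,  c*x + d*y = f  by Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# An almost-singular system: the two equations are nearly the same line.
x1, y1 = solve2(1.0, 1.0, 1.0, 1.0001, 2.0, 2.0001)  # exact answer (1, 1)

# Perturb one right-hand-side value by one part in 20,000.
x2, y2 = solve2(1.0, 1.0, 1.0, 1.0001, 2.0, 2.0002)

print(f"Original solution:  x={x1:.4f}, y={y1:.4f}")
print(f"Perturbed solution: x={x2:.4f}, y={y2:.4f}")
```

The solution jumps from (1, 1) to (0, 2): the tiny determinant divides, and therefore amplifies, every small error in the inputs, which is exactly the ill-conditioning problem the comment describes.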

And heaven help you if you try to go beyond a linear model. Nonlinear models are so sensitive to small errors as to be largely unsolvable. Sure, you can solve them, but the results are unreliable.

Thus it has been found that simple models typically outperform complex models, because they are better able to control the size of the error.

Unfortunately, climate science doesn’t like this answer, so they try to sweep it under the table. They argue that economists are not climate scientists. But the reality is that economists have been computer forecasting for a lot longer than climate science, and they have a large body of experience in how to evaluate and eliminate faulty models. Climate science has no such body of theory or experience.

My old complaint – models do not assume a sensitivity.
=================
The model assumptions are in the parameters. So for example, if one model uses a high value for SO2, and then trains the model using historical data, this will result in a high CO2 sensitivity inferred in the model.

Similarly, if one model uses a low value for SO2, and then trains the model using historical data, this will result in a low CO2 sensitivity in the model.

This training can either be computer driven – the computer adjusts its weights for each parameter to optimize the fit. Or the humans manually adjust the weights for the parameters to optimize the fit. In both cases it is still training.

Thus, the notion that models do not assume a sensitivity is only technically correct: the sensitivity is determined by the parameters that the humans set for the model.

The fallacy of climate models is shown by the models themselves. Take a single climate model. Small changes in the inputs, much smaller than the measurement error, result in large changes in the outputs. Two virtually identical runs of the same model result in much different forecasts.

What the models are telling us is that either of these events is possible. Temps could go up, or they might not. No matter what we do about CO2. Again Climate Science doesn’t like this answer, so they average the two runs for this single model together, and call this the forecast.

But this is mathematically incorrect and misleading to the point of scientific fraud. The model is showing that there is HIGH VARIABILITY. And this variability is not due to the forcings, because it happens with very little change in the forcings. Thus, the variability is inherent in the climate system, but it is hidden by the process of averaging.

This HIGH VARIABILITY is hidden from the scientific community by the process of averaging, and as a result the IPCC has been misled into reporting that there is low natural variability. But the climate models themselves are telling us that variability is high.
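The averaging point can be illustrated with a deliberately trivial stand-in: the chaotic logistic map. This is not a climate model, but it shows the same behavior described above — two runs whose inputs differ by far less than any measurement error diverge completely, and their average resembles neither run:

```python
# Two runs of a simple chaotic recurrence (the logistic map) whose
# initial conditions differ by one part in ten billion.

def run(x0, r=3.9, steps=100):
    """Iterate the chaotic logistic map x -> r*x*(1-x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = run(0.6)
b = run(0.6 + 1e-10)  # perturbation far below any plausible measurement error

divergence = max(abs(x - y) for x, y in zip(a[-10:], b[-10:]))
print(f"Run A ends at {a[-1]:.4f}, run B ends at {b[-1]:.4f}")
print(f"Largest gap over the final 10 steps: {divergence:.3f}")
```

After enough steps the two trajectories are completely decorrelated, so averaging them produces a smooth curve that neither run ever followed; that is the sense in which averaging ensemble members can hide the variability the individual runs exhibit.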