Yet another fix needed for climate models – this time due to aerosols

There are so many updates and fixes needed to the climate models these days it's almost like watching a car undergoing a perpetual repair process. When does the time come when the owners realize that maybe they should take advantage of the “lemon laws” and get a new model? From Johannes Gutenberg University:

International research group shows that the aging of organic aerosols is caused by OH radicals

Climate models need to be updated

Atmospheric aerosol particles have a significant effect on climate. An international team of researchers has now discovered that a chemical process in the atmosphere called aging determines to a major extent the concentration and the characteristics of aerosol particles. To date, this aspect has not been accounted for in regional and global climate models. In the Muchachas [Multiple Chamber Aerosol Chemical Aging Experiments] project, the team has not only managed to demonstrate the effects of aging but has also been able to measure these. Their findings have been published in the specialist journal Proceedings of the National Academy of Sciences of the USA (PNAS).

The quality of air is determined to a considerable extent by aerosol particles. In the form of a fine dust, they are believed to be responsible for a series of respiratory diseases and cardiovascular disorders. In addition, aerosol particles also have various effects on the atmospheric radiation balance. Aerosols make a direct contribution to radiation levels in the cloud-free atmosphere by dispersing, reflecting, and absorbing sunlight. Aerosols are also essential for cloud formation in the troposphere: They act as condensation nuclei, which enable droplets to form even when little water vapor is present.

The size and concentration of aerosol particles is also of great importance for the number of cloud drops, which in turn influences the reflection characteristics of clouds. Hence, aerosol particles tend to have a cooling influence on the atmosphere. However, the precise processes and feedback mechanisms have not yet been fully understood, so the interaction between aerosol particles, their suitability as cloud condensation nuclei, and the sunlight reflected off the earth’s surface represents one of the greatest uncertainties in the calculation of climatic activity.
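The droplet-number/reflectivity link described above is often approximated by the Twomey susceptibility relation, dA/d(ln N) ≈ A(1−A)/3 for fixed cloud liquid water. The sketch below is a standard textbook approximation added for illustration; it is not taken from the press release:

```python
# Twomey susceptibility: for fixed liquid water content, cloud albedo A
# responds to a fractional change in droplet number N roughly as
#   dA / d(ln N) ~= A * (1 - A) / 3
# (standard textbook approximation, not from the press release above)

import math

def albedo_change(albedo, n_old, n_new):
    """Approximate change in cloud albedo for a droplet-number change."""
    susceptibility = albedo * (1.0 - albedo) / 3.0
    return susceptibility * math.log(n_new / n_old)

# A moderately reflective cloud (A = 0.5) is the most susceptible:
# doubling the droplet number brightens it by roughly 0.06.
print(albedo_change(0.5, 100, 200))   # ~0.058
# Very dark or very bright clouds respond less:
print(albedo_change(0.9, 100, 200))   # ~0.021
```

This is why, in the quoted passage, more (and smaller) aerosol particles mean more cloud drops and hence brighter, more reflective clouds and a cooling tendency.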

The Muchachas project looked at organic aerosols, which constitute the largest proportion of chemical airborne particles. Organic aerosols are generated above forests, for example, and they are visible in the form of a blue mist in certain places such as the Great Smoky Mountains, the Blue Ridge Mountains, and the Blue Mountains. In densely populated areas, however, anthropogenically generated and released hydrocarbons play an important role as precursors of the development of secondary organic aerosols.

The experiments showed that the mass and composition of organic aerosols are significantly influenced by OH radicals. OH radicals are the most important oxidants in the atmosphere and make an important contribution to keeping air clean. Researchers from Pittsburgh (USA), Juelich, Karlsruhe, and Mainz (Germany), Gothenburg (Sweden), Copenhagen (Denmark), and Villigen (Switzerland) analyzed results from four different large-volume atmospheric simulation chambers and found that the oxidation process called chemical aging has a significant influence on the characteristics and concentration of organic aerosols over their entire life cycle.

“New climate models will have to take these findings into account,” says Professor Dr. Thorsten Hoffmann of the Institute of Inorganic Chemistry and Analytical Chemistry at Johannes Gutenberg University Mainz (JGU) in Germany. The Mainz researchers contributed primarily to the development of analytical techniques for studying the chemical composition of the aerosol particles in the Muchachas project. Thanks to their development of so-called ‘soft ionization’ techniques and the corresponding mass spectrometers, Hoffmann’s work group was able to track the concentration of individual molecule species in the atmospheric simulation chamber and thus observe the chemical aging of the atmospheric aerosols at the molecular level. It was clearly demonstrated that oxidation occurred in the gaseous phase and not in the particle phase. “Now the goal is to integrate these underlying reactions in models of regional and global atmospheric chemistry and so reduce the discrepancy between the expected and the actually observed concentrations of organic aerosol particles,” explains Hoffmann.

Need to be fair here. The models, despite 30 years of effort, are not able to predict very well, so they SHOULD be updated frequently as tiny bits of understanding are gained. The real question is: is the effort worth it, or are the gains too small, especially when it takes a couple of decades to validate the prediction with observational evidence?

Since climate models ignore the effect of clouds, it is no surprise that they have also neglected hydroxyl radicals, the “detergents of the atmosphere”. Just because the concentration of CO2 is measured in dry air does not mean that water vapor (up to 40,000 ppm above tropical forests) in the atmosphere can be ignored. It usually swamps the feeble effects of carbon dioxide (390 ppm).

“Hence, aerosol particles tend to have a cooling influence on the atmosphere.”
–
This is great news. Now we have a way to prevent global warming by adjusting the amount of aerosol particles in the atmosphere. So there’s no need to worry about CO2 and no need to send civilization back into the dark ages by cutting the use of fossil fuels. In other words, we can counteract the exaggerated warming effects of CO2 with the exaggerated cooling effects of aerosol particles. Problem solved!

But I suggest we wait until warming has actually been observed and has proven itself “harmful” before we start releasing additional aerosol particles into the atmosphere. We wouldn’t want to start a new ice age or inflict other unintended consequences on society until absolutely necessary.

“Hence, aerosol particles tend to have a cooling influence on the atmosphere.”

Only when they reach the stratosphere, and there is very little evidence that they do, with the exception of powerful volcanic eruptions.

Stratospheric aerosol optical thickness (SAOT) levels have declined over recent years, which shows the non-warming period has very little to do with this.

Anthropogenic SO2 emissions had been declining until 2005 (the end of the data record), while global temperatures during the same period were rising for most of it. Notice that no trend is detected between SAOT levels and anthropogenic SO2 levels, because those emissions don’t reach the stratosphere. The declining anthropogenic SO2 levels were blamed for the previous cooling period, while a recovery was supposed to correlate here.

The models adjusted for this correlation and tried to blame aerosols, but the main problem is that, when adjusting for both the warming periods and the recent non-warming period, it just doesn’t work at all. The adjustment factor is much larger than any noticeable change for the period afterwards.

When global temperatures are adjusted using SAOT levels, they have no influence on the non-warming period. Hence, the period is still not warming.

They point out that when an object is large compared to the wavelength of the radiation it is emitting, then surface effects dominate. But when an object is small compared to the wavelength, then radiation can be emitted from any point within its volume. In that case the geometry of the particle must play a role.

To prove the point, they measured the heat radiated by a silicon nanofibre with a diameter of 500 nm, which is much smaller than the wavelength of thermal radiation.

They show that this heat emission cannot be described by Planck’s law, even when a correction factor is applied.

Instead, Wuttke and Rauschenbeutel accurately model the output using another theory called fluctuational electrodynamics, which takes into account the geometry of the experiment.
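The scale mismatch behind this result is easy to check with Wien’s displacement law: at laboratory temperatures the peak of thermal emission sits near 10 µm, roughly twenty times the fibre’s 500 nm diameter. A quick illustrative check (Wien’s law only; it does not reproduce the fluctuational-electrodynamics calculation itself):

```python
# Wien's displacement law: lambda_peak = b / T, with b ~= 2.898e-3 m*K.
# At room temperature the thermal-emission peak lies near 10 micrometres,
# roughly 20x the 500 nm diameter of the silicon nanofibre. That is why
# the fibre is "small compared to the wavelength" and Planck's law, a
# large-body/surface result, no longer applies directly.

WIEN_B = 2.898e-3  # Wien displacement constant, m*K

def peak_wavelength_um(temperature_k):
    """Peak thermal-emission wavelength in micrometres at temperature T."""
    return WIEN_B / temperature_k * 1e6

fibre_diameter_um = 0.5
for t in (300.0, 500.0):
    lam = peak_wavelength_um(t)
    print(f"T = {t:.0f} K: peak ~ {lam:.1f} um, "
          f"{lam / fibre_diameter_um:.0f}x the fibre diameter")
```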

These wonderful models fail:
The heat is missing from the oceans.
The heat is missing from the upper troposphere.
The clouds do not behave as predicted.
The models cannot predict the short term, the regional, or the long term.
The models cannot deduce the past climate from current data.
How could we believe that models can predict the future?

Weren’t aerosols the ‘fudge factor’ in CAGW models? Isn’t the lack of data about aerosols (esp. SO2) the entire ‘secret sauce’ for making CAGW models back-fit temperature data?

I answer, YES!
None of the models – not one of them – could match the change in mean global temperature over the past century if it did not utilise a unique value of assumed cooling from aerosols. So, inputting actual values of the cooling effect (such as the determination by Penner et al., http://www.pnas.org/content/early/2011/07/25/1018526108.full.pdf?with-ds=yes) would make every climate model provide a mismatch between the global warming it hindcasts and the observed global warming for the twentieth century.

This mismatch would occur because all the global climate models and energy balance models are known to provide indications which are based on:
1. the assumed degree of forcings resulting from human activity that produce warming, and
2. the assumed degree of anthropogenic aerosol cooling input to each model as a ‘fiddle factor’ to obtain agreement between past average global temperature and the model’s indications of average global temperature.

More than a decade ago I published a peer-reviewed paper that showed the UK’s Hadley Centre general circulation model (GCM) could not model climate and only obtained agreement between past average global temperature and the model’s indications of average global temperature by forcing the agreement with an input of assumed anthropogenic aerosol cooling.

The input of assumed anthropogenic aerosol cooling is needed because the model ‘ran hot’; i.e. it showed an amount and a rate of global warming which was greater than was observed over the twentieth century. This failure of the model was compensated by the input of assumed anthropogenic aerosol cooling.

And my paper demonstrated that the assumption of aerosol effects being responsible for the model’s failure was incorrect.
(ref. Courtney RS, ‘An assessment of validation experiments conducted on computer models of global climate using the general circulation model of the UK’s Hadley Centre’, Energy & Environment, Volume 10, Number 5, pp. 491–502, September 1999).

Kiehl found the same as my paper except that each model he assessed used a different aerosol ‘fix’ from every other model. This is because they all ‘run hot’ but they each ‘run hot’ to a different degree.

He says in his paper:

One curious aspect of this result is that it is also well known [Houghton et al., 2001] that the same models that agree in simulating the anomaly in surface air temperature differ significantly in their predicted climate sensitivity. The cited range in climate sensitivity from a wide collection of models is usually 1.5 to 4.5 deg C for a doubling of CO2, where most global climate models used for climate change studies vary by at least a factor of two in equilibrium sensitivity.

The question is: if climate models differ by a factor of 2 to 3 in their climate sensitivity, how can they all simulate the global temperature record with a reasonable degree of accuracy. Kerr [2007] and S. E. Schwartz et al. (Quantifying climate change–too rosy a picture?, available at http://www.nature.com/reports/climatechange, 2007) recently pointed out the importance of understanding the answer to this question. Indeed, Kerr [2007] referred to the present work and the current paper provides the ‘‘widely circulated analysis’’ referred to by Kerr [2007]. This report investigates the most probable explanation for such an agreement. It uses published results from a wide variety of model simulations to understand this apparent paradox between model climate responses for the 20th century, but diverse climate model sensitivity.

And, importantly, Kiehl’s paper says:

These results explain to a large degree why models with such diverse climate sensitivities can all simulate the global anomaly in surface temperature. The magnitude of applied anthropogenic total forcing compensates for the model sensitivity.

And the “magnitude of applied anthropogenic total forcing” is fixed in each model by the input value of aerosol forcing.

It shows that
(a) each model uses a different value for “Total anthropogenic forcing” that is in the range 0.80 W/m^2 to 2.02 W/m^2
but
(b) each model is forced to agree with the rate of past warming by using a different value for “Aerosol forcing” that is in the range -1.42 W/m^2 to -0.60 W/m^2.

In other words, the models use values of “Total anthropogenic forcing” that differ by a factor of more than 2.5, and they are ‘adjusted’ by using values of assumed “Aerosol forcing” that differ by a factor of 2.4.
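The compensation Kiehl describes can be shown with back-of-envelope arithmetic. In a simple linear picture the simulated warming is dT ≈ (S/F_2x) × F_net, where S is the equilibrium sensitivity for doubled CO2, F_2x ≈ 3.7 W/m^2 is the forcing for that doubling, and F_net is the net anthropogenic forcing (greenhouse minus aerosol). The sketch below uses hypothetical model pairings, not values from Kiehl’s table:

```python
# Back-of-envelope illustration of Kiehl's compensation: a high-sensitivity
# model paired with strong aerosol cooling, and a low-sensitivity model
# paired with weak aerosol cooling, can produce nearly the same simulated
# 20th-century warming. The (sensitivity, aerosol) pairings are hypothetical.

F_2X = 3.7  # W/m^2, approximate forcing for doubled CO2

def simulated_warming(sensitivity_c, ghg_forcing, aerosol_forcing):
    """Linearized warming estimate: dT ~= (S / F_2x) * (F_ghg + F_aerosol)."""
    return sensitivity_c / F_2X * (ghg_forcing + aerosol_forcing)

# Hypothetical model A: high sensitivity (4.0 C), strong aerosol cooling
print(simulated_warming(4.0, 2.4, -1.7))  # ~0.76 C
# Hypothetical model B: low sensitivity (1.8 C), weak aerosol cooling
print(simulated_warming(1.8, 2.4, -0.8))  # ~0.78 C
```

Both hypothetical models land near the roughly 0.7 °C of warming observed over the twentieth century, despite sensitivities that differ by more than a factor of two: the chosen aerosol forcing does the compensating.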

So, each climate model emulates a different climate system. Hence, at most only one of them emulates the climate system of the real Earth because there is only one Earth. And the fact that they each ‘run hot’ unless fiddled by use of a different and completely arbitrary ‘aerosol cooling’ strongly suggests that none of them emulates the climate system of the real Earth.

“Models, whether of climate predictions or whatever seem to generate far more heat than light. Are there any examples of them ever being worth the effort?”

So the real cause of ‘climate change’ is the extra heat being produced by all these models? Sounds like research grant material to me. What we need is another model to model the climatic effects of climate models!

Climate models: why are they needed? They obviously become redundant every time real-world data is observed, and need to be updated with a new anthropogenic variable X just to save face, or so it seems; the underlying physics taking place in the real world is very poorly represented by these augmented-reality models. What annoys me most about them is that, before the time has been taken to verify their accuracy or lack thereof, they usually make headlines in the media as if they were hard factual evidence, until proven wrong.

Bryan says:
October 5, 2012 at 8:09 am
“Hence, aerosol particles tend to have a cooling influence on the atmosphere.”
A very convenient excuse, required because global average temperature has failed to rise for the last 14 years.

The CO2 fraction has been increasing but temperature is completely uncorrelated.

The ‘hockey stick blade’ is missing.
=============================================================
Maybe we should have a hockey game with all the players using “hockey sticks” based on the real temperature trend?
(Well, maybe not. Whenever they swung their sticks they’d all get self-inflicted knee injuries from the Little Ice Age.)

http://www.ipcc-data.org/ddc_scen_selection.html
The original fudge factor is still there. Criterion 1. Last updated 28 November 2011.
I guess it’s too hard to write a new Criterion 2: “If Criterion 1 doesn’t work, break out another tin of builders bog”.

A new type of radical for the CAGW crowd – at least these don’t attempt to win debates by insult and hate speech instead of rigorous scientific analysis.

Did I simply come in late and actually miss the robust debate about the basic science of the whole shebang, or has there never actually been one, other than some anecdotal discussions around glaciers, polar bears, and ex-vice presidents with a carbon footprint and bank account I’d love to ascribe to?

“Organic aerosols are generated above forests, for example, and they are visible in the form of a blue mist in certain places such as the Great Smoky Mountains, the Blue Ridge Mountains, and the Blue Mountains.”

This overlooks the role played by non-visible smaller particles (<0.1 microns) and their interaction with UV and EUV when available. It does not have to be low in the troposphere.

Two institutions historically active in particle ageing are the Univ of Stuttgart and the Univ of Johannesburg. An impressive project to light large grass fires in South Africa was led by Prof. Annegarn (UJ) in about 2001, which involved simultaneous observation with various instruments by satellite, U-2, and low-altitude aircraft. Grass fires are a significant source of aerosols and BC/OC. Recall the Brown Cloud of India, etc.

There are so many updates and fixes needed to the climate models these days its almost like watching a car undergoing a perpetual repair process.
————
Yep! This is how all models are developed from day 1 of their existence.

By day 2 the model has captured 80% of the behavior of the physical system they represent. By day 3 they capture 90%, by day 4 they capture 95%, by day 5 they capture 97.5%, on day 6 they replace the core code to improve its computational efficiency and get another small increase in accuracy by tweaking the quality control knobs, etc. etc..

So the whole process resembles modding a car; adding a better carby, replacing the exhaust system with something more efficient, lowering the ride height, installing better shocks, putting nitro in the fuel.

The fact that the scientists go to all this fuss and bother tells you scientists care very much about the accuracy of their models. This also tells you scientists are not faking the results, a claim made by many liars who have never even seen the source code for these models.

Anthony,
I like the car analogy. I built a kit car many years ago and quickly discovered that you never actually finish building one; there are always repairs, tweaks, and modifications to carry out. I think I spent as much time doing this work as I did driving it. I also spent lots of time at car shows and club meetings discussing how to do these modifications and repairs.

I now have a mental picture of these guys spending hours doing lots of modifications and repairs to the code and data, up to their elbows in bits of paper, on the internet forums and blogs, then changing the changes, and finally, every couple of months, running the model and being shocked by the results. Then panicking the MSM with the horror of what they’ve found.

The difference between someone maintaining an old/kit car and a climate modeller is that the CAGW climate modeller thinks that after each iteration he’s got it right whereas the car mechanic knows that something else will go wrong with the car sooner or later.

LT
This also tells you scientists are not faking the results, a claim made by many liars who have never even seen the source code for these models.
——–

They, like teenagers, have yet to learn that they don’t know everything and that they get things wrong more often than right (or never right, so far). What they seem to care about is fame and money; they are part of the teenage “I’m worth it” generation.

“New climate models will have to take this into account.” Indeed, yet they are the models that prove global warming. Billions wasted and economies ruined because of models that are a total crock. How can the money-grubbing useful idiots keep up the pretense and still lie straight in bed?

I have no problem with scientists attempting to understand how the Creator put this Creation together. The problems seem to occur when men seek to control Creation.

I recently was taken to task on a different thread for speaking with wonder of the sheer magnitude of creation. My awe was seen as a sort of defeatism. The scolding made me think.

I suppose it is human nature to try to control Creation. The simple fact I weed my garden and squash bugs and worms alters the ecosystem, and the fact I add manure makes me a chemist, altering soil chemistry. The more I learn, the better the gardener I may be, but this doesn’t put me in control. Instead I am responding. I am not a dictator, ordering the clouds about. Rather I am a dancer, listening to a music I myself don’t play.

There was one WUWT thread which discussed the organic molecules plankton puts into the air, and how they can reach the upper atmosphere and (because they include elements ending in “ine,” such as bromine, fluorine, chlorine, and iodine) alter the amount of ozone, which in turn affects the types and powers of sunlight.

You can try to include such factors in a model, along with many other factors. Who knows, if you told bleeping politicians to buzz off, and worked with thousands of other dedicated scientists, you might even create a decent model. Then, once you had a sort of understanding of how things work, you could even attempt to improve the weather by fertilizing or weeding out the sea’s plankton. However you are still involved in actions and reactions, and still are responding to constant changes, (and the Laws of Unintended Consequences,) and are still basically a dancer.

Even among environmentalists there is a sort of split-personality: On one hand they are horrified by any meddling with how nature does things, and on the other hand they want to control and engineer the weather of the entire planet.

The truth is that we are in essence dancing to the Creator’s tune. At times it is a lovely waltz, and at other times it is bullets about our feet.

If we learned something new, and scientists _didn’t_ update their models you would have a much stronger criticism.

With regard to the actual content of the press release quoted, it isn’t clear if the process they report on (aerosol particles, particularly organic chemicals, getting smaller over time) makes them better or worse at forming clouds and their other atmosphere cooling functions.

In the real world, living things perpetuate themselves by ‘survival of the fit enough at the time’ and all living things alter their local environment to enhance their own survival. That’s right. Every living thing alters its local environment to enhance its own survival. The biosphere ‘terraforms’ planet Earth every hour of every day. The biggest impacts are made by single celled organisms: plant, fungal, and bacterial. Man is way down on the list; but we can see and think about what we do. We often do not see or think about what the rest of nature is doing. Plus, things done by the arts of Man are no less natural than things done by any other life form on this rock.

Atmospheric aerosol particles have a significant effect on climate. An international team of researchers has now discovered that a chemical process in the atmosphere called aging determines to a major extent the concentration and the characteristics of aerosol particles.

This is nowhere near new. I can remember the group of acid rain activists I fixed equipment for talking about this effect in the late fifties and early sixties, with pretty well exactly the same conclusions. They did it with crude boxes costing a few pounds UK, not supercomputers, but the answers are so near identical as to be just copies of the original work.

The climate models ignore far too much to be taken as anything more than a rough guide, and they are in no way good enough to justify any action based on them. For a start, there should never be any significant natural source of CO2 gas discovered that has not been included in the models, or the claims for them have to be considered false. Where also is the proof that CO2 does not stabilise at precisely the same level regardless of man’s activities, given that it is a production-and-use system which is, and always has been, self-balancing in the longer term? Until there is absolute proof that this is not the case, CO2 science should be considered merely a guess.

Yep! this is how all models are developed from day 1 of their existence.

By day 2 the model has captured 80% of the behavior of the physical system they represent. By day 3 they capture 90%, by day 4 they capture 95%, by day 5 they capture 97.5%, on day 6 they replace the core code to improve its computational efficiency and get another small increase in accuracy by tweaking the quality control knobs, etc. etc..

The problem with this process is that we are currently sitting at about 8:15 AM on Day 1, and these jokers are pretending that they have Day 7 quality models. These models cannot make useful predictions over useful time periods at useful resolutions, let alone explain anything. And they won’t be able to, until long after we are all dead. And there is nothing wrong with that. The only problem is when people pretend otherwise, for political purposes.