UPDATE PM JANUARY 16 2010 – Jim Hansen has released a statement on his current conclusions regarding the global average surface temperature trends [and thanks to Leonard Ornstein and Brian Toon for alerting us to this information]. The statement is “If It’s That Warm, How Come It’s So Damned Cold?” by James Hansen, Reto Ruedy, Makiko Sato, and Ken Lo.

My comments below remain unchanged. Readers will note that Jim Hansen does not cite or comment on any of the substantive unresolved uncertainties and the systematic warm bias that we report on in our papers. He and his co-authors report only on their own research papers. This is a clear example of ignoring peer-reviewed studies that conflict with one’s conclusions.

***ORIGINAL POST***

Thanks to Anthony Watts for alerting us to a news release by NASA GISS (see) which reads

“NASA has not been involved in any manipulation of climate data used in the annual GISS global temperature analysis. The analysis utilizes three independent data sources provided by other agencies. Quality control checks are regularly performed on that data. The analysis methodology as well as updates to the analysis are publicly available on our website. The agency is confident of the quality of this data and stands by previous scientifically based conclusions regarding global temperatures.” (GISS temperature analysis website: http://data.giss.nasa.gov/gistemp/) [note: I could not find the specific URL from NASA, so I welcome being sent the original source].

This statement perpetuates the erroneous claim that the data sources are independent [I welcome information from GISS to justify their statement, and will post if they do]. This issue exists even without considering any other concerns regarding their analyses.

I have posted a number of times on my weblog with respect to the lack of independence of the surface temperature data; e.g. see

The GISS news release is symptomatic of the continued attempt to ignore science issues in their data analysis which conflict with their statement in the press release. This is not how the scientific process should be conducted.

We urge caution, based on the exposure of this type of behavior in the CRU e-mails; e.g. see

I’m getting emails from people who have read blog postings accusing me of “hiding the increase”….I switched from 13 months to a running 25 month average….25-month smoother minimizes the warm 1998 temperature spike, which is the main reason why I switched to the longer averaging time. If anything, this ‘hides the decline’ since 1998….accusations I have hidden the increase. Go figure.
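The arithmetic of that switch from a 13-month to a 25-month window is easy to see with a toy series. The sketch below uses made-up data (a flat series with one warm spike standing in for a single hot year), not any actual temperature record:

```python
def running_mean(series, window):
    """Centered running mean over `window` points (edges trimmed)."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# Toy monthly series: flat at zero except one warm spike, standing in
# for a single hot year such as 1998 (illustrative data, not GISS data).
anoms = [0.0] * 120
anoms[60] = 1.0

peak13 = max(running_mean(anoms, 13))   # spike diluted over 13 months
peak25 = max(running_mean(anoms, 25))   # spike diluted over 25 months
print(round(peak13, 3), round(peak25, 3))  # 0.077 0.04
```

A single-month spike contributes 1/window to each average that contains it, so the longer window necessarily flattens it — which is exactly the effect being argued over.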

NASA GISS doesn’t Mannipulate data, they say, but they refuse to cooperate in releasing the un-Mannipulated data. Hansen considers himself above FOIA requests.
Hansen’s stubbornness will backfire.

Quality control checks are regularly performed on that data. […] The agency is confident of the quality of this data
I seem to remember that Gavin Schmidt excused their various data problems by saying that they only have a 0.25 employee to do the quality checking. Not even a single full-time equivalent. So hard to be ‘confident’ in my book.

Jim Hansen and the rest of the hockey team do not need to address unresolved conflicts with other papers as long as they hold the aura of “the authority.” Acting as the singular authority, they can dismiss all other views as unsupported, unproven, or just tainted by big oil.

Now that the hockey team has been shown to be just one more viewpoint, tainted by its own funding sources and ethical problems, the appeal-to-authority argument will no longer work.

Mr. Hansen’s paper shows cute little graphs run on the same heavily modified and cherry-picked GISS datasets, as well as the similarly derived HadCRUT.
Rural datasets, on the other hand, do not show the ghastly warming. Anomaly maps might be good for analysis, but they are not the reality. They are used by Hansen and others to hide the fact that the modified temperature set is not the observed temperature set.
It is true that there has been some warming, but NOT the high temps. What they don’t want you to know is that the warming was due to increased nighttime temps. I say was, because that is coming to a screeching halt.
Also, Mr. Hansen says
“For example, if it is an unusually cold winter in New York, it is probably unusually cold in Philadelphia too. This fact suggests that it may be better to assign a temperature anomaly based on the nearest stations for a gridbox that contains no observing stations, rather than excluding that gridbox from the global analysis.”

Really? I’m supposed to believe that in order to get a better picture, I should run the Hubble Space Telescope with a mask that lets in a cone of light the size of a coffee-can lid.
No wonder the picture they paint is so darned hazy. At the rate they are collecting data, it will take 500 years to get the next 100 years’ worth of data points.
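Mechanically, the nearest-station idea in Hansen’s quoted passage can be sketched as follows. This is an illustration of the concept only: the stations, coordinates, and anomaly values are made up, and the real GISTEMP analysis distance-weights all stations within a 1200 km radius rather than taking the single nearest one.

```python
import math

def nearest_station_anomaly(gridbox_center, stations):
    """Assign an empty gridbox the anomaly of its nearest station.

    `stations` is a list of ((lat, lon), anomaly) pairs; distance is a
    simple great-circle formula. A sketch of the quoted idea, not the
    actual GISTEMP algorithm (which weights all stations within 1200 km).
    """
    def gc_dist(a, b):
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        return math.acos(
            min(1.0, math.sin(lat1) * math.sin(lat2)
                + math.cos(lat1) * math.cos(lat2) * math.cos(lon1 - lon2)))
    return min(stations, key=lambda s: gc_dist(gridbox_center, s[0]))[1]

# A cold month in New York: the empty gridbox nearest to it inherits
# that anomaly instead of being dropped from the global analysis.
stations = [((40.7, -74.0), -2.1),   # New York (made-up anomaly)
            ((39.9, -75.2), -1.8)]   # Philadelphia (made-up anomaly)
print(nearest_station_anomaly((40.5, -74.3), stations))  # -2.1
```

Whether filling empty gridboxes this way adds information or just spreads assumptions is, of course, precisely the dispute in this thread.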

The Hansen et al. paper is interesting, particularly the tone of the comment toward the end of the paper where they say: “This information needs to be combined with the conclusion that global warming of 1‐2°C has enormous implications for humanity.” Isn’t 1‐2°C on the low side of current estimates for the impact of CO2?

I’ve not yet come across any UAH or RSS source code that processes this raw data. However, there is some general-purpose source code for working with this data here: http://nsidc.org/data/amsre/tools.html

The raw data has the following noteworthy properties:

*) It’s huge. I’d estimate a single day of temperatures runs about 2.5 gigabytes of data.
*) It mixes binary and text data in a single file. The text data is in hierarchical form, but it’s not XML, JSON, or any other standard format. It uses custom tags to define the hierarchy.
*) It includes all levels of the atmosphere. Usually you’ll just want the troposphere, so you’ll have to extract that information manually.
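For what it’s worth, a reader for such a file would look something like the sketch below. The tag names and record layout are invented for illustration (the actual format is not documented in this thread); only the general pattern of a text header followed by a binary payload is taken from the description above.

```python
import io
import struct

def read_tagged_record(f):
    """Read one record: an ASCII header line 'TAG=<name> COUNT=<n>'
    followed by n big-endian float32 values.

    Hypothetical layout for illustration only; the real mixed
    text/binary format with custom hierarchical tags described above
    is not documented here, so names and encoding are invented.
    """
    header = f.readline().decode("ascii").strip()
    fields = dict(item.split("=") for item in header.split())
    n = int(fields["COUNT"])
    values = struct.unpack(f">{n}f", f.read(4 * n))
    return fields["TAG"], list(values)

# A fake two-part record: text header, then three float32 temperatures (K).
raw = b"TAG=TROPOSPHERE COUNT=3\n" + struct.pack(">3f", 250.0, 251.5, 249.0)
tag, vals = read_tagged_record(io.BytesIO(raw))
print(tag, vals)  # TROPOSPHERE [250.0, 251.5, 249.0]
```

The point is simply that a custom tagged format like the one described is parseable with modest effort, even without a standard-format library.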

Data sold separately. Another flagrant abuse of publicly funded research.
We pay billions in taxes for them to do the gathering, they walk all over it, and we’re supposed to pay them exorbitant sums to have it emailed to us.
Would you pay big bucks to see a Hubble shot that was doctored in Photoshop to look like somebody tossed paint on the wall?
Of course you wouldn’t.

I’d be willing to bet that the $$$ they are charging for information already paid for with our tax money is getting diverted into other agendas or pockets.

What exactly are the “quality control checks” that NASA claims “are regularly performed on that data”?

When they get the temperature data to conform to their models, then they are assured of the quality of their data. And the incoming raw data has been so lousy for so long, requiring so much work to adjust it until it is of acceptable quality, it’s no wonder Hansen doesn’t want to release it!

“Isn’t the 1‐2°C on the low side for current estimates for the impact of CO2?”

There is Hansen’s projection and the IPCC projection.

Hansen’s original 1988 Scenario A has been discredited by observation.

In 2000, Hansen changed his tune and started talking about soot from coal being the main problem: http://www.giss.nasa.gov/research/news/20000829/
“However, new research suggests fossil fuel burning may not be as important in the mechanics of climate change as previously thought.”

There is also a semantic game that gets played over how much global warming since when: 2°C compared to 1850, or 2°C compared to now.

I eagerly await Anthony’s paper on the trends that can be discerned from his collaborative effort at SurfaceStations.org. I realize that I am projecting my own biases somewhat, but I am sick of the “Team” getting uncritical reviews.

I am very annoyed by NCDC’s efforts to muddy the waters with their preemptive paper using a subset of the SurfaceStations data (and without properly citing Anthony, either). It stinks of interference in the peer-review process and looks remarkably similar to what Mann and Jones were doing, as demonstrated in the Climategate emails.

The bottom line is this: every day we now see more evidence of alteration, deliberate or otherwise, of climate data, or of lies being told, to such an extent that organizations such as the Met Office, IPCC, UN, etc. are no longer to be trusted, and in some cases may be liable for fraud. Their representatives will eventually be held accountable and will suffer the consequences.

At a minimum, the following should be made public:
1. The raw data for every station used, for the entire period of analysis
2. The algorithm (recipe) for:
i. Averaging across the earth’s surface (interpolation across earth’s surface)
ii. Averaging across time (interpolation over time)
3. Any further algorithms to *legitimately* adjust for:
i. Changes in weather station (e.g. location, instruments, etc.)
ii. Changes to surrounds (e.g. urban build up)
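Item 2.i, averaging across the earth’s surface, at its simplest is an area-weighted mean over gridboxes, with each box on an equal-angle grid weighted by the cosine of its latitude. The sketch below is a minimal illustration of that idea, not any agency’s actual algorithm:

```python
import math

def global_mean_anomaly(gridboxes):
    """Area-weighted mean of gridbox anomalies on an equal-angle grid.

    `gridboxes` is a list of (center_latitude_deg, anomaly) pairs; on an
    equal-angle grid each box's area is proportional to cos(latitude).
    A minimal sketch of item 2.i above, not any agency's actual code.
    """
    weights = [math.cos(math.radians(lat)) for lat, _ in gridboxes]
    total = sum(w * anom for w, (_, anom) in zip(weights, gridboxes))
    return total / sum(weights)

# A box at 60N covers half the area of one at the equator, so the
# global mean is pulled toward the equatorial value.
boxes = [(0.0, 1.0), (60.0, 0.0)]
print(round(global_mean_anomaly(boxes), 3))  # 0.667
```

The published analyses add many layers on top of this (station combination, interpolation, homogenization), which is exactly why the full recipes are being demanded.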

Interesting graph. Now, I understand that Hansen seems to me to be leading the pack of alarmist doomsayers, but on that graph Hansen’s Scenario C looks pretty good. What am I failing to understand here? Did he make a good projection? Or add fudges after making it?

Not according to their site: they want to sell you data and products you have already paid for. Yes, I know where to get it free, but that’s not the point.
If this were private enterprise, and they were selling the crap that their flubbed forecasts were based on, they’d be filing Chapter 11.

I did a search on “NASA” and “ISO 9001,” and one of the resulting links took me to a page where I could choose a subcategory, “NASA Headquarters QMS Homepages”: http://www.hq.nasa.gov/office/codej/codeji/codeji.html
where it says:
“The mission of the Headquarters ISO 9001 Program Office,
Code JI, is to provide leadership and overall management for
the Headquarters-wide Quality Management System (QMS)
that is certified to ISO 9001.”

I clicked on the “internal” link and got “Server not found”.
Hmmmmmmmm!

As a geologist and a scientist, what I don’t understand is how changes in a trace gas (carbon dioxide), which has a concentration of less than one part in 2,500 in our atmosphere, can have more than a minuscule effect on our climate.

I accept the fact that carbon dioxide, like water vapour, is a ‘greenhouse gas’ when measured in percentage terms – but 0.038% – that’s like putting a magazine in the attic of your home and expecting the whole building to heat up.

We never see anything from Hansen et alia showing how to quantify the relationship between carbon dioxide levels and global temperature for the simple reason the science does not exist – what we do see instead is models which cannot even be manipulated enough to reflect what happened in the past. When these same guys start talking about methane and global warming, it just becomes totally stupid – 1.7 parts per million – how can that affect anything?

How many people remember being taught about the phlogiston theory at school? AGW is the same type of theory – complete garbage.

An interesting question to ask the average supporter of global warming theory is: “What is the percentage of carbon dioxide in the atmosphere?” Usually, less than one in twenty say it’s under 1%. When they are told the actual number, a healthy scepticism usually begins to take root.
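The arithmetic behind those figures is easy to verify (using roughly 380 ppm for the time of writing):

```python
# CO2 concentration conversions quoted in the comments above.
ppm = 380
percent = ppm / 10_000         # ppm -> percent (1% = 10,000 ppm)
one_part_in = 1_000_000 / ppm  # "one part in N"

print(percent)              # 0.038
print(round(one_part_in))   # 2632, i.e. less than one part in 2,500
```

So both phrasings in this thread (0.038%, and less than one part in 2,500) describe the same concentration.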

Since the forcings on which Scenario A was based didn’t come to pass, this is not surprising. Scenario A was a worst-case scenario in which greenhouse gas forcings increased rapidly and there were no major volcanic eruptions. Scenario B was the middle-of-the-road case…and, in fact, the forcings used for Scenario B turn out to be quite close (although apparently still a little bit high) to what came to pass.

As a geologist and a scientist, what I don’t understand is how changes in a trace gas (carbon dioxide), which has a concentration of less than one part in 2,500 in our atmosphere, can have more than a minuscule effect on our climate.

I accept the fact that carbon dioxide, like water vapour, is a ‘greenhouse gas’ when measured in percentage terms – but 0.038% – that’s like putting a magazine in the attic of your home and expecting the whole building to heat up.

I am not sure how you get your intuition. What is your intuition for what would happen to you if you were placed in a room with 0.038% of plutonium? As a scientist, you ought to understand that intuition is only a guide that works when it is informed by experience or hard scientific understanding.

There are two good reasons why CO2 has a disproportionate effect on the climate:

(1) 99% of the constituents of the atmosphere are transparent to IR radiation, so the remaining ~1% has a disproportionate effect.

(2) The effect of concentration on radiative forcing is logarithmic over a fairly large range of concentrations, which again means that there can be a disproportionate effect.

Besides which, a change of ~6 C in the global temperature is a small change on an absolute temperature scale (about 2%) and yet this is enough to mean the difference between glacial and interglacial conditions.
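Point (2) is usually written with the simplified expression of Myhre et al. (1998), ΔF ≈ 5.35 ln(C/C0) W/m². A quick check shows what “logarithmic” means in practice: each doubling contributes the same forcing, so early increments matter most.

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Simplified CO2 radiative forcing (Myhre et al. 1998):
    delta-F = 5.35 * ln(C/C0) W/m^2, valid over a broad concentration range."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# A doubling from preindustrial 280 ppm gives ~3.7 W/m^2; the rise to
# ~380 ppm by itself gives less than half of that.
print(round(co2_forcing(560, 280), 2))  # 3.71
print(round(co2_forcing(380, 280), 2))  # 1.63
```

This is only the forcing expression, not a temperature prediction; translating W/m² into degrees is where climate sensitivity, and most of the argument, comes in.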

We never see anything from Hansen et alia showing how to quantify the relationship between carbon dioxide levels and global temperature for the simple reason the science does not exist – what we do see instead is models which cannot even be manipulated enough to reflect what happened in the past.

“As a geologist and a scientist, what I don’t understand is how changes in a trace gas (carbon dioxide), which has a concentration of less than one part in 2,500 in our atmosphere, can have more than a minuscule effect on our climate.”

Really? Most of the atmosphere (O2 and N2) is transparent to infrared radiation, so trace gases can have a large impact. We know that without the greenhouse effect caused by trace gases (yes, that includes water vapor) the earth would be about 33 degrees cooler. The reduction in outgoing infrared radiation at the wavelengths absorbed by CO2 has been measured by satellite.

In 1970, NASA launched the IRIS instrument, which measured infrared spectra between 400 cm⁻¹ and 1600 cm⁻¹. In 1996, the Japanese Space Agency launched the IMG instrument, which recorded similar observations. Both sets of data were compared to discern any changes in outgoing radiation over the 26-year period (Harries 2001). The resultant change in outgoing radiation was as follows: [see link for chart]

What they found was a drop in outgoing radiation at the wavelength bands where greenhouse gases such as carbon dioxide (CO2) and methane (CH4) absorb energy. The change in outgoing radiation is consistent with theoretical expectations. Thus the paper found “direct experimental evidence for a significant increase in the Earth’s greenhouse effect”.

A machinist would say that the CO2 concentration in the atmosphere is .38 of a thousandth.
Try opening your dial caliper to 1/3 of a thousandth. Takes some practice and due diligence to deal with the backlash.
Hold it up to the light. See the light?
What are the odds, in any given moment, of a photon hitting a CO2 molecule?
380 in a million.
This is nuts.

The AMSR-E/Aqua Daily L3 12.5 km Brightness Temperatures, Sea Ice Concentration, & Snow Depth Polar Grids (AE_SI12) product using the V11 algorithm contains an error in the Snow Depth on Sea Ice parameter in data prior to 05 October 2009. Users of this product should use caution when working with the Northern Hemisphere Snow Depth on Sea Ice parameter in data prior to 05 October 2009 until the data have been reprocessed with the V12 algorithm. We apologize for the inconvenience.

OMG! Grab the data quick before they fix it. Oh wait, they already fixed it 11 times, that’s why it has an error.

As a geologist and a scientist, what I don’t understand is how changes in a trace gas (carbon dioxide), which has a concentration of less than one part in 2,500 in our atmosphere, can have more than a minuscule effect on our climate.

I accept the fact that carbon dioxide, like water vapour, is a ‘greenhouse gas’ when measured in percentage terms – but 0.038% – that’s like putting a magazine in the attic of your home and expecting the whole building to heat up.

At certain important infrared wavelengths, even a fraction of the current concentration is almost opaque. It’s unfortunate that we have so few colored gases; chlorine (green) and iodine (brown, though you have to heat iodine to evaporate enough of it) are the best examples people might have experience with. But try to imagine CO2 absorbing red light so that it appears as a blue-green gas. All the absorbed light is reradiated (gases at Earth’s temperatures are luminous at IR wavelengths) in all directions; the net effect is to slow Earth’s cooling.

Perhaps another way to look at it is to imagine taking some of the CO2 in the atmosphere and turning it into a sheet of mylar in the middle of the troposphere. That would have a huge impact on climate. (And would make the world’s biggest greenhouse to boot!)

What you can say, and this is one of the key reasons I gave up on AGW, is that there’s so much CO2 in the atmosphere that adding more has much less impact than it did before it became nearly opaque.

A machinist would say that the CO2 concentration in the atmosphere is .38 of a thousandth.
Try opening your dial caliper to 1/3 of a thousandth.

A thousandth of what? Meter? Millimeter? Inch? Firkin? (Yes, I know that’s volume, but I like the name.) The thickness of the atmosphere?

Mylar is sold in thicknesses like 0.0005″, about 13 micrometers. (Really thin mylar is available at 0.000059″, about 1.5 micrometers.) Cover a doorway with a piece. Measure the environmental impact – it will stop air flow, maybe people flow. And still, it adds just a few grams to the weight of the building. Trace materials matter.

Hold it up to the light. See the light?
What are the odds, in any given moment, of a photon hitting a CO2 molecule?
380 in a million.

Is there math behind that? Don’t forget that an IR photon is much larger than a CO2 molecule – that’s why we can’t image atoms with visible light. One mole of air is 22.4 liters at STP, IIRC. That’s Avogadro’s number of molecules, about 6 × 10^23. That works out to more than 2 × 10^20 CO2 molecules in a space smaller than a cubic foot. Beyond that, I don’t know what the odds are, but I think I’ve read that at the right wavelengths an IR photon gets captured within a few meters. That seems awfully short to me and I suspect it’s wrong, but I do think it’s unlikely such a photon can make it out of the atmosphere.
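There is math behind the number density, at least. The sketch below computes CO2 molecules per cubic centimetre from the figures quoted (22.4 L per mole, ~380 ppm) and then a rough absorption length. The cross-section used is an assumed round number – band-centre values across the 15 µm band vary by orders of magnitude – so treat the second result as an order-of-magnitude illustration only.

```python
AVOGADRO = 6.022e23      # molecules per mole
MOLAR_VOLUME_L = 22.4    # litres per mole of ideal gas at STP
CO2_FRACTION = 380e-6    # ~380 ppm

# CO2 number density at STP, per cubic centimetre (22.4 L = 22,400 cm^3).
air_per_cm3 = AVOGADRO / (MOLAR_VOLUME_L * 1000)
co2_per_cm3 = air_per_cm3 * CO2_FRACTION
print(f"{co2_per_cm3:.2e}")  # 1.02e+16

# Rough absorption length using an ASSUMED cross-section of 1e-18 cm^2.
sigma_cm2 = 1e-18
absorption_length_m = 1 / (co2_per_cm3 * sigma_cm2) / 100
print(round(absorption_length_m, 1))  # 1.0
```

With that assumed cross-section, a band-centre IR photon travels only about a metre before capture, so the “few meters” recollection above is at least plausible near the band centre, while photons at weakly absorbed wavelengths travel vastly farther.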

An incoming photon is absorbed by a CO2 molecule a couple of miles up. The CO2 molecule almost instantly re-emits a photon in a random direction, sloughing off the extra energy as it returns to its ground state. The re-emitted photon either goes down and hits the ground, adding energy and warming the planet, or it is re-emitted upward and lost to space, taking its extra energy with it.

Thus, 100% of radiation from the Sun to the Earth doesn’t warm the planet. Some of it is returned to space because of carbon dioxide. And that amount is more than 50%.

If CO2 at ground level absorbs a photon, chances are close to 50/50 that any re-emitted photon will hit the ground. But as we go up in altitude, the photons hitting the ground will decline from 50%.

Think of it this way: if a CO2 molecule half way between the Sun and the Earth absorbs and re-emits a photon, there is only a very small chance of that re-emitted photon hitting the Earth. Most will fly off into space, or back toward the Sun. The higher the altitude, the fewer re-emitted photons will warm the Earth. The rest escape to space.

So in our little thought experiment, atmospheric CO2 would tend to cool the planet by keeping some of the Sun’s incoming radiation from warming the Earth; CO2 acts as a partial reflector of solar energy.

Smokey (17:49:06): “Thus, 100% of radiation from the Sun to the Earth doesn’t warm the planet. Some of it is returned to space because of carbon dioxide. And that amount is more than 50%.”
Presumably some [most, I would reckon] of the photons zip right through without encountering a CO2 molecule. You say ‘add some CO2’. OK, I’ll add exactly ONE molecule. That single one still cuts the amount reaching the ground by more than 50%?

“An incoming photon is absorbed by a CO2 molecule a couple of miles up.”
But your thought experiment does not represent reality. CO2 is transparent to incoming visible light; it’s the outgoing IR that is impeded by CO2. Here’s the spectrum. Is CO2 really located halfway between the sun and the earth?

Old Jim Hansen just trots out his same old assumption. He believes that he can estimate temperatures in adjacent grids that have no sensors and do so without falsifying the data.

Response 1: Of course, he can. If you do not measure temperature in the adjacent grid, your estimate cannot be wrong.

Response 2: Now, that is some funky science. Sure, Jim, estimate the adjacent grid. But the science begins when you install sensors in the next grid and use them to verify or falsify your estimate.

Old Jim Hansen should understand that we will be satisfied with nothing less than sensors in each grid and a comparable number of sensors throughout the atmosphere. There’s Old Jim’s next grant proposal and he should get to work on it if he wants to be taken seriously.

Back in mid-January 2009, GISS/NASA’s benighted James Hansen issued a pronunciamento to the effect that ’09 would register as “the hottest year on record.” Of course, he said the same for 2006, 2007, and 2008, all of which tumbled out of bed on the downside.

Taking note of this extravagance, Dr. Pielke, Sr. bestowed on Hansen his coveted Bonehead Award for 2009. In line with Dr. Pielke’s new-minted tradition, perhaps this year’s Bonehead Trophy should go either to Ban Ki-moon (a truly impenetrable skull) or to Rajendra Pachauri, his kindred as an oh-so-diverse statist oligarch. Lord knows, there are candidates enough to choose from.

What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one? Is it your imaginary “windows”? Guess what, they are exactly that, IMAGINARY! There are no windows in the atmosphere, there is only the vacuum beyond it. The spectral “windows” are of no more importance than the lines on the map of the world, which separate countries from each other and are also – wait for it – IMAGINARY!

I have a suggestion for you true believers. The next time the temperature in your house starts heading south, simply open the CGA valve on a very large cylinder of compressed CO2 and bask in the warmth. (Seal your windows first; we wouldn’t want you to pollute the atmosphere the rest of us are trying to enjoy, would we?)

Actually, I may not have been clear enough when I said 100% of solar radiation doesn’t warm the planet. I was referring only to those photons interacting with CO2 molecules, and which are then re-emitted into space. So a small part of the total [100%] incoming radiation from the Sun will be lost to space due to CO2.

Maybe 99.99% of photons are making it through the atmosphere because they avoid CO2 absorption, and 50% of those left [or .005% – half of the .01% that are absorbed] are ‘reflected’ back to space by CO2, leaving only the remaining .005% to warm the planet. You’re making me sweat here.

My point is that without any CO2, all of the incoming photons would pass through the atmosphere. But with CO2 there might be a cooling effect, because some of the photons captured by CO2 are re-emitted back to space, and cannot warm the planet. So adding CO2 may bleed off incoming solar energy, and have a slight cooling effect.

No more thought experiments! From now on I’m sticking to charts & graphs.

Smokey (19:01:31): “because some of the photons captured by CO2 are re-emitted back to space, and cannot warm the planet.”
The standard argument is that, of the LW photons trying to escape, some are returned to the surface by your very argument… hence warming.

You’re right, I had forgotten about outgoing radiation. Once I got started, my post took on a life of its own.

But my example of a CO2 molecule halfway between the Sun and the Earth is entirely reasonable. As Leif points out, what if it’s only one CO2 molecule? Can’t there be one CO2 molecule passing halfway between the Earth and the Sun? After all, it was just a thought experiment to show that a re-emitted photon wouldn’t be likely to hit the Earth from that distance/altitude.

And, they do NOT estimate temperatures from temperatures in adjacent grids. They estimate temperature ANOMALIES in this way. The two are very different beasts: Temperatures are not strongly correlated over distance. (In fact, if you consider a mountainous region like the top of Mt. Washington and the valley only a few miles away, you can see how a few miles can make a huge difference in surface temperature!) However, temperature ANOMALIES turn out to be correlated over quite large regions. I.e., if we have a cold month here in Rochester, it is also likely to be colder than average in, say, Buffalo and Syracuse and the Adirondack Mountains…and, to a slightly lesser degree, even more distant places like New York City, Boston, Pittsburgh, Philadelphia, and Cleveland.
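The distinction can be made concrete with toy numbers (synthetic, purely illustrative): estimating a valley station from a nearby summit station fails badly in absolute temperature but works well in anomalies.

```python
# Two nearby stations share weather but sit at very different baselines:
# a valley at ~15 C and a summit at ~-5 C (made-up numbers). The summit
# is assumed to respond slightly less strongly (x0.9) to shared weather.
shared_weather = [1.2, -0.8, 0.3, -1.5, 2.0, 0.1, -0.4, 1.1]
valley = [15.0 + w for w in shared_weather]
summit = [-5.0 + 0.9 * w for w in shared_weather]

def anomalies(series):
    """Departures of each value from the series' own mean."""
    mean = sum(series) / len(series)
    return [t - mean for t in series]

valley_anom, summit_anom = anomalies(valley), anomalies(summit)

# Mean absolute error when estimating the valley from the summit:
abs_err = sum(abs(v - s) for v, s in zip(valley, summit)) / len(valley)
anom_err = sum(abs(v - s) for v, s in zip(valley_anom, summit_anom)) / len(valley)
print(round(abs_err))      # 20  -- absolute temperatures are useless here
print(round(anom_err, 2))  # 0.09 -- anomalies transfer well
```

Because each station is referenced to its own climatology, the 20 °C baseline difference cancels and only the shared month-to-month departures remain, which is the property the gridbox interpolation relies on.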

[quote]Joel Shore (16:22:06) :
I am not sure how you get your intuition. What is your intuition for what would happen to you if you were placed in a room with 0.038% of plutonium?[/quote]

Are you insane? You’re comparing X-rays to microwave. Get a life. Get an EM spectrum chart. You obviously don’t have a clue. (Sorry, mod, I can’t take much more of this ignorance.)

***I*** don’t have a clue? I won’t even start trying to dissect what is wrong with your statement. I will merely point out that I was giving an example of why one cannot simply use intuition to say, “Boy, 0.038% is such a small quantity, it can’t possibly have an effect.” I was not yet explaining why, in the particular physically-relevant case of CO2 and the absorption of infrared radiation, the 0.038% can have a significant effect.

[quote]deech56 (16:49:32) :
Really? Most of the atmosphere (O2 and N2) is transparent to infrared radiation,[/quote]

Demonstrably untrue to anyone with any knowledge of AAS/AES, unless rephrased to identify the specific wavelength ranges being discussed. And the ranges had better be really d**m tight.

I have no clue what you are trying to say here…but the fact is that O2 and N2 in the atmosphere have an essentially negligible amount of absorption of infrared radiation emitted at terrestrial temperatures.

What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one?

What you can say, and this is one of the key reasons I gave up on AGW, is that there’s so much CO2 in the atmosphere that adding more has much less impact than it did before it became nearly opaque.

…Which just demonstrates that you have not understood the basic physics of the greenhouse effect. See here http://www.aip.org/history/climate/simple.htm#L_0623 for a further explanation in a historical context, but the basic point is not whether or not a single absorption occurs. Rather, what you have is multiple absorptions and emissions, and what ends up being relevant is the temperature of the layer from which most of the radiation that can escape into space (without further absorption) is emitted. Increasing CO2 increases the height of this effective emitting layer and, since tropospheric temperatures decrease with height, that means the layer from which that emission occurs is colder. However, since the intensity of emission goes as temperature to the fourth power, this means that less energy is being emitted than is being absorbed from the sun, and the earth responds by warming until radiative balance is restored.
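The “colder emitting layer” argument can be put into numbers with the Stefan-Boltzmann law and a standard lapse rate. This is a back-of-envelope sketch – the layer heights and the 200 m shift are illustrative choices, not outputs of any radiative-transfer calculation.

```python
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W m^-2 K^-4
LAPSE_RATE = 6.5e-3    # mean tropospheric lapse rate, K per metre
T_SURFACE = 288.0      # K, roughly the global mean surface temperature

def emitted_intensity(height_m):
    """Blackbody emission from an effective emitting layer at the given
    height, assuming a constant lapse rate. A back-of-envelope sketch
    of the argument above, not a radiative-transfer calculation."""
    t = T_SURFACE - LAPSE_RATE * height_m
    return SIGMA * t ** 4

# Raising the effective emitting layer (what added CO2 does) lowers its
# temperature, so less energy escapes until the surface warms enough
# to restore balance. The 200 m shift here is purely illustrative.
deficit = emitted_intensity(5000) - emitted_intensity(5200)
print(round(deficit, 1))  # 4.9 W/m^2
```

Even a modest 200 m lift of the emitting layer produces a few W/m² of imbalance in this toy picture, which is the size scale of the forcings being debated in this thread.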

Some comments on “If It’s That Warm, How Come It’s So Damned Cold?” by James Hansen, Reto Ruedy, Makiko Sato, and Ken Lo (link above in the lead comments).

This seems to have been written during the current NH cold spell. Since conditions are so obviously not what they’ve been predicting, you would think that a comment with a title like this would be their best shot. Well, I find nothing convincing here at all. In fact, they hardly seem to be trying.

They say: “There is a contradiction between the observed continued warming trend and popular perceptions about climate trends. Frequent statements include: ‘There has been global cooling over the past decade.’”

The main reason for this (they say) is that HadCRUT has 1998 as the hottest year, while GISS has 2005. Investigating the difference, they show that if they adopt the same gridding as HadCRUT, they get the same result (their Figure 4: warmest 1998, slight cooling since then). They don’t seem to notice that what they are demonstrating is that the whole of the rise they claim comes not from the actual data but from the interpolated “data.” Remove the interpolation, remove the warming. In passing, they also demonstrate that the interpolation is in no way necessary to get a result. They can do it without; they just get the wrong answer.

In attempting to justify their methods, they say that with interpolation
“the patterns of warm and cool regions have realistic‐looking meteorological patterns”, not bothering to say what it looks like without (are the actual observations perhaps “unrealistic”?).

They acknowledge that this is qualitative support, and continue: “One way to estimate … uncertainty, or possible error, can be obtained via use of the complete time series of global surface temperature data generated by a global climate model that has been demonstrated to have REALISTIC spatial and temporal variability of surface temperature. [my caps]” No doubt this REALISTIC means that the model output agrees with the assumptions that underlie their interpolation algorithm. And yes, they find the “errors” are very small.

What you can say, and this is one of the key reasons I gave up on AGW, is that there’s so much CO2 in the atmosphere that adding more has much less impact than it did before it became nearly opaque.

…Which just demonstrates that you have not understood the basic physics of the greenhouse effect. See here http://www.aip.org/history/climate/simple.htm#L_0623 for a further explanation in a historical context, but the basic point is not whether or not a single absorption occurs. Rather, what you have is multiple absorptions and emissions, and what ends up being relevant is the temperature of the layer from which most of the radiation that can escape into space (without further absorption) is emitted.

One problem I have with the historical contexts people like to fall back on is that I like to think science has progressed since the days of Herschel (solar activity and climate), Tyndall, and Arrhenius. In particular, the early CO2 studies neglected convection, and while the CO2 blanket slows down some of the IR transport, convection provides another means of transporting heat upward. The more CO2 slows down IR radiation, the more important other processes become, and that simply wasn’t examined. As far as I know, it may still not be well examined or understood.

While appealing to century-old science may help bolster arguments that the science is settled, it also implies that temperature should be tracking the Keeling CO2 curve, and it’s not.

Getting back to my original comment – are you suggesting that adding 100 ppm to current CO2 levels will have the same impact that the first 100 ppm did?

A machinist would say that the CO2 concentration in the atmosphere is .38 of a thousandth.
Try opening your dial caliper to 1/3 of a thousandth. Takes some practice and due diligence to deal with the backlash.

That’s what micrometers are for. Hope it’s a good one with carbide faces. Set the four-tenths on the Vernier marks, you can eyeball it just under for .38, flick the spindle lock. If really fussy you can check the gap on an optical comparator.

And it better be a regular type. Electronic digitals are cute with the implied precision, but just try getting the correct feel for them to repeat that very last digit every time. And the mechanical digitals are only good until you drop them one time too hard and/or one time too many. ;)

Of course, machinists know that four-tenths can be critical, even one tenth can be important, especially on tight tolerances like small holes. So they likely wouldn’t say that. One good fart spread throughout the shop, now that would be a good description.

It’s very interesting the way certain factions of climate science leave unfinished threads and unresolved issues. One such “hole” in the science is Peterson’s postulate (his 2003 paper on UHI), where he postulates (his word) that temperature stations should be located in “cool parks.” That seemed a fair “postulate” that was begging for on-site verification. With Surfacestations at over 1000 sites visited, perhaps we can say something definitive about this ‘cool park’ postulate.

10.) It never was that warm, you have been spotted with Heide DeKline.
9.) It hasn’t been that warm in 10 years; Heidi dumped you for Frosty.
8.) The cold has to gain momentum before it drags the warm down with it kicking & screaming. (no snickering !)
7.) That’s the last time we let you bring burritos on the Polar Expedition.
6.) Hey, you made this stuff up, you tell me.
5.) This box of Frosted Climate Charms is packed by weight, not volume, some settling of science may occur.
4.) You’ve got a fever. Remember that Twilight Zone with Mrs. Bronson?
3.) Drinking in the sauna again, I see.
2.) Ask the guy with the pitchfork. He says he’s come to collect on a debt.
1.) It’s official: Hell is frozen over.

I have been following your postings for a couple of months and I am obviously the reason for the huge increase in the number of hits that you have been having. Well done you!

I am not a meteorologist or a scientist. I am not a denier or an alarmist. But I do like to reduce our collective use of resources – be it oil or coal or gas or gold.

I was particularly interested in two recent posts “Coal Creek Redux” and “Darwin Zero”. I have been looking at some temperature data from Macau (or Macao if you prefer). Don’t ask why but I was.

For those that don’t know, Macau is a very small port city on the southern coast of China (http://en.wikipedia.org/wiki/Macau but use with the usual caution about wiki). It used to be a Portuguese colony until recently, and it has weather records that go back quite a long way. I have no special knowledge about the weather recording station there – it may be that it was a mercury thermometer hanging out of somebody’s window for all I know – but I doubt it. Macau is a very small place, and I suspect that the weather station was in the old Portuguese fort (probably for all of its life) and that it was read religiously from when it was set up.

Anyway, cutting to the chase, the Macau meteorological service has a website which lists a hundred years of temperature data, 1901-2000 (see http://www.smg.gov.mo/www/smg/centenary2/index_four.htm ). They have more data as well, going forward and back, but it is not as easily accessible. I thought that I’d take a look at it. It is not in an easy database format, but with a bit of effort I was able to export the mean temperature data, a year at a time, on a daily basis. The advantage, it appeared to me, was that daily means, minima and maxima were less likely to have been ‘adjusted’. I thought: well, does this show any warming or cooling over the century?

Most of the analyses that I have seen elsewhere present an annual mean temperature and see what the trend is over the period. I did this for Macau. See below.

What’s up with that? Well in this case we have 55 annual points covering the 100 year period (because I have not completed grabbing the data). Each of these points represents the mean value from 365 (or 366 in a leap year) daily mean temperature values for the year in question. The trend line, using a linear regression, provides an estimate for the mean temperature in 1900 of 22.28°C and in 2000 of 22.62°C. Thus, we have an estimated warming of 0.34°C in the century from 1900 to 2000.
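The annual-mean trend calculation described above can be sketched in a few lines. The series below is synthetic stand-in data (the real Macau values are not reproduced here), so the printed numbers only illustrate the method, not the actual record:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the 55 available annual means: a weak trend
# of ~0.0034 degC/yr around the observed ~22.4 degC level, plus noise.
years = np.sort(rng.choice(np.arange(1901, 2001), size=55, replace=False))
temps = 22.28 + 0.0034 * (years - 1900) + rng.normal(0.0, 0.3, size=55)

# Ordinary least squares: temp = slope * year + intercept
slope, intercept = np.polyfit(years, temps, 1)

t1900 = slope * 1900 + intercept
t2000 = slope * 2000 + intercept
print(f"estimated 1900 mean: {t1900:.2f} degC")
print(f"estimated 2000 mean: {t2000:.2f} degC")
print(f"estimated century warming: {t2000 - t1900:.2f} degC")
```

Note how wide the uncertainty is: with only 55 noisy points, the fitted century warming can easily wander a few tenths of a degree either side of the true slope.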

As you know, each time that you get a mean or average value of a variable and state it by itself, you lose some of the information that lies behind the mean. Thus a yearly mean temperature loses a lot of information about the variability within that year. I thought that getting the mean monthly temperatures might be a second estimate – and this is what the plot looks like for the years that I have grabbed so far. I’m sure somebody with more time could grab the years that I have not got so far (1906 to 1910, etc.)

What’s this mean? If taken at face value, it looks as if the mean of the ‘mean monthly temperatures’ averaged month at a time was about 22.4°C. It also appears to mean that the trend of increase in the mean temperature is 0.0003°C per month. (We need to be a bit careful because I’ve used Microsoft Excel and Excel does some strange things when you use dates and converts them to numbers – starting at 1 January 1900). This gives an estimated potential increase of 0.39°C for the century from 1900 to 2000.

Hold on a second though, this is giving equal weight to every month in the year. Are the very hot months and the very cold months skewing the data? Another way of looking at the same data is to check what happens in each quinquennium. See below:

In this case, the slopes of the trend lines are probably not relevant since they are only taken over a short time period of five years. We can see, however, that there is no large shift in the curves from the period 1901-1905 to the period 1996-2000. We do get estimates of 22.49°C for the first quinquennium and 22.71°C for the last quinquennium of the century. This represents an estimated warming trend of 0.22°C (over 95 years) or 0.23°C over 100 years.

Another way of looking at the same data is to first get the mean values for each month – we know that there is a nearly periodic cycle of about 12 months which affects the temperature. If we take the average for each month over the 100 years then (so far) we get the following average annual cycle:

We can use these monthly means to get the deviations in each year from the monthly means and therefore de-trend the monthly data for the effect of the annual cycle.

This de-trended data, using 660 monthly data points (out of an available 1200 data points for 100 years) provides a very slight estimated warming trend of 0.03°C per century. You can see that the spread of data is roughly in the range ±5°C and there do not appear to be ‘outliers’ of large magnitude in the data set. However, it might be as well to look at the distribution of the de-trended data. See below:
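The de-trending step described above can be sketched as follows. The series here is synthetic stand-in data (the actual Macau figures are not reproduced), so only the method carries over:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for 660 monthly means: a 12-month seasonal cycle around
# 22.4 degC plus weather noise.
n_months = 660
t = np.arange(n_months)
series = 22.4 + 5.0 * np.sin(2.0 * np.pi * t / 12.0) + rng.normal(0.0, 1.0, n_months)

# Climatology: the long-term mean for each calendar month...
climatology = np.array([series[t % 12 == m].mean() for m in range(12)])

# ...and anomalies: each month's deviation from its calendar-month mean.
anomalies = series - climatology[t % 12]

# The seasonal cycle is removed; what remains is weather noise plus
# whatever trend is in the data.
print(f"raw series std:     {series.std():.2f} degC")
print(f"anomaly series std: {anomalies.std():.2f} degC")
```

The spread of the raw series is dominated by the annual cycle; once each month is referenced to its own long-term mean, the spread collapses to the weather-scale variability.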

The distribution of the de-trended data is shown on the histogram. Also plotted is a Normal distribution for comparison. The data look pretty normal, and the extreme values – the hottest and coldest months in the century – do not look outstandingly hot or cold; they are the sort of values that might be expected.
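A quick numerical companion to the histogram check is to compare the sample skewness and excess kurtosis with the Gaussian values of zero – again on stand-in data, since the real anomaly series is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for the 660 de-trended monthly anomalies (roughly +/-5 degC spread)
anoms = rng.normal(0.0, 2.0, size=660)

# Standardize, then take third and fourth moments:
z = (anoms - anoms.mean()) / anoms.std()
skewness = (z**3).mean()                # ~0 for Gaussian data
excess_kurtosis = (z**4).mean() - 3.0   # ~0 for Gaussian data

print(f"skewness: {skewness:+.3f}")
print(f"excess kurtosis: {excess_kurtosis:+.3f}")
```

Values near zero for both moments support the eyeball impression that nothing in the tails is wildly out of line.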

Of course, we can do a further check on this – and I suggest that somebody with more time could do so. This is to grab the maximum and minimum temperature data sets on a daily basis and check that the maximum monthly temperatures and the minimum monthly temperatures always dance above and below the mean monthly temperatures respectively. Another advantage of checking the maxima and minima would be that one could see whether the maxima are getting larger (hot getting hotter) or the minima are getting smaller (cold getting colder) or vice versa. This would tell us more about why there is a warming or cooling trend – if indeed there is such a trend.

(Just as an aside, there seems to be a tendency for modern environmental scientists to discard ‘outliers’ just because they are outliers. Clearly, there is a case for doing so if, for instance, somebody has misread the original handwritten record and has read ‘54’ instead of ‘34’, but I think one should only discard outliers if one can see why – and then tell the world which outliers have been discarded and why.)

Oh, I forgot to mention, Macau used to be a very small place indeed as far as population is concerned. In recent years its population has grown quite considerably. According to the World Bank, Macau had a population of 172,608 in 1960 and 526,178 in 2008 (http://www.google.com/publicdata?ds=wb-wdi&met=sp_pop_totl&idim=country:MAC&q=macau+population ). I do not know what the population was in 1900, but I would guess at less than 50,000 and maybe far fewer. Should there be a sign of a UHI (urban heat island) here? I suspect not, but that may be for others to look at.

What’s the point of all this? Well I think the first lesson that it teaches is that for each temperature reading station we need to have some history. Where was the temperature read and how? Has the temperature reading or system been changed over time? These are some of the points that were being made in the post about Darwin. Is the station data from a consistent source and is it reliable? (By the way, Willis might want to look at temperature data from nearby Indonesian islands such as Bali to compare with Darwin – probably more relevant than Alice Springs.) The second lesson is that each station needs to be analysed in detail before it is added to the global database. The third lesson is that using a linear regression trend to project forward may or may not be appropriate. Are there any other trends within the data which can be decomposed and thus eliminated from the data? Is Macau relevant in any way as a station to search for global warming? Search me!

I have not completed the analysis of the Macau data – just started in fact – but you can get different answers depending upon the analysis – particularly if you have a small number of data points.

As you can see, I’ve just done what I’m suggesting that nobody should do – presenting data that is incomplete and only partially analysed!

Cheers

Bob the Builder

PS You can see that I’m an amateur and can’t submit the charts that go with this post. Let me know how and I’ll send them separately.

What is this, a coordinated attack? Please explain in simple terms, in the same manner in which you whack jobs (sorry, that may not include you personally) always do, what exactly it is that makes one frequency of the EM spectrum more important than any other one? Is it your imaginary “windows”?

I would guess that all the “whack jobs” have their physics textbooks open to the same chapter. Just think of it as independent confirmation of results. ;-)

Cannot we examine the pCO2-Temp-Time space as indicated from the EPICA Dome C ice cores to get a more empirical indication of the greenhouse effect of CO2 at concentrations above 280 ppmv? Doesn’t the pCO2-Temp-Time space indicate that as earth’s mean surface T’s rise after each glacial epoch, pCO2 then rises 400 to 1200 years AFTER the T rise – through ocean CO2 degassing as the mean surface T’s rise? And conversely, as global mean surface T’s fall at the onset of the next glacial epoch, pCO2 falls 800 to 2000 years AFTER the T fall – through ocean solubilisation as surface T’s cool. Remember that this pCO2-Temp-Time space operates at about 280 ppmv CO2. So how can CO2 be acting as a significant greenhouse gas at these concentrations, and higher, noting that its concentration in the atmosphere FOLLOWS global mean surface temperature excursions? This supports the conclusion that adding more CO2, above about 20ppmv, does not cause an enhanced greenhouse effect as asserted – and it is indeed just that, an assertion – by the “AGW modellers”. In other words, the pCO2-Temp-Time space defined from a study of the ice cores is consistent with empirical observations on the logarithmic drop-off in CO2 infrared absorbance as its concentration increases.

One problem I have with the historical contexts people like to fall back on is that I like to think science has progressed since the days of Herschel (solar activity and climate), Tyndall, and Arrhenius. In particular, the early CO2 studies neglected convection, and while the CO2 blanket slows down some of the IR transport, convection provides another means of transporting heat upward. The more CO2 slows down IR radiation, the more important other processes become, and that simply wasn’t examined. As far as I know, it may still not be well examined or understood.

Ric, you raise an interesting point, but we should look on these earlier studies as the building blocks of the later studies. They are incomplete in the same way that Watson and Crick (1953) was incomplete. (You might also note that their earliest papers had an error in the number of hydrogen bonds between C and G.) On a related note, Robert Grumbine wrote an interesting post about successive approximations. Reading the climate literature, especially the idea of climate sensitivity, is a study in the progress scientists have made in their understanding of climate.

Joel Shore (16:22:06) : Since the forcings on which Scenario A was based didn’t come to pass, […] and, in fact, the forcings used for Scenario B turn out to be quite close (although apparently still a little bit high) to what came to pass.

Sorry, I’m having a bit of trouble finding “forcing” in my physics book, what S.I. units is a “forcing” measured in? Can’t use it in a calculation if you don’t know what its units mean, after all. So is it degrees / day or delta degrees / day or ergs per fortnight or what?

Joel Shore (19:35:43) : And, they do NOT estimate temperatures from temperatures in adjacent grids. They estimate temperature ANOMALIES in this way.

Ah, yes, the old “Use the Anomaly Luke, the Anomaly will save us!” approach.

Well, your first problem is that you have to have a temperature in that adjacent “grid” (actually, grid box) before you can calculate the anomaly. One heck of a lot of these are simply “made up” from “nearby” station data up to 1000 km away. Real data in the baseline, made up in the comparison. Voilà! Anomaly! (Works for Bolivia, all of it, as well as the Canadian Arctic – almost all of it, except for “the garden spot of the arctic” where a thermometer was left in Eureka…) and many many more.

But GIStemp calculates its UHI before the anomaly map step, and does it using interpolated temperatures from up to 1000 km away. And it does infill from up to 1000 km away using temperatures. And it homogenizes using temperatures. And… But yes, it does average an entire 10 of them together first, so while the code calls this an ‘offset’ you can feel free to call it an anomaly if you like. I certainly think the GIStemp UHI calculation is an anomaly. After all, it gets the sign wrong about 1/4 to 1/3 of the time…

The two are very different beasts: Temperatures are not strongly correlated over distance. (In fact, if you consider a mountainous region like the top of Mt. Washington and the valley only a few miles away, you can see how a few miles can make a huge difference in surface temperature!)

Yes you can. Well, actually, no you can’t. Since mountain tops have been with near religious zeal removed from GHCN by NOAA / NCDC you can hardly find the poor dears in the basic data set that underpins all of NCDC adjusted, GIStemp, and HadCRUT data series. The Andes are gone. The Canadian Rockies are gone. Poor Japan, flat as Kansas. No thermometer over 300 m elevation. It would be nice to be able to see the difference between cold places on mountains and everywhere warmer down lower, but we can’t. Well, you can, but only if you look back in the baseline for those anomalies where the cold locations are left in and then compare them to the present where they are taken out.

Oh, I’m sorry, that WAS your point! And here I missed it. Those wonderful Anomaly Maps that are all rosy red are there to show us what it looks like when you compare a cold mountain baseline with a warm valley / beach present, and I didn’t realize it. Silly me. They are just doing it for educational purposes.

However, temperature ANOMALIES turn out to be correlated over quite large regions. I.e., if we have a cold month here in Rochester, it is also likely to be colder than average in, say, Buffalo and Syracuse and the Adirondack Mountains…

And if we have a hot patch of tarmac at Diego Garcia in the tropical sun in the Jet exhaust, we are likely to have a cold patch of ocean water 1200 km away out to sea, but no worries, we’ll just take that Island Tarmac temp and fill in that 2400 km diameter circle of ocean with it, since it’s just an anomaly after all… (Yes, that is EXACTLY what GIStemp does.)

BTW, think on this: Lodi vs. San Francisco. They have a nice positive correlation throughout most of the year. Then comes summer. When it gets hot in Lodi (like 100+ F or 40 C) the valley air heats up and rises. Pulls the fog blanket in over SFO and cools them down nicely (like 50 F). So you compute your ‘average offset’ from SFO (where NCDC in GHCN has a thermometer) and then use that to fill in Lodi, which will even work for part of the year, sort of. Until summer. Then Lodi will be given a way wrong number. But don’t worry, when Lodi is stuck under valley tule fog in winter, it’s way off the other way. So these two errors might accidentally offset. Maybe.
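The Lodi / San Francisco failure mode can be illustrated with made-up numbers – these monthly temperatures are invented for the sake of the sketch, not observed values:

```python
import numpy as np

# Invented illustration of the Lodi / San Francisco example above --
# these monthly temperatures are made up, not observed values.
sfo = np.array([10, 11, 12, 13, 14, 13, 12, 12, 14, 14, 12, 10], dtype=float)

# Lodi tracks SFO with a +3 degC offset most of the year, but the
# valley-heat / coastal-fog pattern flips the relationship in July-August.
lodi = sfo + 3.0
lodi[6:8] = [38.0, 37.0]   # hot valley summer while SFO sits in the fog

# Infill Lodi from SFO using a single 'average offset', in the style of a
# reference-station approach:
offset = np.mean(lodi - sfo)
lodi_infilled = sfo + offset

errors = lodi_infilled - lodi
print("per-month infill error (degC):", np.round(errors, 2))
```

The single offset splits the difference: every fog-free month comes out a few degrees too warm, and the two summer months come out wildly too cold – errors that can partly cancel in the annual mean while being wrong in every single month.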

My favorite, though, is Pisa Italy where the UHI correction is 1.4 C in the wrong direction because they reference to a site in the German approach to the Alps as a “nearby” station…

Oh, and in the Pacific Basin 100% of reporting stations are scheduled to be airports real soon now (they almost are already). Good luck finding a ‘rural reference’ for detrending that UHI. (Oh, so sorry, I forgot, major airports are flagged as ‘rural’ in GHCN so you can just use them to correct each other… kind of like the Marine Base at Quantico, Virginia is classed as rural. You know, near the air strip where right about now I’d expect near-continuous air ops, what with supporting troops in 2 theatres along with a disaster relief op in Haiti.) I’m sure there will be No Problems At All getting a nice clean red anomaly out of them…

/sarcoff>

I’m still looking for the real beef in this pile of anomaly bull and all I find are a load of hypothetical cows.

I’ll believe Hansen and his code when:

1) The QA data and suite are presented along with the QA run output.

2) They put the thermometers back in AND NOTHING CHANGES. (Oh, and put back in the SAME data, please. Not like that USHCN hack, where they left out the USA thermometers from May 2007 to Nov 2009, but finally put them back in; but only AFTER cooking the data some more so USHCN.Version2 has warming baked in that matches GIStemp)

3) A full benchmark suite and benchmark data are provided that measures and demonstrates exactly what the program does with representative data, white noise, red noise, biased hot series, biased cold series, and dead flat series. That will let you calculate the filter “Q” of GIStemp. A fundamental, and lacking, specification.

Until then, all you are saying is “GIStemp is a 100% perfect filter and can remove horrid levels of data bias with absolute perfection”. And since that isn’t possible, it’s a void statement. You must measure the Q of a filter, not assume it nor fantasize about it.
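The benchmarking idea in point 3 can be sketched with a toy stand-in filter. This is not GIStemp’s code – just an illustration of running known inputs through an adjustment step and measuring what it does to them:

```python
import numpy as np

def toy_filter(x, window=12):
    # Stand-in "adjustment" step: a simple moving average. GIStemp's real
    # pipeline is far more complex; only the benchmarking idea is shown.
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

rng = np.random.default_rng(3)
n = 1200  # e.g. 100 years of monthly data

cases = {
    "white noise": rng.normal(0.0, 1.0, n),
    "dead flat":   np.zeros(n),
    "biased hot":  rng.normal(0.0, 1.0, n) + 0.5,  # constant warm bias
}

# A benchmark run: feed each known series through the filter and measure
# what it does to the mean. A perfect bias-removing filter would zero out
# the "biased hot" mean; a pass-through filter preserves it.
for name, series in cases.items():
    out = toy_filter(series)
    print(f"{name:12s} mean in {series.mean():+.3f} -> mean out {out.mean():+.3f}")
```

The point of such a suite is that it measures the filter rather than assuming it: a flat series should stay flat, and any bias that survives the run is bias the filter demonstrably does not remove.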

Cannot we examine the pCO2-Temp-Time space as indicated from the EPICA Dome C ice cores to get a more empirical indication of the greenhouse effect of CO2 at concentrations above 280 ppmv? Doesn’t the pCO2-Temp-Time space indicate that as earth’s mean surface T’s rise after each glacial epoch, pCO2 then rises 400 to 1200 years AFTER the T rise – through ocean CO2 degassing as the mean surface T’s rise?

I’m not Joel, but I will make a couple of points.

1. The change in CO2 occurs about 800 years after the start of the temperature change, but continues throughout the warming and cooling phases.

2. The forcing from Milankovitch cycles cannot account for the temperature change without the inclusion of feedbacks – and we know that CO2 is a greenhouse gas and as CO2 levels increase, they feed back on the earth’s heat balance.

3. Calculations of climate sensitivity have included the analysis of the last glacial maximum, and the number that comes up (1.2–4.3 °C) is close to the IPCC estimates. Of course, compared to the last million years plus, we are in uncharted territory in a way, so it is useful to look back even further, at a study in which the authors found a range of 1.6–5.5 °C; again, consistent with the IPCC estimates.

C’mon how do you get around the logarithmic effect of the absorption of CO2 and any other gas!!! Adding CO2 at this point has zero effect on its absorbance!

Not zero, but diminishing and smaller than other effects. With a negative PDO and quiet Sun, we stand a chance of sorting things out better. Okay, bizarre Sun. That’s an extra variable to learn about and deal with.

The logarithmic effect deals mainly with the sides of the absorbance windows where some additional blocking occurs. While there are physical reasons for it, the effect is more a curve fitting approximation and didn’t apply well at low concentrations and won’t at higher (I think much higher).
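For what it’s worth, the “not zero, but diminishing” point can be put in numbers with the widely cited simplified forcing expression ΔF ≈ 5.35·ln(C/C0) W/m² (Myhre et al., 1998) – itself a curve-fit approximation, not a line-by-line radiative transfer calculation:

```python
import math

# Widely cited simplified CO2 forcing expression (Myhre et al., 1998):
#   dF = 5.35 * ln(C / C0)   [W/m^2]
# A curve-fit approximation, valid over roughly present-day ranges.
def co2_forcing(c_ppm, c0_ppm):
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing added by successive 100 ppm increments:
for c0, c in [(280, 380), (380, 480), (480, 580)]:
    print(f"{c0} -> {c} ppm: +{co2_forcing(c, c0):.2f} W/m^2")
```

Each successive 100 ppm adds less forcing than the one before, but nowhere near zero – which is the sense in which the effect is logarithmic rather than saturated.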

Peter Miller (15:51:24) :
As a geologist and a scientist, what I don’t understand is how changes in a trace gas (carbon dioxide), which has a concentration of less than one part in 2,500 in our atmosphere, can have more than a minuscule effect on our climate.

I accept the fact that carbon dioxide, like water vapour, is a ‘greenhouse gas’ when measured in percentage terms – but 0.038% – that’s like putting a magazine in the attic of your home and expecting the whole building to heat up.

Because CO2 is a very strong absorber of the IR radiation which cools the earth, even at a concentration of 388 ppm. To address your personal incredulity, here’s an image of a flask of iodine gas at about 388 ppm; imagine trying to look through a mile of that! Don’t try to repeat the experiment at home, since the equilibrium vapor concentration of iodine at 25°C is 394 ppm, which is a couple of hundred times higher than the Immediately Dangerous to Life and Health level!

Let us not forget the use of FILTEMP, where missing data is homogenized in.
There is the question of whether the data was actually not taken, or simply removed as being inconvenient. In either case, the desired data giving the model result can be ‘filled in’.
There are many tools in the fabrication box.

I know Anthony is looking at stations in the US. Is anyone looking at the UK stations? I only ask because one of the stations is little old Hurn, now better known as Bournemouth International Airport (I love the International bit). Presumably, there will be some anomalies there over time given that the airport has expanded.

If anyone is looking at the UK stations, I’d be happy to do a recce at Hurn and report back.

This is such a simple case to figure. Long-term weather variations, such as those experienced with positive or negative oscillations, have stronger or weaker effects depending on the local topography. For example, desert climates have higher highs and lower lows because of topography interacting with weather parameters. So when weather is extreme, so will go the desert. Clearly, topography interacts with weather.

If the experimental plots (the temperature sensors) in Hansen’s research design were reduced over time in any biased topographical or urban heat island way (which we know is the case), the results will be biased, not because of CO2 but because of experimental design changes that are biased. Until he can provide an experimental design that is clean, he absolutely cannot say that CO2 is the cause of the warming he finds. To wit: the correlation of increased temperature with decreased sensors is as strong as the correlation between increased temperature and increased CO2, with mechanisms demonstrated for both cases. So which one is it?

Hansen will eventually have to answer for his use of fiddled-with data, and for ignoring inappropriate and unexplained experimental design changes, i.e. sensor drop-out and the various code manipulations applied to the non-random nature of his data.

In summary, this is a plot experiment – a very simple research design that is hard to mess up if done right, and as such Hansen’s is quite possibly the worst one I have ever seen. That it gets any kind of journal press is beyond me. No statistician I know of would touch this if such data were brought to him or her for analysis. It is not even worthy of a passing score for a 5th grade plot experiment.

“A later decision made Crater Glacier the official glacier name. Despite the volcanic activity, the glacier continued to advance and by mid-2008, the glacier completely encircled the lava domes.[2][3][4][5] In addition, new glaciers (rock or ice) have formed around Crater Glacier as well.” Inside Mount Saint Helens…

Perito Moreno Glacier in the SH, also advancing… Many glaciers are advancing… and with the increased snowfall, more will be advancing. Some people say that we are losing glaciers. Nothing could be further from the truth. They are not lost. The water is still here on the earth, and as the earth goes through its cycles, glaciers will increase, as some are already doing. Where is the study that compares all or even most of the glaciers on earth to their mass as of five years ago? So as I have cherrypicked a couple of glaciers, you can as easily cherrypick some that have declined… I suggest we not use this very small amount of earth’s ice to try to prove anything.

Ah, another Wood For Trees homemade chart. I like the WFT site. It’s fun to play around with, because you can show anything at all, depending on your initial conditions and time frames.

And I understand that when people get beaten over the head with alarming propaganda 24/7/365, it has an effect. Unless folks really look into it beyond the P.R. pronouncements of the IPCC, HadCRU, GISS, NOAA, etc., they begin to believe the ‘adjusted’ numbers.

I’m here to help. Keep in mind that everyone is being spoon fed adjusted numbers by people with an agenda, who stand to financially gain from the AGW scare.

The satellite data below are the most accurate; the rest are questionable. So, are you ready to see what’s really going on? OK, get your mouse clicker ready:

As you feverishly look for one or two things to nitpick about, keep in mind that those are not my charts, fabricated any old which-way on the WFT site. They are taken from a wide range of sources. And they all show why skeptics are skeptical of the CO2=CAGW claim.

Yes, the planet has been warming – naturally – in fits and starts since the LIA, and since the last great Ice Age before that. That is the basis for the long established theory of natural climate variability – the theory that must be falsified if the CO2=CAGW hypothesis is to replace it.

But as climatologist Roy Spencer says, “No one has falsified the theory that the observed temperature changes are a consequence of natural variability.” Skeptics have nothing to prove. The burden is entirely on the alarmist contingent to falsify natural climate variability. So far they have failed.

CO2 has an unmeasurably small effect on temperature. When the alarmist crowd can empirically demonstrate, through testable, repeatable, and falsifiable measurements, that X amount of CO2 results in Y amount of temperature rise… wake me.

If the effect of CO2 was significant, the globe would be rapidly warming, as harmless, beneficial CO2 rapidly rises. But it’s not, it’s cooling – indicating that the effect of CO2 is minuscule and overstated, and is overwhelmed by many other factors.

Finally with regard to glaciers, there are over 160,000 glaciers on the planet. They are the world’s easiest things to cherry pick. Yes, most glaciers are currently receding. But CO2 is not the cause. I refer you to the WUWT thread on glaciers. Read & learn.

Thanks for taking the time to post those links, Smokey. I particularly liked the “new calculus” of the IPCC, attempting to show that the rate of increase is accelerating – unbelievable that people are falling for this.

Thanks for putting up all the charts, but I notice that most of the ones with trendlines start in 1997; some do not even go to the present. Besides the fact that the satellite data are an indirect measurement and are most useful for the analysis of temperatures at different atmospheric depths, you know that 10-12 years is too short a time to define a multi-decade trend. And the UAH chart from the beginning of the record (this one) that you posted does not have a trendline (you can’t make a conclusion based on one “May 09” point). I put the same data into WFT and came up with this. Looks like a trend of about 0.17°C/decade, which isn’t that far off from the GISTEMP results.

As far as the “financial gain” and propaganda stuff, please spare me; I am a scientist in another field (I’ve done work in animal models, HIV and vaccines, so I know contentious) and these arguments mean little to me. Show me the publications to support your points. Those are the standards I use in my own work, and I see no reason not to apply those same standards to climate science.

This is the same UN that initiated the Himalayan Glacier Melt fiasco??
Please tell me you don’t think they have a shred of credibility left. Also, at the end of the report they ask for much, much more to handle increased monitoring of the earth’s glaciers. Why do they need increased monitoring? Because the job is not being done well enough now.
Have you ever wondered why NASA hasn’t released satellite photos of even half the glaciers from five years ago and from today? I think I have an idea why that info isn’t available. Sad that their increasing politicization has hurt their credibility as well.
The glaciers contain less than 1% of the earth’s ice. Believe me that you don’t want that percentage to increase substantially.

1. The change in CO2 occurs about 800 years after the start of the temperature change, but continues throughout the warming and cooling phases.

2. The forcing from Milankovitch cycles cannot account for the temperature change without the inclusion of feedbacks – and we know that CO2 is a greenhouse gas and as CO2 levels increase, they feed back on the earth’s heat balance.

3. Calculations of climate sensitivity have included the analysis of the last glacial maximum, and the number that comes up (1.2–4.3 °C) is close to the IPCC estimates. Of course, compared to the last million years plus, we are in uncharted territory in a way, so it is useful to look back even further, at a study in which the authors found a range of 1.6–5.5 °C; again, consistent with the IPCC estimates…

Deech 56, be sure to continue your conclusions by noting that CO2 concentration then falls 400 to 1500 years AFTER the temperature decrease at the onset of glaciation!!

And continue and note that during the Eemian interglacial, global mean surface temperatures were 3 to 5 °C HIGHER than now, but with CO2 at 280 ppmv. So here we have a scenario where T’s were 3–5 °C higher than now with CO2 100 ppmv LESS than now. So what drove these higher T’s during the Eemian interglacial?

So we “know” that CO2 is a greenhouse gas, but its effect in driving T changes at concentrations of 280 ppmv and above is tiny – the data speak for themselves. And why don’t you mention the role of clouds vs CO2 in your feedback models?

rbateman, you can view CDO NCDC data for free from a .edu, .k12, or .gov domain. It is apparently done that way so that people don’t use the data commercially (without breaking a rule of their provider). All of the weather stations use NCDC data. This saves taxpayer money.

This is the same UN that initiated the Himalayan Glacier Melt fiasco??
Please tell me you don’t think they have a shred of credibility left. Also, at the end of the report they ask for much, much more to handle increased monitoring of the earth’s glaciers. Why do they need increased monitoring? Because the job is not being done well enough now.

Have you ever wondered why NASA hasn’t released satellite photos of even half the glaciers from five years ago and from today? I think I have an idea why that info isn’t available. Sad that their increasing politicization has hurt their credibility as well.

The glaciers contain less than 1% of the earth’s ice. Believe me that you don’t want that percentage to increase substantially.

I did ask for publications to substantiate your points, but apparently none are forthcoming. Crying out “UN! UN!” means nothing to me. Those of us in medical research have a positive image of their scientific missions, thanks to the WHO. And what’s wrong with wanting more data? There appears to be some uncertainty about the Himalayan glaciers, so more analysis is warranted.

Again, show me where the WGMS is wrong. Find the papers that state that glaciers world-wide are advancing.

He writes, on p. 28: “Figures 9a-9f show records of glaciers in Alaska, New Zealand, the European Alps, and the Himalayas, respectively, which have been receding from the time of the earliest records, about 1800. … It is clear that the retreat is not a phenomenon that began only in recent years, or after CO2 emission increase in 1946.”

And on page 32: “Altogether, long-term glacier data presented here show that glaciers advanced from about 1400 and began to retreat after 1800 (cf. Akasofu, 2008). These facts confirm that the Earth experienced the LIA.”

“Glaciers around the world are retreating – what might be the reason for that?”

As we just learned on another thread there are 162,000 glaciers and the majority are not monitored. We also know that many glaciers are, in fact, advancing. We also suspect based on ClimateGate, etc. that most of those studying glaciers need to claim AGW just to get published.

The net result is I think we know a lot less than you think.

BTW, I think it has warmed since the depth of the LIA. Personally, I am quite happy about it. Would you like to see a colder world with less farm land, lower crop yields and more cold winter related deaths?

In general I find people in medical research to be quite skeptical of AGW. The medical research community proved oh so clearly the problems with research bias many years ago. Hence, the adherence to double blind studies. Clearly, there is no equivalent in climate research. Bias is rampant and anyone with any sense of medical research history should be very, very skeptical.

You should be damn mad that “global warming studies” have already drained tens of billions of dollars away from other, more important research – like the field you work in. The global warming scam is a monumental rip-off of taxpayer funds, and it starves all other areas of publicly funded science.

There are only so many tax dollars available to go around. That is why the promoters of AGW run and hide from any real debate. That’s why Al Gore and the rest duck out of interviews [unless the person doing the interview is a trusted pet]. And that is why there are no true scientific skeptics in the whole bunch. The climate alarmists have both front feet in the public trough; they are not going to derail their gravy train by answering questions.

You asked for publications. OK. I can provide several dozen if you like in the next few minutes. But since the central aspect of the CO2=CAGW hypothesis concerns the question of climate sensitivity, I’ll provide one that answers that question [Prof Richard Lindzen arrives at roughly the same answer, although he is a little more conservative. But he’s in the same ball park].

To give some background on the one I’m providing, it came about due to the persistence of Lord Monckton, who ultimately prevailed against the editorial board of the American Physical Society – which had tried to shut him out of the debate.

By finally publishing his paper, Lord Monckton forced them to acknowledge his rationale. [You can see from their intro that they printed his corrigendum with gritted teeth; they could have simply called for Monckton’s paper to be refereed and peer reviewed, but apparently decided that it would be in their best interest to not give it that status.]

Monckton methodically shows that the sensitivity to CO2 is too low to result in anything but very slight warming, and for all practical purposes CO2 can be disregarded: click

Leo G (13:39:36),

I am at a little bit of a loss regarding your question about forcings for the same reason that E. M. Smith questions them: how are “forcings” quantified? Electrical current is measured in amperes, resistance is in ohms, etc. What is the basic unit of forcing? The term sounds like inside baseball to me. Sorry for the non-answer. But first we need to know what quantity a forcing is.

What do your sources that discuss the long-term retreat tell you? And was Dr. Akasofu’s paper published in a journal? I would be interested in reading the abstract before spending the time downloading the full paper.

As we just learned on another thread there are 162,000 glaciers and the majority are not monitored. We also know that many glaciers are, in fact, advancing. We also suspect based on ClimateGate, etc. that most of those studying glaciers need to claim AGW just to get published.

The net result is I think we know a lot less than you think.

BTW, I think it has warmed since the depth of the LIA. Personally, I am quite happy about it. Would you like to see a colder world with less farm land, lower crop yields and more cold winter related deaths?

Any population study involves taking a sample. Here are the locations of the glaciers that were studied for the mass balance calculations. They represent a broad range of regions. Do you think that the sampling that started decades ago just happened to pick glaciers that were going to melt? Any evidence for this? You “suspect” that the glacier papers need to claim AGW to get published? Where was this in the stolen e-mails? The choice isn’t between a colder world or a warmer world. The choice is between the world we have become accustomed to and a world (and country) far, far different. I’m from Buffalo, and a cold world doesn’t frighten me. ;-) (Leo G will like this)

Increased water and agricultural stresses and risks to coastlines don’t sound too pleasant to me. Food production is pretty important.

In general I find people in medical research to be quite skeptical of AGW. The medical research community proved oh so clearly the problems with research bias many years ago. Hence, the adherence to double blind studies. Clearly, there is no equivalent in climate research. Bias is rampant and anyone with any sense of medical research history should be very, very skeptical.

My experience reading the writings of HIV/AIDS and vaccine deniers has led me to be skeptical of “skeptical” claims against the science. Double blind studies are important, but that’s because of the placebo effect; at my end of research double-blind studies weren’t really feasible. That doesn’t mean that we didn’t consider bias and try to minimize it, of course.

But the point that I am trying to make is that I approach a field outside of my own area of expertise the same way that I approach any scientific research – looking for the best sources of information, and that is the scientific literature, paying particular attention to any findings by organizations like the National Academy of Sciences.

I had the advantage of being able to manipulate conditions in my experiments. In fields like astronomy, climatology, geology and paleontology, researchers depend a lot on observations, but we still accept things like the big bang, the age of the earth, and evolution. That’s life, and in science we use the tools that are available.

You should be damn mad that “global warming studies” have already drained tens of billions of dollars away from other, more important research – like the field you work in. The global warming scam is a monumental rip-off of taxpayer funds, and it starves all other areas of publicly funded science.

There are only so many tax dollars available to go around. That is why the promoters of AGW run and hide from any real debate. That’s why Al Gore and the rest duck out of interviews [unless the person doing the interview is a trusted pet]. And that is why there are no true scientific skeptics in the whole bunch. The climate alarmists have both front feet in the public trough; they are not going to derail their gravy train by answering questions.

And you answer with claims about financial gain and propaganda.

You asked for publications. OK. I can provide several dozen if you like in the next few minutes. But since the central aspect of the CO2=CAGW hypothesis concerns the question of climate sensitivity, I’ll provide one that answers that question [Prof Richard Lindzen arrives at roughly the same answer, although he is a little more conservative. But he’s in the same ball park].

To give some background on the one I’m providing, it came about due to the persistence of Lord Monckton, who ultimately prevailed against the editorial board of the American Physical Society – which had tried to shut him out of the debate.

By finally publishing his paper…

Spencer did provide a decent analysis pointing out the flaws in Lindzen and Choi, but Monckton? C’mon. You don’t actually look to him as an authority, do you? That article in the APS newsletter was in no way, shape or form a peer-reviewed publication. There are many more reliable sources for information on climate sensitivity, like the review paper by Knutti & Hegerl from Nature Geoscience. This paper highlights the idea of independent confirmation (as does James Annan’s analysis, but it’s getting kind of late).

“My experience reading the writings of HIV/AIDS and vaccine deniers has led me to be skeptical of “skeptical” claims against the science.”

You have it exactly backward, my friend. The long established climate theory is that of natural variability. That is what must be falsified, if possible, by the relatively new hypothesis of CO2=CAGW.

The Scientific Method doesn’t only require that an upstart hypothesis must explain reality better than the established theory [usually done by making more accurate predictions, or falsifying the existing theory]; the Scientific Method also requires full and transparent cooperation from those proposing any new hypothesis.

Why? Because the goal isn’t one-upmanship, or protecting lucrative grant fiefs. The goal is scientific truth, which is arrived at only by attacking and attempting to falsify a hypothesis by skeptical scientists – and even by those scientists proposing the hypothesis. Everyone cooperates in trying to falsify a hypothesis [or a theory, or even a Law – just as the current search for the Higgs boson is intended to verify or falsify part of the Standard Model of particle physics].

Whatever remains standing after all attempts at falsification is considered to be as close to scientific truth as we can get.

Being ‘skeptical of skeptical claims,’ as you said, is the wrong way to look at it. The job of scientific skeptics [which includes all honest scientists] is to falsify a hypothesis if they can, whether the hypothesis is CO2=CAGW, or the theory of natural climate variability. Being skeptical of skeptics has no place in the Scientific Method; it is sophistry.

C’mon, how do you get around the logarithmic effect of the absorption of CO2 and any other gas!!! Adding CO2 at this point has zero effect on its absorbance!

A logarithm doesn’t saturate. What a logarithmic dependence means is that you need the same FRACTION of increase to produce the same amount of effect. This is why scientists talk about the effect of a CO2 doubling rather than the effect of raising CO2 by a fixed amount in ppm.

Ric Werme says:

While appeals to century-old science may help bolster arguments about the science being settled, it also implies that temperature should be tracking the Keeling CO2 curve, but it’s not.

No… It implies that over long enough periods of time, the temperature trend should be upward, which it has been. On timescales short enough that the ups-and-downs associated with ENSO and the like dominate, the temperature trend is not expected to always be positive. And, in fact, some periods of ten or even 15 years where the least-squares trend is negative are a robust feature of climate models run with increasing greenhouse gases.

Getting back to my original comment – are you suggesting that adding 100 ppm to current CO2 levels will have the same impact that the first 100 ppm did?

No. But, I am suggesting that appeals to “saturation” are bogus, particularly in arguing that this somehow contradicts the mainstream science. The fact that the dependence of forcing on CO2 concentration is approximately logarithmic is well-known, well-understood, and in fact the reason that scientists talk about the effects in the way that they do (in terms of a doubling of concentration).
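To make the "doubling" point concrete: the standard simplified expression for CO2 forcing (Myhre et al., 1998) is ΔF ≈ 5.35 × ln(C/C0) W/m². A logarithm flattens, but it never goes to zero. A short sketch of what that implies:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate radiative forcing (W/m^2) for a change in CO2 concentration,
    via the simplified expression dF = 5.35 * ln(C / C0)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling adds the same forcing increment (about 3.7 W/m^2):
print(round(co2_forcing(560, 280), 2))
print(round(co2_forcing(1120, 560), 2))

# A fixed +100 ppm adds less forcing the higher the starting level,
# but the increment never drops to zero:
print(round(co2_forcing(380, 280), 2))
print(round(co2_forcing(480, 380), 2))
```

This is exactly why the literature quotes sensitivity "per doubling" rather than per ppm: equal fractional increases produce equal forcing, so "saturation" in the sense of zero further effect never occurs.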

“My experience reading the writings of HIV/AIDS and vaccine deniers has led me to be skeptical of “skeptical” claims against the science.”

You have it exactly backward, my friend. The long established climate theory is that of natural variability. That is what must be falsified, if possible, by the relatively new hypothesis of CO2=CAGW.

The ‘theory of natural variability’ isn’t a scientific theory since it’s incapable of falsification. On the other hand the theory of greenhouse gas warming is a scientific theory and has met all the challenges so far.

Phil, have you been reading the ENSO update that comes out every Monday? At the end of the PowerPoint, predictions of temperature/precip, etc., trends are given for the current ENSO state. If natural variability is not a theory, on what basis do they make these weather predictions?

“Increased water and agricultural stresses and risks to coastlines don’t sound too pleasant to me. Food production is pretty important.”

Then you should be for global warming. Ask any farmer if warm or cold is better? No? You’d rather trust the ClimateGate team? Really? Why is that? The truth is most plants love heat. As long as they have water heat presents absolutely no problems. On top of that, higher CO2 allows plants to get by with less water. Not to mention that plants love more CO2 (it’s called plant food for a reason). The only “agricultural stresses” of increased heat/CO2 are imaginary.

Also, it appears you don’t question any of the AGW articles because you *trust* them as fellow scientists. Actually, that is very unscientific of you. I thought you were on the younger side because that unscientific trust has somehow become commonplace. You will learn eventually. You will find out that this situation is not unlike the 1960s in medical research. Those scientists thought they knew what they were doing as well. I think this is often the problem with new fields of study where researchers have little grasp of what they don’t understand.

“The ‘theory of natural variability’ isn’t a scientific theory since it’s incapable of falsification. On the other hand the theory of greenhouse gas warming is a scientific theory and has met all the challenges so far.”

You’ve got to be kidding. All you need to falsify the natural climate variability theory is show that temperatures have remained constant for all of the Earth’s history. Let us know when you’ve completed the work and we’ll review it for you. Are you related to Mann?

BTW, spouting a strawman is silly. Greenhouse warming theory of 1 °C per doubling of CO2 is very reasonable and would be much appreciated by all the plants and animals that eat plants.

1. What do your sources that discuss the long-term retreat tell you?
2. And was Dr. Akasofu’s paper published in a journal?
3. I would be interested in reading the abstract before spending the time downloading the full paper.

1. I’m basing my assertion that precipitation and relative humidity are mostly responsible for current glacial retreat on what I’ve read on this site from time to time. I don’t have links, but I’m sure there are “gunslingers” here who could provide them. (Smokey?) In particular, it’s a change to a drier environment thanks to local land use changes (jungle replaced by farms) that is responsible for the shrinkage of the snows, etc. on Kilimanjaro.

2. Akasofu’s paper was not journal-published. He stated that because it is just a summary of material already published by others (and also, I guess, because it is over 50 pages), he didn’t bother to submit it. However, Akasofu has credibility as a scientist. I’ve read that he is one of the top-dozen scientists in his field (arctic studies, especially the northern lights) in terms of citations, and he has lots of honors. (He’s now semi-retired.)

The first one is an almost linear global temperature increase of about 0.5°C/100 years (~1°F/100 years), which seems to have started at least one hundred years before 1946 when manmade CO2 in the atmosphere began to increase rapidly. This value of 0.5°C/100 years may be compared with what the Intergovernmental Panel on Climate Change (IPCC) scientists consider to be the manmade greenhouse effect of 0.6°C/100 years. This 100-year long linear warming trend is likely to be a natural change. One possible cause of this linear increase may be Earth’s continuing recovery from the Little Ice Age (1400-1800). This trend (0.5°C/100 years) should be subtracted from the temperature data during the last 100 years when estimating the manmade contribution to the present global warming trend. As a result, there is a possibility that only a small fraction of the present warming trend is attributable to the greenhouse effect resulting from human activities. Note that both glaciers in many places in the world and sea ice in the Arctic Ocean that had developed during the Little Ice Age began to recede after 1800 and are still receding; their recession is thus not a recent phenomenon.

The second one is the “multi-decadal oscillation,” a natural change which is superposed on the linear trend. This particular change has a positive rate of change of about 0.15°C/10 years from about 1975, and is thought to be a sure sign of the greenhouse effect by the IPCC. But this positive trend stopped after 2000 and now has a negative slope. As a result, the global warming trend stopped in about 2000-2001.

Therefore, it appears that the two natural changes have a greater effect on temperature changes than the greenhouse effects of CO2. These facts are contrary to the IPCC Report (2007, p.10), which states that “most” of the present warming is due “very likely” to be the manmade greenhouse effect. They predict that the warming trend continues after 2000. Contrary to their prediction, the warming halted after 2000.

There is an urgent need to correctly identify natural changes and remove them from the present global warming/cooling trend, in order to accurately identify the contribution of the manmade greenhouse effect. Only then can the contribution of CO2 be studied quantitatively.”

The ‘theory of natural variability’ isn’t a scientific theory since it’s incapable of falsification. On the other hand the theory of greenhouse gas warming is a scientific theory and has met all the challenges so far.

That’s nonsense.

Phil is saying that Dr Roy Spencer, our esteemed climatologist, doesn’t know what he’s talking about when he says, “No one has falsified the theory that the observed temperature changes are a consequence of natural variability.” So, who to believe? Phil? Or Dr Spencer?

Even more fun: Who to believe? Mother Earth, which is cooling as CO2 rises, thus empirically falsifying the CO2=CAGW [“AGW”] hypothesis? Or Mr Phil again? The planet’s verdict trumps anyone’s contrary opinion. The Earth is telling the truth.

And the scientific status of a theory is its falsifiability… or its refutability, or its testability. Testability is the primary criterion; falsifiability is the ideal, and results from testability. But falsifiability is not an absolute requirement of the Scientific Method; testability is [source]. And AGW is tested all the time: as CO2 rises, the climate has been flat to cooling for almost a decade. AGW clearly fails the test.

Up next: deech56 (17:34:56)

“Spencer did provide a decent analysis pointing out the flaws in Lindzen and Choi, but Monckton? C’mon. You don’t actually look to him as an authority, do you?”

Despite Mr deech’s ad hominem attack against Lord Monckton [as opposed to trying to find any flaws in his scholarly paper], it is a fact that Lord Monckton is every bit as qualified as Gavin Schmidt [also a mathematician], or Rajendra Pachauri [an economist], or Michael Mann [a geologist], and just about all the rest of the alarmist crowd who pretend they are more qualified. They’re not, of course. Monckton is equal to the best of them, and better than most. And unlike the alarmists, Monckton isn’t suckling at the public teat. He pays his own way.

Finally, even I could find a better citation than the one deech provided by Knutti & Hegerl, who cite only Svante Arrhenius’ 1896 paper, with its high climate sensitivity conclusion – but without ever mentioning the fact that Arrhenius recanted his conclusion a decade later in his 1906 paper, with its much lower sensitivity number. By neglecting to mention Arrhenius’ correction of his earlier views, the K&H citation is little more than alarmist propaganda.

The reason this AGW scam has made it as far as it has is due to the problem deech and Phil have both avoided here: the deliberate stonewalling of information requests by skeptical scientists, and the blanket refusal to cooperate with others attempting to falsify AGW. They don’t care about finding the truth. They only care about the money and status that result from their AGW advocacy.

McIntyre & McKitrick worked long and hard to falsify Mann’s treemometer-based Hokey Stick, when Mann should have cooperated with them from the get-go. The hiding of taxpayer financed data and methods by taxpayer financed scientists is contrary to the Scientific Method. It is based on the assumption that AGW is a fact, and that the data must be hammered into conformance, or fabricated outright, to fit their pre-conceived AGW conclusions.

If Mann and the rest really believed they had solid evidence backing up their AGW hypothesis, they would certainly have produced it. The reason they didn’t is because they knew their hypothesis would be torn to shreds, just like Mann’s Hokey Stick was.

“NASA has not been involved in any manipulation of climate data used in the annual GISS global temperature analysis. The analysis utilizes three independent data sources provided by other agencies. Quality control checks are regularly performed on that data. The analysis methodology as well as updates to the analysis are publicly available on our website. The agency is confident of the quality of this data and stands by previous scientifically based conclusions regarding global temperatures.” (GISS temperature analysis website: http://data.giss.nasa.gov/gistemp/)” [note: I could not find the specific url from NASA, so I welcome being sent this original source].

I, too, could not find this quote on any NASA/GISS web page or publication. Nor could I find an answer to Pielke’s request for corroboration on any site, pro or anti in this debate.

Has anybody yet discovered the source for this quote, or is it time to consider that it has been fabricated?

“On the other hand the theory of greenhouse gas warming is a scientific theory and has met all the challenges so far.”

Oh yes, the challenges. Well let’s see shall we?

1) Statement by James Hansen that manmade GHGs have given rise to a radiative imbalance averaging about 0.8 watts/meter squared from 2000 to present, which will “melt the ice, warm the atmosphere and warm the oceans.”
Prediction: From 2000 to present, ocean heat anomaly will increase by approximately 10^23 joules.

Result: Ocean heat anomaly increased by zero joules.
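As a sanity check on the order of magnitude (the 0.8 W/m² figure and the period are the commenter’s premises, not measurements of mine), a sustained planetary imbalance of that size does indeed work out to roughly 10^23 joules per decade:

```python
# Back-of-envelope: a sustained 0.8 W/m^2 imbalance over the whole Earth
# for roughly ten years, expressed in joules.
earth_area_m2 = 5.1e14           # Earth's total surface area, m^2
imbalance_w_m2 = 0.8             # claimed average radiative imbalance, W/m^2
seconds = 10 * 365.25 * 86400    # ten years in seconds

joules = imbalance_w_m2 * earth_area_m2 * seconds
print(f"{joules:.1e} J")         # on the order of 10^23 joules
```

That is where the "approximately 10^23 joules" prediction in the comment comes from: watts are joules per second, so the accumulated energy is just imbalance × area × time.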

2) The theory that twentieth century warming is largely the result of manmade GHGs requires that pre-twentieth-century temperatures must show little variation, as per the hockey stick. If historical temperature variations are similar to twentieth century variations, then the null hypothesis cannot be falsified by temperatures (although it may be by other means).

If this particular plank was secure, we would expect more and more studies confirming the hockey stick. In fact, we find just the opposite: hockey sticks are being broken while more and more studies confirm the medieval warm period as a real global phenomenon.

3) GHG theory predicts that as GHG’s are added to the atmosphere, surface temperatures would increase while outgoing radiation from top of the atmosphere would decrease.

This typically happens when an organization like NASA realizes they have made a statement that can easily be contradicted by the facts: they take the page with the quote down. As you’ve found out, that page is no longer available. [I tried the Wayback Machine with no luck, but that often happens; Wayback will delete an entry at the request of the originator.]

I suspect GISS was uncomfortable with its unequivocal statement: “NASA has not been involved in any manipulation of climate data used in the annual GISS global temperature analysis.”

That is a preposterous statement. GISS constantly manipulates temperature data; I’ve posted dozens of examples of NASA/GISS data manipulation here, and so have many others.

Also, a news organization isn’t going to fabricate a statement like that out of thin air. What would be the point? From the TV station’s perspective, it’s not a make-or-break quote, it’s just NASA’s response. They’re not going to flush their reputation by fabricating it in a weather commentary. The quote originally came from KUSI, which was quoting NASA’s response: click

Hansen has hacked into my posted comments. I note that in his internet paper he suggests that temperature should be reported via a three month running average statistic (Jan Feb Mar, Feb Mar Apr, etc). Very good idea Hansen. Wished I had thought of it.

So now you take your raw data three month avg —what?—You no longer have the raw data? Not even the chadded cards? Or the floppy (yes, back in the old days they were limp disks)? Not even the hand entered data sheets from the good folks who took time out of their day to check the temperature? None of it?
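For what it’s worth, the three-month running average Hansen describes is a derived statistic computed from the raw series; nothing about producing it requires discarding the raw data. A minimal sketch:

```python
def three_month_running_mean(monthly):
    """Jan-Feb-Mar, Feb-Mar-Apr, ... averages of a monthly series.
    The input list is read, never modified."""
    return [sum(monthly[i:i + 3]) / 3 for i in range(len(monthly) - 2)]

raw = [1.0, 2.0, 3.0, 4.0, 5.0]
print(three_month_running_mean(raw))  # [2.0, 3.0, 4.0]
print(raw)                            # the raw data is unchanged
```

So the smoothing itself is harmless; the legitimate question is purely about archiving, i.e. whether the underlying raw observations are retained and published alongside the smoothed product.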

This typically happens when an organization like NASA realizes they have made a statement that can easily be contradicted by the facts:

There is, then, no proof that this paragraph was ever given out by James Hansen. I do not know KUSI news service, or John Coleman, so I have no assumption to work with regarding credibility of same.

NASA and Hansen’s statements have been quite consistent regarding data and methodology – this one alleged quote is an anomaly. As you point out, GISS makes adjustments to temp data, as everyone knows and no one hides that fact. The paragraph reads fake, or like a bad paraphrase. ‘Manipulation’ is weasel wording. It’s very difficult to credit Hansen appropriating this language to defend methods.

As this quote is unsubstantiated, its provenance is neither confirmed nor denied. Any proper skeptic would come to the same conclusion.

As an aside, GISS use three different (and independent) data sets – global, US and Arctic. I wonder if there might have been some confusion here.

By the way, I did my research back in the late 80’s and early 90’s. I still have my raw data sheets. Don’t know where the chadded cards went to or the floppies (they remained at the medical lab I worked in after I moved on). But I have my original hand entered raw data sheets. Even back that far ago, I understood the important of keeping the raw data in perpetuity. And I don’t even have a Ph.D. What’s your excuse Hansen? Ya know, there are times I just wanna put you in time out!

“The ‘theory of natural variability’ isn’t a scientific theory since it’s incapable of falsification. On the other hand the theory of greenhouse gas warming is a scientific theory and has met all the challenges so far.”

You’ve got to be kidding. All you need to falsify the natural climate variability theory is show that temperatures have remained constant for all of the Earth’s history. Let us know when you’ve completed the work and we’ll review it for you.

In order to do that I’d need to see this “natural climate variability theory”: what does it say, and where can I read about it? As far as I can tell it’s ‘anything that happens is a natural variation’, which clearly isn’t a falsifiable scientific theory; even a flat line is covered by the natural variation being ~zero!

Pamela Gray (06:46:46) :
Hansen has hacked into my posted comments. I note that in his internet paper he suggests that temperature should be reported via a three month running average statistic (Jan Feb Mar, Feb Mar Apr, etc). Very good idea Hansen. Wished I had thought of it.

So now you take your raw data three month avg —what?—You no longer have the raw data? Not even the chadded cards? Or the floppy (yes, back in the old days they were limp disks)? Not even the hand entered data sheets from the good folks who took time out of their day to check the temperature? None of it?

What raw data are you accusing Hansen of losing? Your reference to “the good folks who took time out of their day to check the temperature” implies that you think Hansen is responsible for collection and maintenance of the NOAA databases.

This typically happens when an organization like NASA realizes they have made a statement that can easily be contradicted by the facts:

There is, then, no proof that this paragraph was ever given out by James Hansen. I do not know KUSI news service, or John Coleman, so I have no assumption to work with regarding credibility of same.

NASA and Hansen’s statements have been quite consistent regarding data and methodology – this one alleged quote is an anomaly. As you point out, GISS makes adjustments to temp data, as everyone knows and no one hides that fact. The paragraph reads fake, or like a bad paraphrase. ‘Manipulation’ is weasel wording. It’s very difficult to credit Hansen appropriating this language to defend methods.

As this quote is unsubstantiated, its provenance is neither confirmed nor denied. Any proper skeptic would come to the same conclusion.

As an aside, GISS use three different (and independent) data sets – global, US and Arctic. I wonder if there might have been some confusion here.

Clearly NASA doesn’t manipulate the data; as pointed out in the comment, they use it in their analysis. After they run the GISTEMP code, the NOAA data is still in the database, unchanged.

If I were NASA/GISS, I also would have quickly removed this provably inaccurate quote:

“NASA has not been involved in any manipulation of climate data used in the annual GISS global temperature analysis.”

It is true that the provenance has now been scrubbed. But if you’re taking the position that it never came from NASA/GISS, then where do you propose it came from? Because it’s there, and it originally came from somewhere. Is anyone saying it was fabricated? If so, based on what evidence? Evidence is not proof. But there is the statement that the quote came from NASA; there is zero evidence that it was invented by anyone else.

When GISS uses “adjusted” data, they are using data that has been manipulated. And in almost every case, the manipulation shows a scarier result: click

NOAA also manipulates the data, and it is ridiculous to believe that GISS is not aware of that fact: click

From his public statements, James Hansen appears to be mentally disturbed. No normal person demands that businessmen engaged in a perfectly legal enterprise, which makes life much better for many millions of people, should be imprisoned for running their business according to the law. No normal person encourages people to engage in lawbreaking activities to achieve their goals, when there are legal, democratic ways to achieve the same goals.

If you would like to continue to believe that NASA/GISS doesn’t manipulate the data, it’s still a free country. You can believe whatever you like. Most of us will continue to believe our lying eyes, rather than the preposterous assertion that:

C’mon how do you get around the logarithmic effect of the absorption of CO2 and any other gas!!! Adding CO2 at this point has zero effect on its absorbance!

A logarithm doesn’t saturate. What a logarithmic dependence means is that you need the same FRACTION of increase to produce the same amount of effect. This is why scientists talk about the effect of a CO2 doubling rather than the effect of raising CO2 by a fixed amount in ppm.
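This fractional dependence can be illustrated with the widely used simplified forcing expression ΔF = 5.35 ln(C/C₀) W/m²; the concentrations below are round numbers chosen purely for illustration:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    # Simplified expression for CO2 radiative forcing in W/m^2;
    # note that only the RATIO of concentrations matters.
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling produces the same forcing, no matter where you start:
print(co2_forcing(560, 280))   # ~3.71 W/m^2
print(co2_forcing(1120, 560))  # ~3.71 W/m^2 again -- no "saturation"
```

Hence the convention of quoting climate sensitivity per doubling rather than per ppm.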

Ric Werme says:

‘While appeals to century-old science may help bolster arguments about the science being settled, it also implies that temperature should be tracking the Keeling CO2 curve, but it’s not.’

No… It implies that over large enough periods of time, the temperature trend should be upward, which it has been. On timescales short enough that the ups and downs associated with ENSO and the like dominate, the temperature trend is not expected to always be positive. And, in fact, some periods of ten or even fifteen years where the least-squares trend is negative are a robust feature of climate models run with increasing greenhouse gases.

Getting back to my original comment – are you suggesting that adding 100 ppm to current CO2 levels will have the same impact that the first 100 ppm did?

No. But I am suggesting that appeals to “saturation” are bogus, particularly in arguing that this somehow contradicts the mainstream science. The fact that the dependence of forcing on CO2 concentration is approximately logarithmic is well known, well understood, and in fact the reason that scientists talk about the effects in the way that they do (in terms of a doubling of concentration).

Here we go with the well-known, well-understood claims again. Why do I get the distinct feeling that you neither know nor understand what is going on in the atmosphere, Joel? Perhaps you might like to explain what is happening, maybe starting with the saturation of the peaks and the real effect one might expect from a pathlength reduction of 10 cm to 9.5 cm, or how the width of absorption lines decreases with dropping pressure, or maybe the shifting of the peaks because of the pressure.

Smokey, Phil, I think the issue here is semantics. Surely none of us will disagree that GISS process received temperature data such that their products – like the temperature time series – are different from unadjusted products plotted with the raw data. The word ‘manipulate’, however, is rhetorical – meant to imply wrongdoing. This language muddies the waters – brings political jargon into a scientific discussion. For this and other reasons, I am skeptical of the provenance of the quote. Here is GISS methodology.

But if you’re taking the position that it never came from NASA/GISS, then where do you propose it came from?

The position I take is ‘not substantiated’. The quote may or may not be authentic. I do not know, and am comfortable with regarding the matter provisionally.

I realize that many players in these climate debates like to believe things according to their ‘position’. It’s the commonest intellectual foible there is. Reasonable skepticism is the antidote to black/white thinking, just as curiosity is the antidote to ignorance.

The Scientific Method doesn’t only require that an upstart hypothesis must explain reality better than the established theory [usually done by making more accurate predictions, or falsifying the existing theory]; the Scientific Method also requires full and transparent cooperation from those proposing any new hypothesis.

Why? Because the goal isn’t one-upmanship, or protecting lucrative grant fiefs. The goal is scientific truth, which is arrived at only by attacking and attempting to falsify a hypothesis by skeptical scientists – and even by those scientists proposing the hypothesis. Everyone cooperates in trying to falsify a hypothesis [or a theory, or even a Law – just as the current search for the Higgs boson is intended to verify or falsify part of the Standard Model of particle physics].

Whatever remains standing after all attempts at falsification is considered to be as close to scientific truth as we can get.

There’s a great deal of competitiveness in science (I worked in pharma/biotech and we weren’t any more open than we had to be), but what is accepted is that independent verification of results doesn’t mean examining one data set over and over again, but replicating a study using information found in a manuscript. Even better, it could mean using different tools to arrive at the same result. Kind of like scientists demonstrating recent warming by direct measurement, indirect measurement by satellite MSUs and by examining the effects of warmer temperatures (glacier melting, for example).

Being ‘skeptical of skeptical claims,’ as you said, is the wrong way to look at it. The job of scientific skeptics [which includes all honest scientists] is to falsify a hypothesis if they can, whether the hypothesis is CO2=CAGW, or the theory of natural climate variability. Being skeptical of skeptics has no place in the Scientific Method; it is sophistry.

No, requiring that “skeptics” arise to the same standards as actual scientists is a way to wade through claims that are dubious or spurious. When scientific paper after scientific paper show the same conclusion, and a random web site disputes this scientific conclusion but does not submit this startling finding to a journal for scrutiny by experts in the field, I have to wonder about the veracity of the “skeptical” claim. If we don’t have some kind of minimal filter for information (otherwise, we’d be looking for medical information from Kevin Trudeau) then we’re left with so many claims that we would either throw up our hands saying we don’t know anything or pick and choose what to believe based on our biases. When “skeptics” state things that I, as a non-expert, can easily refute, I have to wonder about their reliability.

“In order to that I’d need to see this “natural climate variability theory”, what does it say where can I read about it? As far as I can tell it’s ‘anything that happens is a natural variation’ which clearly isn’t a falsifiable scientific theory, even a flat line is covered by the natural variation being ~zero.”

So, let me get this straight. You don’t know what the theory is, yet you state it cannot be falsified. That is pure nonsense. Let me know when you get serious about this topic. These kinds of made-up-on-the-fly comments are wasting everyone’s time.

deech56 (16:58:44) :
“Increased water and agricultural stresses and risks to coastlines don’t sound too pleasant to me. Food production is pretty important.”

Then you should be for global warming. Ask any farmer whether warm or cold is better. No? You’d rather trust the ClimateGate team? Really? Why is that? The truth is most plants love heat. As long as they have water, heat presents absolutely no problems. On top of that, higher CO2 allows plants to get by with less water. Not to mention that plants love more CO2 (it’s called plant food for a reason). The only “agricultural stresses” of increased heat/CO2 are imaginary.

Also, it appears you don’t question any of the AGW articles because you *trust* them as fellow scientists. Actually, that is very unscientific of you. I thought you were on the younger side because that unscientific trust has somehow become commonplace. You will learn eventually. You will find out that this situation is not unlike the 1960s in medical research. Those scientists thought they knew what they were doing as well. I think this is often the problem with new fields of study where researchers have little grasp of what they don’t understand.

Unless I am mistaken, the “ClimateGate Team” is not the group that studies the effects of AGW on agriculture, etc. You also post the false dichotomy between “cold” and “warm” when we are discussing keeping the range of temperatures around the levels we have been used to over the whole of civilization or changing things around. I don’t get your point about medical research in the 1960s; that was before my research career (unless you count my many microscopic examinations of pond water or fossil collecting), but I am no spring chicken, and I don’t understand why you are lecturing me about science.

1. I’m basing my assertion that precipitation and relative humidity are mostly responsible for current glacial retreat on what I’ve read on this site from time to time. I don’t have links, but I’m sure there are “gunslingers” here who could provide them. (Smokey?) In particular, it’s a change to a dryer environment thanks to local land use changes (jungle replaced by farms) that is responsible for the shrinkage of the snows, etc. on Kilimanjaro.

I am aware of Kilimanjaro, but I find it difficult to believe that world-wide glacier retreat is a function of local land-use changes. That’s a lot of local. Since we have strong evidence of multi-decade temperature increases, wouldn’t that be the first place to look?

2. Akasofu’s paper was not journal-published. He stated that because it is just a summary of material already published by others (and also, I guess, because it is over 50 pages), he didn’t bother to submit it. However, Akasofu has credibility as a scientist. I’ve read that he is one of the top-dozen scientists in his field (arctic studies, especially the northern lights) in terms of citations, and he has lots of honors. (He’s now semi-retired.)

OK, so I probably don’t need to bother reading it. Journals publish review articles all the time – the Knutti & Hegerl paper I cited somewhere is one example. As far as the rest, you are asking readers to go by his reputation alone. If he wants to overturn our understanding of climate, he should have enough confidence in his claims to put them before the scientific community.

3. Here’s the abstract:

Thanks, but he seems to place a lot of stock in the argument that global warming has stopped, but at the same time, writers here are claiming that global warming hasn’t happened. This leads to some confusion. Easterling & Wehner (official ref) showed that the long term positive temperature trend features periods of no apparent trend or slight cooling.

Abstract:

“Numerous websites, blogs and articles in the media have claimed that the climate is no longer warming, and is now cooling. Here we show that periods of no trend or even cooling of the globally averaged surface air temperature are found in the last 34 years of the observed record, and in climate model simulations of the 20th and 21st century forced with increasing greenhouse gases. We show that the climate over the 21st century can and likely will produce periods of a decade or two where the globally averaged surface air temperature shows no trend or even slight cooling in the presence of longer‐term warming.”

Who to believe? Mother Earth, which is cooling as CO2 rises, thus empirically falsifying the CO2=CAGW [“AGW”] hypothesis? Or Mr Phil again? The planet’s verdict trumps anyone’s contrary opinion. The Earth is telling the truth.

I believe that would be Dr. Phil. Do you have a statistical analysis to support your claim that “Mother Earth” is cooling? A simple t test of a linear regression would suffice.
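The “simple t test of a linear regression” asked for here can be sketched from first principles. The yearly anomalies below are invented purely for illustration; this is not anyone’s actual analysis:

```python
import math

def trend_with_t(t, y):
    """Least-squares slope of y on t, and the t statistic for testing
    slope != 0 (compare against a t distribution with n-2 df)."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sxx
    intercept = ybar - slope * tbar
    resid = [yi - (intercept + slope * ti) for ti, yi in zip(t, y)]
    se = math.sqrt(sum(r * r for r in resid) / ((n - 2) * sxx))
    return slope, slope / se

# Invented anomalies with a clear upward trend:
years = [0.0, 1.0, 2.0, 3.0]
temps = [0.0, 2.1, 3.9, 6.1]
slope, t_stat = trend_with_t(years, temps)
```

A claim of cooling would need a negative slope with a t statistic significant at the usual levels; eyeballing a chart doesn’t settle it either way.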

Up next: deech56 (17:34:56)

“Spencer did provide a decent analysis pointing out the flaws in Lindzen and Choi, but Monckton? C’mon. You don’t actually look to him as an authority, do you?”

Despite Mr deetch’s ad hominem attack against Lord Monckton [as opposed to trying to find any flaws in his scholarly paper], it is a fact that Lord Monckton is every bit as qualified as Gavin Schmidt [also a mathematician], or Rajendra Pachauri [an economist], or Michael Mann [a geologist], and just about all the rest of the alarmist crowd who pretend they are more qualified. They’re not, of course. Monckton is equal to the best of them, and better than most. And unlike the alarmists, Monckton isn’t suckling at the public teat. He pays his own way.
Also, glad to see deech acknowledge Dr Roy Spencer. He should remind Phil that Dr Spencer knows what he’s talking about regarding climate theories.

I asked a question; is that an ad hominem attack? OK, maybe I seemed a bit incredulous, but if I pointed out that Monckton is not a reliable authority on climate, that statement would not be an ad hominem attack – I could easily point to any number of analyses of his writings. Pointing out that his essay was not peer-reviewed is not an ad hominem attack, and neither is Spencer’s criticism of Lindzen and Choi (oh, and if it’s “Dr.” Spencer, that’s “Dr.” Deech. LOL). BTW, I am not necessarily endorsing all of Spencer’s writings – his LC09 critique is cited by people from all viewpoints and he has made a considerable contribution through his MSU analysis, but that doesn’t mean he is right about everything. So you don’t need to pit me against Phil.; he knows what he is talking about.

Finally, even I could find a better citation than the one deech provided by Knutti & Hegerl, who cite only Svante Arrhenius’ 1896 paper, with its high climate sensitivity conclusion – but without ever mentioning the fact that Arrhenius recanted his conclusion a decade later in his 1906 paper, with its much lower sensitivity number. By neglecting to mention Arrhenius’ correction of his earlier views, the K&H citation is little more than alarmist propaganda.

There were 101 references in the list; the Arrhenius and Callendar references were for historical purposes. Alarmist propaganda? You kind of lost me there.

The reason this AGW scam has made it as far as it has is due to the problem deech and Phil have both avoided here: the deliberate stonewalling of information requests by skeptical scientists, and the blanket refusal to cooperate with others attempting to falsify AGW. They don’t care about finding the truth. They only care about the money and status that result from their AGW advocacy.

McIntyre & McKitrick worked long and hard to falsify Mann’s treemometer-based Hokey Stick, when Mann should have cooperated with them from the get-go. The hiding of taxpayer financed data and methods by taxpayer financed scientists is contrary to the Scientific Method. It is based on the assumption that AGW is a fact, and that the data must be hammered into conformance, or fabricated outright, to fit their pre-conceived AGW conclusions.

If Mann and the rest really believed they had solid evidence backing up their AGW hypothesis, they would certainly have produced it. The reason they didn’t is because they knew their hypothesis would be torn to shreds, just like Mann’s Hokey Stick was.

M&M did not falsify MBH98; they criticized the PCA, but we know that other PCAs (and other investigators) confirm their results. But you forget that Mann, et al. 2008 is really the new standard (well, also Mann, et al. 2009) – and Mann, et al. have provided solid evidence for their millennial-scale temperature reconstructions. The new studies build upon and improve the older studies. Science marches on. Of course, this just tells us what temperatures are and were and don’t necessarily speak to the cause of recent temperature increases.

And unless you have more reliable information, you might want to avoid speculating about scientists’ motivation.

Now for a sincere question.
Can anyone direct me to some useful paper/site/etc., or even an outright answer to this –
If we were to take the CO2 right out of the atmosphere, what would our average global temp be based on just pure physics of a black body?

The black body temperature of the Earth is 5.5 °C.[4][5] Since the Earth’s surface reflects about 28% of incoming sunlight[6], the planet’s mean temperature would be far lower – about -18 or -19 °C – in the absence of the effect.[7][8] Because of the effect, it is instead much higher at about 14 °C.[9]

Of course, CO2 is not the only contributor to the greenhouse effect, but without CO2, the earth would be cooler and the water vapor concentration would be much lower. Don’t know the numbers; maybe the references listed in this little blurb would help.
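The -18 to -19 °C figure quoted above follows from the Stefan–Boltzmann law. The sketch below uses typical round values (solar constant ≈ 1361 W/m², albedo ≈ 0.30, slightly higher than the quoted 28%, so the result shifts a degree or two with the exact albedo chosen):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0       # solar constant at Earth, W/m^2 (typical value)
ALBEDO = 0.30    # planetary albedo (roughly the quoted ~28%)

# Effective temperature: absorbed sunlight S(1-A)/4 averaged over the
# sphere balances emitted radiation sigma * T^4.
T_eff = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(f"{T_eff:.1f} K = {T_eff - 273.15:.1f} C")  # ~254.6 K, ~-18.6 C
```

The difference between this effective temperature and the observed ~14 °C mean surface temperature is the greenhouse effect being discussed.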

1) Statement by James Hansen that manmade GHGs have given rise to a radiative imbalance averaging about 0.8 watts/meter squared from 2000 to present, which will “melt the ice, warm the atmosphere and warm the oceans.”
Prediction: From 2000 to present, ocean heat anomaly will increase by approximately 10^23 joules.
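That prediction is easy to sanity-check with back-of-envelope arithmetic (the values below are round approximations, not anyone’s published numbers):

```python
imbalance = 0.8                    # W/m^2, the stated average imbalance
area = 5.1e14                      # m^2, Earth's total surface area
seconds = 10 * 365.25 * 24 * 3600  # one decade, in seconds

# Energy = power flux * area * time
energy = imbalance * area * seconds
print(f"{energy:.2e} J")  # ~1.29e+23 J, i.e. on the order of 1e23 J
```

So a sustained 0.8 W/m² imbalance over a decade does indeed correspond to roughly 10^23 joules, most of which would be expected to end up in the oceans.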

If this particular plank was secure, we would expect more and more studies confirming the hockey stick. In fact, we find just the opposite: hockey sticks are being broken while more and more studies confirm the medieval warm period as a real global phenomenon.

First of all, your claim that AGW rests one way or the other on past variability is not really correct, especially since we don’t have a very good handle on natural forcings over the millennium timescale. Second of all, many studies have in fact confirmed the basic conclusion of Mann et al. that the late 20th century was warmer than the MWP: http://en.wikipedia.org/wiki/File:1000_Year_Temperature_Comparison.png So, modulo issues of how accurate the temperature proxies are (for which I admit there are valid concerns), most of the evidence supports that conclusion.

3) GHG theory predicts that as GHG’s are added to the atmosphere, surface temperatures would increase while outgoing radiation from top of the atmosphere would decrease.

Result: Lindzen & Choi show that this is not happening.

Basically, what this (and some of your other claims) amounts to is: “I can cherry-pick a paper from the peer-reviewed literature (where hundreds, if not thousands, of papers are published in this field each year) that supports my point of view.” Lindzen and Choi has not been out a long time, and there is already a comment in the works arguing that it is fatally flawed. Heck, even Roy Spencer is very skeptical of it!

This is an extreme oversimplification of a complex issue, with various data problems…and, at any rate, does not speak to whether the warming seen is due to AGW or another mechanism, since the amplification is expected for any warming mechanism. (Also, the most direct consequence for the climate models of this amplification not occurring is that they would then be predicting a negative lapse rate feedback that does not seem to be occurring.)

5) Climate models can only hindcast the 1945 – 1976 decline by assuming a high level of human caused aerosol pollution.

Result: A recent paper by Gunnar Myhre concluded that the cooling effect of these aerosols has been overstated by 30%.

Again, that is a claim made in one paper, which you choose to hold up as the gospel because it supports your pre-conceptions. Besides which, I believe that this paper spoke to only the direct effects of aerosols (unless I am confusing it with another paper), and I believe their results were within the IPCC’s stated (admittedly pretty broad) range for this forcing. And, they did not directly address whether their results are in any way inconsistent with the hindcasting (which frankly has enough uncertainties in it that such a revision might not make that much of a difference).

Basically, what you have come up with is a list of reasons to hope that AGW might be wrong, if you want to ignore the mountains of evidence on the other side and just cherry-pick a few results that agree with what you want the answer to be.

“Unless I am mistaken, the “ClimateGate Team” is not the group that studies the effects of AGW on agriculture, etc. You also post the false dichotomy between “cold” and “warm” when we are discussing keeping the range of temperatures around the levels we have been used to over the whole of civilization or changing things around. I don’t get your point about medical research in the 1960s; that was before my research career (unless you count my many microscopic examinations of pond water or fossil collecting), but I am no spring chicken, and I don’t understand why you are lecturing me about science.”

When you read about AGW problems in agriculture, they are primarily based on models. Who provides the basis for the models? Understand?

There was a peer reviewed article mentioned here a year or so ago. The article studied peer review science. What the researchers found was that 80% of all peer reviewed research was found to be invalid after 25 years. So, keep that in mind when you mention peer reviewed science. 80% of it will probably turn out to be worthless (well, closer to 100% in climate science).

My reason for the little lecture has to do with your seemingly total acceptance of peer reviewed climate science. I was simply trying to point you to the problems in your field of study many years ago. If you haven’t looked back, then I suggest you do; it might make you think twice. There’s a reason double blind studies were implemented. Here’s a start.

If your study is based on adjusted data, you should keep a copy of the raw data AND the adjusted data. Hansen uses adjusted data (I don’t know if he asked for the raw data as well). And then adjusts it some more. To be absolutely clean, he should have started with raw data. He would then have reported on what he did to the raw data to adjust it, and then reported on the analysis. He muddied his own work by using adjusted data to start with. If it had been me, I would have required copies of the original raw data sheets be sent to me from whoever keeps them.

By the way, Hansen’s work has not been independently verified. Other research groups have come up with other sets of adjusted and analyzed data that says something different (re: 1934 versus 1998, etc) than what his says. That means one of two things: 1) one of them (or more) is wrong, or 2) they are all wrong.

1. I am aware of Kilimanjaro, but I find it difficult to believe that world-wide glacier retreat is a function of local land-use changes. That’s a lot of local. Since we have strong evidence of multi-decade temperature increases, wouldn’t that be the first place to look?

I shouldn’t have used the word “mostly” to characterize drier conditions and lower snowfall as the primary causes of glacial retreat. (I was misled by thinking of Kilimanjaro as typical.) However, they are likely contributory factors. Of course, the long-term warming trend since the LIA is mostly responsible.

2. OK, so I probably don’t need to bother reading it. Journals publish review articles all the time – the Knutti & Hegerl paper I cited somewhere is one example.

They don’t publish long review articles critical of AGW — not after what happened to the journal that published the review article by Soon and B____. The Team taught those editors a lesson they won’t forget.

As far as the rest, you are asking readers to go by his reputation alone.

Equivocation. There’s a difference between saying an author has enough credibility to be worth reading (which is all that I was claiming) and enough credibility to be accepted as persuasive on trust (which is what you’re conflating it with).

3. Here’s the abstract: ….

Thanks, but he seems to place a lot of stock in the argument that global warming has stopped, but at the same time, writers here are claiming that global warming hasn’t happened.

“Here”? So what? I don’t, and he doesn’t.

This leads to some confusion.

It shouldn’t.

Easterling & Wehner (official ref) showed [“argued” would be a better term — RK] that the long term positive temperature trend features periods of no apparent trend or slight cooling.

Easterling & Wehner would have more credibility if they had made their predictions (hedged their bets) before the current flat trend, rather than after it. (Which I’m guessing is when they made it, because neither of the links you provided work.) Now it just looks like they’re making excuses. If the alarmist consensus had been asked ten years ago to give odds on a ten-year flat trend, I’m sure they would have said something like one in ten, given the way they were carrying on. I.e., instead of predicting a pause in the trend, several were predicting or speculating about an acceleration.

The current flat trend is not a decisive argument against AGW (that’s a strawman), but it does weigh against it. There’s not much room — maybe three or four years or so — in AGW theory (pre-E&W) for pauses and dips. And those are supposed to mostly be due to La Niñas and volcanic eruptions, which have been rare in the Noughties. Everything is supposed to be accounted for by known forcings and a computable heat budget, plus perhaps some latent heat in the pipeline somewhere.

We’re approaching a tipping point. If there’s no significant warming over the next five years, or decisive cooling over the next two or three, it’ll be “back to the drawing board” for AGW. In practical / political terms that’ll mean, “Don’t call us, we’ll call you.”

Even if it warms, I would have to see convincing analysis that rules out natural causes. The current jet stream, and its interaction with El Nino, is warming up the upper western states, as it should, now that the AO is hanging around neutral. Record high temps are being set, but all due to easily explained weather systems. Therefore, you can’t take the averages from these kinds of weather related events and suddenly change the cause to CO2. If you do, you are being disingenuous.

Here is the truth. Short term weather-related temperatures are gathered and averaged over a long period of time. When you put them all into a graph, using whatever averaging technique you want (i.e., linear, running, or Hansenized code), it means that you now have long-term weather temperatures. Period.

By the way, Hansen’s work has not been independently verified. Other research groups have come up with other sets of adjusted and analyzed data that says something different (re: 1934 versus 1998, etc) than what his says. That means one of two things: 1) one of them (or more) is wrong, or 2) they are all wrong.

Pamela, they are all ‘wrong’. They are estimates.

I assume by “1934 versus 1998” you are referring to the year of highest temperature for the continental US. I am not aware that the GISS ranking is different to any other. Could you please state what you think Hansen’s conclusion is, and then cite an alternative view. As far as I am aware, GISS see 1934 as higher than 1998 by a fraction. Where is this gainsaid?

Pamela Gray (18:21:43) :
If your study is based on adjusted data, you should keep a copy of the raw data AND the adjusted data. Hansen uses adjusted data (I don’t know if he asked for the raw data as well). And then adjusts it some more. To be absolutely clean, he should have started with raw data. He would then have reported on what he did to the raw data to adjust it, and then reported on the analysis.

That’s what he did, you can take a look at his code and papers and check if you like. From his 2001 paper on US and global surface temperature change:
“The current GISS analysis of surface air temperature change is available at http://www.giss.nasa.gov/data/update/gistemp. The data set can also be obtained via ftp at ftp@giss.nasa.gov. The previous analysis [Hansen et al., 1999] continues to be available at the GISS web site, but it is not updated each month as the new analysis is.
The USHCN data are available from the NCDC web site at http://www.ncdc.noaa.gov/ol/climate/research/ushcn/.”

He muddied his own work by using adjusted data to start with. If it had been me, I would have required copies of the original raw data sheets be sent to me from whoever keeps them.

By the way, Hansen’s work has not been independently verified. Other research groups have come up with other sets of adjusted and analyzed data that says something different (re: 1934 versus 1998, etc) than what his says. That means one of two things: 1) one of them (or more) is wrong, or 2) they are all wrong.

Re 1934 vs 1998, he has consistently said that they are in a statistical dead heat, as are the NOAA results:
“The U.S. annual (January-December) mean temperature is slightly warmer in 1934 than in 1998 in the GISS analysis (Plate 6). This contrasts with the USHCN data, which has 1998 as the warmest year in the century. In both cases the difference between 1934 and 1998 mean temperatures is a few hundredths of a degree. The main reason that 1998 is relatively cooler in the GISS analysis is its larger adjustment for urban warming. In comparing temperatures of years separated by 60 or 70 years the uncertainties in various adjustments (urban warming, station history adjustments, etc.) lead to an uncertainty of at least 0.1°C. Thus it is not possible to declare a record U.S. temperature with confidence until a result is obtained that exceeds the temperature of 1934 by more than 0.1°C.” http://pubs.giss.nasa.gov/docs/2001/2001_Hansen_etal.pdf

“In order to that I’d need to see this “natural climate variability theory”, what does it say where can I read about it? As far as I can tell it’s ‘anything that happens is a natural variation’ which clearly isn’t a falsifiable scientific theory, even a flat line is covered by the natural variation being ~zero.”

So, let me get this straight. You don’t know what the theory is, yet you state it cannot be falsified. That is pure nonsense. Let me know when you get serious about this topic. These kinds of made-up-on-the-fly comments are wasting everyone’s time.

So evidently you don’t know where this “natural climate variability theory” is described either. I’ve been unable to find such a theory described in scientific terms; until it is, how can it be falsified?
Speaking of wasting time, your unsubstantiated ramblings about the state of medical research in the past and assertions that plants like it warmer don’t bring much to the table.

deech is typical of the climate alarmist attitude when he denigrates scientific skepticism — as if the skeptics’ questioning of AGW dogma, and their requests for full and complete transparency and cooperation in providing data and methods in order to independently test conclusions, is something to be sneered at; deech claims skeptics are not, in his words, ‘actual’ scientists.

The climate alarmist crowd refuses to answer skeptics’ questions for one reason: they know damn well that if they provided full and complete records of their data, methodologies and code, their CO2=CAGW hypothesis would be quickly and publicly debunked. That would cut off their grant gravy train, so they go to extraordinary lengths to stonewall requests for information, saying in effect, “trust us.” But the climategate emails show that they cannot be trusted.

Skeptical scientists are the only honest kind of scientists. Skepticism is at the heart of the Scientific Method. Skepticism is the reason we don’t go to witch doctors when we get sick. It was skeptical doctors who instituted the practice of hand washing, ending the high mortality rate among women giving birth.

In the mid-1800s Dr. Semmelweis questioned the mainstream medical profession’s lack of concern over hospital cleanliness. His questioning of the establishment was perceived as a threat, just as skeptics’ questioning of AGW is perceived as a threat by today’s alarmist establishment.

The clique that controlled medicine at the time responded by heaping scorn and ridicule on Semmelweis for daring to question their methods — exactly like deech does when he writes: “…requiring that “skeptics” arise to the same standards as actual scientists…”.

The clique that currently controls the climate peer review process, GISS, HadCRU and the rest of the climate alarmist industry have only one ‘standard’: keep skeptics out of the process at all costs, because skeptics are a threat to their lucrative grant income.

deech fails to grasp the fact that every honest scientist is a skeptic, first and foremost. It is the devious AGW clique in control that lacks honest standards.

That problem could be easily rectified by simply working with skeptical scientists to falsify the AGW hypothesis if possible, and by publicly archiving all the raw and adjusted data and methods that support the AGW hypothesis. The fact that Hansen, Mann, the CRU crew and the rest of the AGW climate alarmists refuse to disclose their raw data and methods makes it clear that they value the rewards of their rent-seeking politics much more than they value scientific truth.

Dr. Semmelweis was eventually driven out of his practice by the mainstream medical clique, which felt threatened by his questioning of its methods and ignored the fact that he had lowered mortality among women in childbirth at his hospital from 18% to well under 2% in just one year. He was attacked and ridiculed in the same way that today’s AGW skeptics are attacked and ridiculed. In the end the truth won out: Semmelweis is remembered, and the establishment clique is long forgotten.

Pamela Gray shows the right way to investigate AGW claims in her post above:

If your study is based on adjusted data, you should keep a copy of the raw data AND the adjusted data. Hansen uses adjusted data (I don’t know if he asked for the raw data as well). And then adjusts it some more. To be absolutely clean, he should have started with raw data. He would then have reported on what he did to the raw data to adjust it, and then reported on the analysis. He muddied his own work by using adjusted data to start with. If it had been me, I would have required copies of the original raw data sheets be sent to me from whoever keeps them.
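
Pamela Gray’s practice can be expressed as a minimal data-provenance pattern: keep the raw series immutable and log every adjustment applied to it. The class and method names below are hypothetical, purely for illustration:

```python
# Minimal provenance sketch (hypothetical names, illustrative only):
# keep the raw series immutable and record every adjustment applied,
# so anyone can retrace the path from raw data to published numbers.

class TemperatureRecord:
    def __init__(self, raw):
        self._raw = tuple(raw)     # raw data: stored once, never modified
        self.adjusted = list(raw)  # working copy that adjustments act on
        self.log = []              # (description, snapshot after the step)

    def apply(self, description, func):
        """Apply an adjustment to every value and record what was done."""
        self.adjusted = [func(v) for v in self.adjusted]
        self.log.append((description, tuple(self.adjusted)))

    @property
    def raw(self):
        return self._raw

rec = TemperatureRecord([14.1, 14.3, 14.0])
rec.apply("urban-warming offset -0.1 C", lambda v: v - 0.1)
print(rec.raw)        # original values are still intact
print(rec.log[0][0])  # and each adjustment step is on record
```

With this pattern both halves of the comment’s demand are met: the raw sheets survive untouched, and the report on “what he did to the raw data” falls out of the log for free.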

If that had been consistently done, the tens of billions of dollars wasted on the non-problem of AGW would have been available to solve actual, serious problems in other areas of science. And the public wouldn’t be questioning the honesty of scientists in general, because of the mendacious and self-serving actions of a relatively small clique of AGW gatekeepers.

deech56 (06:53:48): So E.M.Smith (05:33:14): Do you actually believe that we’re not in a warming period?

In a chaotic cyclical series, this question without a time frame is meaningless. The answer to your question, grasshopper, is ‘mu’ (whack with slim bamboo cane). You need to become the empty vessel, not present empty questions…

Yes, we are definitely in a cooling trend. It’s been cooling strongly since 6 pm, and now, as we approach 3 am, it is getting quite cold. It’s also been cooling dramatically for about 10,000 years. It’s also been cooling since 3200+ BC, when the Ice Man ended up under a glacier and the tropical ice caps in Peru covered plants so fast they look almost flash-frozen as we dig them out from under the ice today. Oh, and it’s been cooling quite fast since 1998. I can also predict with great confidence that 50,000 years from now Canada will be an ice sheet.

But it’s been warming dramatically from 1817 to today. It’s also been warming since about 600 AD to today. And we have been in a whopper of a warming trend from 50,000 BC to today. I also confidently predict that tomorrow will have a dramatic and unsustainable warming trend from about 7 am until 3 pm or so.

You see, the fundamental flaw under all of the AGW claptrap is the notion that 30 years means anything. It does not. Thirty years, or even 300 years, is not even sand in the hourglass of time. It is less than dust in a summer breeze. ANYTHING we see on a human time scale is meaningless to the planet, for it moves on geologic time scales and to geologic degrees.

CO2 is meaningless. Human activity is meaningless. And “warming trend” is meaningless. We are, always have been, and always will be in both “warming trends” and “cooling trends” all at the same time. It depends only and entirely on how big a window of time you can grasp… and how many at once…
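
The point that “trend” depends entirely on the chosen window is easy to demonstrate: the same series can show a cooling trend, a warming trend, or almost none at all, depending on where it is sliced. A sketch with made-up numbers:

```python
# Sketch of the commenter's point that a "trend" is a property of the
# window you choose, not of the series alone. Data are invented.

def linear_trend(values):
    """Least-squares slope of values against their index (per step)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

# A series that cools for five steps, then warms for five.
series = [1.0, 0.8, 0.6, 0.4, 0.2, 0.3, 0.5, 0.7, 0.9, 1.1]

print(linear_trend(series[:5]))  # early window: negative (cooling)
print(linear_trend(series[5:]))  # late window: positive (warming)
print(linear_trend(series))      # full window: small positive slope
```

One data set, three different “trends”: whether the verdict is warming or cooling is set by the window, which is the commenter’s complaint about treating any 30-year slice as definitive.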

So we will end up in an ice age. The only question is exactly how “soon” it will start. We are presently on a very flat, very unusual ‘ledge’ on the side of one of those mountains of warming, near the top and headed for a tumble. You would like to fret over the minuscule, almost invisible ripple on that ledge and think that somehow we have influence over it. We don’t. We can also see that the peaks are always followed by the plunge, and that past peaks have been higher than now and NEVER produced ‘runaway warming’. There is no runaway warming from the top; only runaway cooling. All you can hope for is to die before it comes (and given that it might take 1,000 years to come, that is a reasonable hope). And if it does start now, you cannot even hope to stop it…

So become the empty vessel and see that all is as it should be. (and you will stop asking empty questions…)