Now, since I don’t know the real reasons behind such interesting choices in gauge placement, I’ll just ramble a bit…

Can the people who place/monitor/operate these temperature stations really be that sloppy? That incompetent? Are they too lazy to walk a few extra yards to a well-sited station? Too lazy to install some form of data transmission device? Or is it just an “Any data that fits the current thinking is good data and won’t be questioned…” type of post-modern “science”?

Maybe all the available money went into the supercomputers (to crunch up new models) and there’s none left to do a decent job of collecting real data?

Is there any reason that we should consider the surface record to be worth anything?

Roger Pielke Sr. has posted an interesting article that takes what Anthony Watts says in a Fox News article and adds an important point about the reliability of the thermometer record.

As background: the surface station network used to measure temperature trends has come in for some richly justified criticism along the lines of missing data, station changes and moves, stations positioned near warm things (e.g., air conditioner outlets, BBQ pits, etc.), operator error, and careful massaging of the available data.

A generally accepted number for the amount of warming over the last 100-odd years is 0.6 ± 0.2 °C. According to surfacestations.org, a lot of the stations used to measure temps have an error of 2 °C or more.

So how the heck do you get an error bound of ±0.2 when your thermometer has an error of ±2.0?

Apparently, if you throw a bazillion measurements into the mix, the errors tend to cancel each other out. As an extreme example, consider this: an atom (or a molecule) is mostly empty space, and subatomic particles tend to move somewhat randomly. Yet a baseball bat is very solid, very predictable, and makes a nice noise when you hit a homer.

So maybe if we had a million (or several million, or maybe 50,000) thermometers scattered all over the world, their errors would tend to cancel (some high, some low) and you could get a small error bound, like tenths or hundredths of a degree.
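Here’s a minimal sketch of that intuition in code (hypothetical numbers: a true temperature of 15 °C and purely random ±2 °C thermometer errors). The spread of the averaged error shrinks roughly as sigma divided by the square root of the number of thermometers.

```python
import random
import statistics

def mean_error_spread(n_thermometers, sigma=2.0, trials=500):
    """Simulate averaging n thermometer readings of a true 15.0 C
    temperature, each with a random error of standard deviation sigma;
    return the spread (std dev) of the averaged error across many trials."""
    errors = []
    for _ in range(trials):
        readings = [15.0 + random.gauss(0, sigma) for _ in range(n_thermometers)]
        errors.append(statistics.mean(readings) - 15.0)
    return statistics.stdev(errors)

random.seed(0)
for n in (10, 100, 1000):
    # Spread shrinks roughly as sigma / sqrt(n): about 0.63, 0.2, 0.06
    print(n, round(mean_error_spread(n), 2))
```

The catch is that this only works for random, independent errors; it does nothing for errors that push in one direction.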

But what if there are only a few hundred? Or a few dozen? And what if your calculation software adds a small error of its own? Or isn’t perfectly programmed and adds some other error? You certainly won’t get tiny error ranges.

I ask “Where’s the Beef?” and folks offer Holy Hypothetical Cows

Whenever I’ve raised the issue of precision and accuracy drift in GIStemp, the discussion has ended up with folks offering all sorts of reasons why hypothetically you can get a gazillion bits of precision out of a large average of a bazillion things. Then I point out that we have only, at most, 62 values going into the monthly mean (and that done in 2 steps, with opportunities for error and accuracy drift). And then those values are used for all sorts of other calculations (homogenizing, UHI “correction,” weighting, all sorts of things) before they ever approach the point where they are finally turned into “anomalies.” Even then, the method used does not always compare a station with itself. It is more a “basket of oranges” to a “basket of apples.” (And sometimes there is as little as ONE station forming the “anomaly” for a given GRID box…)

Still, the Hypothetical Cow gets trotted out on stage each time the issue is raised. A Hypothetical Cow, we are told, has near infinite accuracy and precision due to the central limit theorem and the law of large numbers (which, in hypothetical land, can even be applied to small groups of real numbers…)
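To put rough numbers on that objection, here is a back-of-the-envelope sketch using the ±2 °C per-reading error and the at-most-62-readings-per-monthly-mean figures from above (the +0.5 °C station bias is an invented illustration, not a measured value):

```python
import math
import random
import statistics

SIGMA = 2.0  # per-reading random error (deg C), per the surfacestations.org figure
N = 62       # at most 62 readings go into a GIStemp monthly mean

# Even granting purely random, independent errors, the standard error
# of a 62-reading mean is sigma / sqrt(N) -- about a quarter degree,
# not hundredths of a degree:
print(round(SIGMA / math.sqrt(N), 2))  # 0.25

# And a systematic bias (say, a nearby AC outlet adding ~0.5 C to every
# reading) never cancels, no matter how many readings are averaged:
random.seed(0)
bias = 0.5
readings = [15.0 + bias + random.gauss(0, SIGMA) for _ in range(100_000)]
print(round(statistics.mean(readings) - 15.0, 2))  # stays near +0.5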

Jim Hansen, Chief Alarmist of GISS, says, “…the US time series which (US covering less than 2% of the world) is so noisy and has such a large margin of error that no conclusions can be drawn from it at this point.” Keep in mind that the US series is the gold standard, which means the rest of the world’s measurements are in worse shape. The same article points out that current temps, as measured by the surface record, are not significantly warmer than the 30s & 40s.

Jones says: “The major datasets mostly agree. If some of our critics spent less time criticising us and prepared a dataset of their own, that would be much more constructive.”

How can it be constructive if all criticism of Jones’s (and other warmists’) work is universally derided as “flat earth thinking”? What really needs to happen is that the proponents of warming need to be a lot more respectful of the skeptics than they are.

Also, the critics have done a lot of work with the data, and it’s all over the web (Chiefio has a lot of it). Secondly, if Jones’s work were robust, the criticism wouldn’t be an issue.

Lastly, far more work shows the MWP as at least as warm as today, and maybe as much as 2 °C (or more) warmer. Very few papers beyond the discredited Hockey Sticks show that it was cooler.

So says Anthony Watts of Watts Up With That? He and others, especially E.M. Smith, aka “Chiefio,” have put together a PDF that analyzes the surface temp record (it isn’t pretty) and includes a one-page summary for policy makers.

To summarize the overall issue (noting that people are still debating whether each point actually matters):

Many of the measurement stations used are very poorly placed.

Stations are moved, equipment is upgraded or changed, standards change, data doesn’t get recorded, doesn’t get used, or is lost, etc.

A bias, for whatever reason, has been shown that cools pre-1960 temps and warms post-1960 temps. This, of course, has the effect of exaggerating any detected warming.

Heat island effects seem to be poorly corrected.

75% of the stations are no longer used in the temp record. Coincidentally, most of these stations were in cooler/rural areas.

And on and on.
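The pre-/post-1960 bias point is easy to illustrate with invented numbers: even a record with zero underlying trend picks up a warming slope once the early years are cooled and the late years warmed. A sketch (the flat 10 °C record and the ±0.3 °C adjustment sizes are made up for illustration):

```python
# Invented numbers: a perfectly flat raw record, then the kind of
# adjustment described above (cool pre-1960, warm post-1960).
years = list(range(1900, 2000))
raw = [10.0] * len(years)  # pretend the raw record is flat
adjusted = [t - 0.3 if y < 1960 else t + 0.3 for y, t in zip(years, raw)]

def trend_per_century(ys, ts):
    """Ordinary least-squares slope, scaled to deg C per 100 years."""
    n = len(ys)
    my, mt = sum(ys) / n, sum(ts) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(ys, ts))
             / sum((y - my) ** 2 for y in ys))
    return slope * 100

print(round(trend_per_century(years, raw), 2))       # 0.0 -- no trend in the raw data
print(round(trend_per_century(years, adjusted), 2))  # ~0.86 deg C/century from adjustments alone
```

A step adjustment of 0.6 °C placed at 1960 in a 100-year record translates into an apparent trend of almost 0.9 °C per century, larger than the whole claimed warming.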

From WUWT:

As many readers know, there have been a number of interesting analysis posts on surface data that have been on various blogs in the past couple of months. But, they’ve been widely scattered. This document was created to pull that collective body of work together.

Of course there will be those who say “but it is not peer reviewed” as some scientific papers are. But the sections in it have been reviewed by thousands before being combined into this new document. We welcome constructive feedback on this compendium.

8/26/14: Found this (referenced on G+: ) – Atmospheric Carbon Dioxide and Aerosols: Effects of Large Increases on Global Climate.

Effects on the global temperature of large increases in carbon dioxide and aerosol densities in the atmosphere of Earth have been computed. It is found that, although the addition of carbon dioxide in the atmosphere does increase the surface temperature, the rate of temperature increase diminishes with increasing carbon dioxide in the atmosphere. For aerosols, however, the net effect of increase in density is to reduce the surface temperature of Earth. Because of the exponential dependence of the backscattering, the rate of temperature decrease is augmented with increasing aerosol content. An increase by only a factor of 4 in global aerosol background concentration may be sufficient to reduce the surface temperature by as much as 3.5 °K. If sustained over a period of several years, such a temperature decrease over the whole globe is believed to be sufficient to trigger an ice age.

8/17/14: Our Climategate buddies told us that they wanted to get rid of the 1940’s temperature blip, without even knowing why it existed. So they did.

5/12/13:

In 1998, Hansen wrote: “it is clear that 1998 did not match the record warmth of 1934, which occurred during the Dust Bowl era.” Here.

Update 4/29/13:

It’s necessary for NCDC to heavily adjust temps to continue the alarm. The high temps in its published reports are far from what’s shown by their data. The animation below switches between the measured data, and the published temperatures.

He (James Hansen) wrote this in 1999, before he completely turned to the dark side of fraudulent science.

Empirical evidence does not lend much support to the notion that climate is headed precipitately toward more extreme heat and drought. The drought of 1999 covered a smaller area than the 1988 drought, when the Mississippi almost dried up. And 1988 was a temporary inconvenience as compared with repeated droughts during the 1930s “Dust Bowl” that caused an exodus from the prairies, as chronicled in Steinbeck’s Grapes of Wrath…..

in the U.S. there has been little temperature change in the past 50 years, the time of rapidly increasing greenhouse gases — in fact, there was a slight cooling throughout much of the country

Update 2/4: More from that tireless Goddard fellow. Far more heat records set in the 30s than this last year. Now one could probably quibble about averages vs record setting (ie: a high average might not mean more records) or about this just being US temps. Funny how such scrutiny isn’t given to hysteria supporting the warming meme. 40% Of US All-Time Record Maximums Were Set During The 1930s

Update 1/25: PDF of a German study of doctored GISS records. It’s all in German, but the conclusion (on the last page) is in English. Here’s the original and here’s the local copy. Some people are having difficulty loading the original file, but the local copy seems to work just fine.

NY Times hysteria about melting ice over the last century, WUWT comment.

More headlines, from various sources. Part one of that article is here and discusses the “Global Warming” proposition.

An issue is that pretty much all of the warming observed over the last hundred years or so seems to be the result of adjustments to the historical temperature record. This image tracks some of the NOAA adjustments: temps were modified to be cooler in earlier years and warmer in later years. The graph shows the magnitude of the adjustments.

This is for the wags who whine that the US is only 2% of the Earth’s surface area, so doesn’t really count: Just 2% Of The Planet. More on the Bad Weather page.

Update 7/22/11:

From Steve Goddard: Prior to the year 2000, the two hottest years in the GISS US record were 1934 and 1921, respectively. 1998 was the third hottest year, and more than half a degree cooler than 1934.

After adjustments, 1998 became warmer. Here’s a blink graph, from Steve’s site, comparing the two. Click the image for a slightly bigger (and complete) picture.

Update: another post on the subject, with lots of links in the comments to more interesting stuff.

In 2000, USHCN apparently wasn’t happy with the fact that the 1930s was the warmest decade – so they gave the past a demotion and bumped the 1990s way up….

Recently there’s been some news about some fudging of the numbers in the land based temperature data used to determine which way the climate is heading. This isn’t the same thing as another manipulation of the temperature record, though it may be related…

The Cricket discusses number fudging

Check out any temperature record going back to the beginning of the 1900s and you will see a bump in the 30s and 40s; 1934 was the hot point of that bump. You’ll even see this in the various (discredited) hockey stick graphs. Of course, pretty much all of those sticks and other graphs show current temps to be significantly warmer than the 30s.

So why is this interesting? Because, depending on a couple of things, it seems that 1934 was actually about as warm as 1998. Which pretty much kills off any hockey stick. (Update note: this is for US data, but since the US data has been called the gold standard for temp data, I think my remarks still apply, since they are about manipulation, even though ’34’s warmth may have been regional.)

How did ’34 get so warm (or cool off)? A few years back, 1934 was originally shown to be as warm as 1998; then NASA adjusted the data to show 1998 as warmer. Then Steve McIntyre, of Climate Audit, pointed out some errors and the “official” temperature data was adjusted again to show 1934 as the high point. In fact, only 3 of the top ten years, from that set of data, are from recent decades.

Naturally the alarmist crowd said that the adjustments (boosting 1934) were insignificant. CA has more discussion on this issue, including this one: Does Hansen’s Error “Matter”?

The temps have been adjusted again, a few times, since then. This is why you generally won’t see graphs showing ’34 to be close to ’98. In fact, here’s a graph that shows the competition between 1934 and 1998 as the warmest years. Looks like someone can’t make up their minds…

The Air Vent has more detail on this, noting that ’34 has been shown as much as 0.5 °C higher than ’98. That doesn’t sound like much until you realize that the entire trend from 1895 to now is about 0.6 °C.

A quick detour:

Ok, there are a LOT of issues with the land-based temp records. SurfaceStations.org does a nice job of questioning the quality of these stations: location quality, station quality, areas covered (and not covered), recording error, heat island effects, physical moving of the stations, physical changes to the stations (paint, screens, gear, etc.), and probably more.

There are also questions of data manipulation shown in the Climategate mess (including the loss of raw data) and some other emails that show NASA manipulation of data. Currently it looks like the surface temperature record might be called unreliable.

End of detour

Let’s pretend that the surface record is basically good and get back to the adjustments. Note the above linked table, on CA, of the top 10 warm years.

Now, the real reason that this adjustment stuff is interesting is this: if it can be shown that the 30s were as warm as the ’90s, then:

the Catastrophic Warming idea is dead.

The world may well be warming, generally, but Man’s part in it is tiny; any warming, long or short term, is due to natural causes, and there has been no increase in warming since the 30s peak.

Al Gore (and all the other hysterics) and the UN IPCC are selling pure fantasy.

With there being no there there, grant monies for AGW research will fall precipitously.

Not to mention that all of the CO2 power and money grabs, real and planned, are reduced to nothing more than power and greed.

There’s a major vested interest, by many parties, in maintaining the idea that the last few years were definitely warmer than the 30s. Sales of Al Gore’s books and movies, for example.

Are current times warmer? That’s very hard to say. The satellite records just don’t go back that far. They only go back to a time when the news media was pumping “the coming ice age.” Satellites can’t say a thing about 1934. The surface records, which should be able to say something about ’34, are iffy, at best.

So if you believe in AGW, and/or that the keepers of the data are 100% honest and doing the best they can with an iffy product, then ’98 was warmer than the ’34 peak and the modern warming took us to a higher level than that reached in the 30s and 40s. That doesn’t say that people were responsible for that warming, but it opens up the possibility.

If you believe that the data has been fudged and that 1934 was about as warm as the ’98 peak, then the more recent warming is just a rebound from the ’50s and ’60s dip, not an increase from ’34. This means that there is no longer-term warming and Mr. Gore should get a new job.

Update:

Another comparison of the 30s/40s to modern times. It also explains why there was “concern” about the coming ice age in the 70s. BTW, every time I see that second graph, the 40s hump is a bit smaller and the modern hump is bigger. Now imagine the earlier hump being the larger one. That’s what the data looked like before the various “adjustments.”

Update:

another page that makes a similar point:

NASA’s James Hansen is the United States’ leading scientific alarmist about global warming. He believes global warming is accelerating. Apparently it’s his revisions of the data that are causing the acceleration.

This document examines the historical revision in the global temperature change as defined by Hansen over the decades. Hansen’s global temperature graphs are examined from 1981 to 2007.