NCDC’s new USHCN hockey stick trick

Yesterday, NCDC released Version 2.5 of the USHCN data set. For those who don’t know, this is the U.S. Historical Climatology Network (USHCN), which NCDC considers a “gold standard” for US temperature measurements. The problem is that the old network is fraught with all sorts of inconsistencies throughout its record: multiple station moves, equipment changes, time-of-observation changes, encroachment by urbanization, and of course faulty station siting. I discovered that only 1 in 10 USHCN stations meets the criteria of NOAA’s 100-foot rule, a finding backed up by an investigation done by the U.S. General Accounting Office (GAO). As a result of the most recent research, we found significant positive biases in the raw data:

Of course the line from NCDC is always that “none of this matters” and that such things can be solved by adjustments. After seeing the differences between the USHCN 2.0 and USHCN 2.5 data sets, I ask: “In what temperature measurement universe does a hockey stick like this occur?” See below.

Since 1987, NCDC has used observations from the U.S. Historical Climatology Network (USHCN) to quantify national- and regional-scale temperature changes in the conterminous United States (CONUS). To that end, USHCN temperature records have been “corrected” to account for various historical changes in station location, instrumentation, and observing practice. The USHCN is a designated subset of the NOAA Cooperative Observer Program (COOP) Network. USHCN sites were selected according to their spatial coverage, record length, data completeness, and historical stability. The USHCN, therefore, consists primarily of long-term COOP stations whose temperature records have been adjusted for systematic, non-climatic changes that bias temperature trends.

Did you know — the National Climatic Data Center periodically improves the quality of the datasets maintained at the center and releases updated versions. Beginning with the September 2012 processing, NCDC will use USHCN version 2.5 for national temperature calculations as well as in other products, including Climate at a Glance and the Climate Extremes Index.

The page at http://www.ncdc.noaa.gov/oa/climate/research/ushcn/ describes some of the procedures, but makes heavy use of references to published papers rather than providing an operational flowchart. This is an impediment to replication, and I suspect that, given the mishmash of references they provided, few if any would be able to replicate the process fully.

Steve Goddard did an additional analysis and showed how the temperature changes over the different phases of the USHCN data:

Also in the monthly report from NCDC is this statement:

The average contiguous U.S. temperature during September was 67.0°F, 1.4°F above the 20th century average, tying September 1980 as the 23rd warmest such month on record. September 2012 marks the 16th consecutive month with above-average temperatures for the Lower 48.

According to the “platinum standard” state-of-the-art U.S. Climate Reference Network (USCRN), which has none of the problems and requires none of the adjustments of the problem-plagued and aging USHCN network, the CONUS monthly average was 66.0°F.

That puts the temperature difference (above normal) at 0.4°F, within the bounds of standard deviation. Ever wonder why NCDC never mentions the new state-of-the-art USCRN data in their State of the Climate press releases but prefers to rely on the old network and its wonky hockey-stick-like adjustments? This is why.

They are heading down a dead-end street. It is only possible to keep adding adjustments for so long before they become blatantly obvious to even the most blinkered observer. Seeing that they’re paid out of the government purse for their work, the looming prospect of criminal fraud charges must be worrying for some of them, but who will be the first to turn State’s evidence?

[SNIP – Mr. Barney, I’ve told you in the past that your vicious comparisons are not welcome here. Even though I’m not a fan of what NCDC does, I’m even less a fan of your categorizations and parallels which you attempt to ascribe Marxist and other connections. Get off my blog – Anthony]

This is from the Joe Biden school of Warmist Alarmism. Where deranged behavior passes for debate.
Does anyone remember this wild temperature extreme? Why, no. Is it reflected in atmospheric satellite temperature readings? Why, no. Near-surface ocean satellite readings? No again. Do any readings in the Southern Hemisphere swing upward like this? No again. Siberian inland readings? No. Alaska inland? No. Greenland Northern? No.

If and when there is a sufficiently calm, rational, and neutral body with sufficient power, recognition, resources and competence, we might hope for a deep ‘official’ audit of adjustments within global and national temperature data sets and algorithms. In the meantime, we can but welcome posts such as these which highlight curious results.

Dishonest is not a strong enough word for these shenanigans. Write your representative and senators on this matter if you deem them honest; most know what to do, and defunding sticks to civility. They can categorize the departments and lobbyists as they appear with hands held out.

I hate to be suspicious, and I haven’t traced this instance down yet to see if it has occurred, but I just bet they gave TX & OK a big upward boost in the latter years, since we have mentioned the flatness more than a few times here on WUWT. That is what I am watching the closest: the adjustments to data cued by simple contrary mentions by commenters here. Has anyone else noticed this tendency? It’s kind of like — shut up or we will make the ‘data’ always prove you a fool (but the old data snapshots tell a different story, like the second graph above).

Probably a lot like the Bureau of Labor Statistics when it was discovered that California’s jobless claims numbers weren’t in their latest report. They “didn’t leave it out”; California didn’t submit it in time.

Fraudsters. Sadly not stupid enough to alter today’s temperatures, notice that the pivot point is today’s temperature.

One wonders what the future holds for these prostitutes of science?

If they succeed in panicking a compliant puppet government (Dimocrats) into destroying the US economy (also here in the UK, OZ, Germany) leaving the population unprepared for a possible mini ice age, the US people may then appear less than grateful.

There are better exit strategies than such a gung-ho, amateurish response; time will tell if they can muster the professionalism and necessary sense of judgement to formulate and execute a more sophisticated exit strategy.

Temperatures are currently plateaued for a year or three at the top of the solar cycle; after that, down is the only possible direction. A few years later the s89t should start to hit the fan.

I have a question. These “adjustments” have obviously been made to past records. Are they only to the final result or have they been made to the individual readings to obtain the final result?
As I’ve mentioned before, I have the records for where I live from 2007, 2009 and 2012 and they don’t match up. Have they messed with the individual records to produce yet another “hockey stick” or have they mainly messed with the final result? A combination of the two?
Again, this is a question. Do we know what the method is?

You mean the actual code doing those adjustments (with full documentation) is not published?

As code development is paid for by (American) taxpayers’ money, it can’t possibly be protected by copyright, can it? Nor do I think it was classified. But even if it was, there can be no justification for that move whatsoever, so it can be challenged in court.

On the other hand, if none of the above applies, the code base, full docs included, is subject to FOIA.

That’s what you should go for. However, be wary and never let them get away with giving out low-quality scanned images of a printout, as they are inclined to do just that. Go for a full digital copy instead, for no one wants to bother with a tedious OCR job.

Once we have it in the open (let’s say in a public repo on GitHub), it is only a matter of time and effort to uncover the full diverse set of hidden tricks that make the adjusted dataset trend up faster than even its worst subset does. The job is definitely doable, and I am quite confident there are plenty of qualified experts on the net willing to donate their time to such an extensive code audit.

A professional report to be published online following such an effort, including all drafts in a timely manner, with proper version control in place, would be a bomb. Especially its executive summary for policy makers and the press release about it.
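If such an audit ever gets off the ground, the very first pass is almost mechanical: compute the linear trend of the raw series and of the adjusted series and see how much of the headline trend lives in the adjustments. A minimal sketch in Python — the station values and the adjustment profile below are invented purely for illustration, not taken from any NCDC file:

```python
import numpy as np

def decadal_trend(years, temps):
    """Least-squares linear trend, returned in degrees per decade."""
    slope, _ = np.polyfit(years, temps, 1)
    return slope * 10.0

# Hypothetical illustration: a trendless raw series vs. an adjusted
# series whose adjustment cools the past and pivots at the present.
years = np.arange(1900, 2011)
raw = 52.0 + 0.3 * np.sin(years / 8.0)           # oscillation, no underlying trend
adjustment = np.linspace(-0.5, 0.0, years.size)  # -0.5 F in 1900, 0 F today
adjusted = raw + adjustment

print(f"raw trend:      {decadal_trend(years, raw):+.3f} F/decade")
print(f"adjusted trend: {decadal_trend(years, adjusted):+.3f} F/decade")
```

The difference between the two printed trends is exactly the slope of the adjustment profile, which is the quantity an auditor would want to tabulate station by station.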

It’s all about showing significant positive biases. If Nature won’t co-operate, then you just have to massage the observations until they are correct. It is to be hoped that eventually people will notice that the weather is cold and the observations are hot, as in “fell off the back of a truck”. Then their funding, borrowed from China, can be cut and they can get a real job at a power station, feeding out the electricity to warm people up again.
Here in Australia it just snowed in States from tropical Queensland to the wilderness above Adelaide in South Australia!

ntesdorf says:
October 13, 2012 at 1:35 pm
… Here in Australia it just snowed in States from tropical Queensland to the wilderness above Adelaide in South Australia!
======================================================
I bet it was “warm” snow.

Their “homogenization” algorithms seem to have decided that heating effects from poor station siting are much less than cooling effects from a change-over to electronic sensors. How much work would it take to prove or disprove this…?
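Less than you might think, at least for a first pass. The core idea behind pairwise homogenization is just comparing a target station against its neighbors and hunting for step changes in the difference series. Here is a toy sketch with synthetic data and a deliberately crude breakpoint search — NCDC's actual pairwise algorithm is far more elaborate, so this only illustrates the principle:

```python
import numpy as np

def step_change(target, neighbors):
    """Find the breakpoint that best splits the target-minus-neighbors
    difference series into two levels; return its index and size.
    (A crude two-segment least-squares search, for illustration only.)"""
    diff = target - neighbors.mean(axis=0)
    best_k, best_sse = None, np.inf
    for k in range(2, len(diff) - 2):
        sse = ((diff[:k] - diff[:k].mean()) ** 2).sum() + \
              ((diff[k:] - diff[k:].mean()) ** 2).sum()
        if sse < best_sse:
            best_sse, best_k = sse, k
    return best_k, diff[best_k:].mean() - diff[:best_k].mean()

rng = np.random.default_rng(0)
neighbors = rng.normal(60.0, 0.2, size=(5, 40))        # 5 nearby stations, 40 years
target = neighbors.mean(axis=0) + rng.normal(0, 0.1, 40)
target[25:] -= 0.6                                      # simulated sensor change in year 25
k, size = step_change(target, neighbors)
print(k, round(size, 2))
```

Whether the resulting corrections for sensor change-overs should really dwarf the siting-bias corrections is then an empirical question about the sizes of the detected steps, which is exactly what a reanalysis could check.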

“…Moreover, the sign of the bias is counterintuitive to photographic documentation of poor exposure because associated instrument changes have led to an artificial negative (“cool”) bias in maximum temperatures and only a slight positive (“warm”) bias in minimum temperatures. …”

I see the point they’re trying to make…but I don’t buy it…but I also don’t have the time to try to debunk this…I’ll be curious to see what others can come up with.

That puts the temperature difference (above normal) at 0.4F, within the bounds of standard deviation.
——–
If that temperature difference was derived by subtracting the averages of two different temperature networks, that would be the wrong thing to do, wouldn’t it?
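It would be, if the two networks' absolute averages were compared directly, since the networks sample different stations, elevations, and microclimates. The usual remedy is to compare each network's departure from its own baseline. A toy illustration in Python — the USHCN baseline is implied by the post's figures (67.0°F stated as 1.4°F above average), but the USCRN baseline is invented here purely so the example reproduces the post's 0.4°F number:

```python
# USHCN September figures from the post: 67.0 F, stated as 1.4 F above
# its 20th-century average, implying a 65.6 F baseline.
# The USCRN baseline is a hypothetical value for illustration; the real
# USCRN record is far too short to have a 20th-century baseline at all.
ushcn_baseline, ushcn_month = 65.6, 67.0
uscrn_baseline, uscrn_month = 65.0, 66.0

absolute_gap = ushcn_month - uscrn_month   # mixes climate with station composition
anomaly_gap = (ushcn_month - ushcn_baseline) - (uscrn_month - uscrn_baseline)
print(round(absolute_gap, 1), round(anomaly_gap, 1))
```

Whether 0.4°F is “within the bounds of standard deviation” then depends on the spread of the anomaly series, not on the absolute readings of either network.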

Kit Carson? Isn’t that one of those unCOOPerative stations? Seems akin to the one next door in Kansas, Dodge City I think, that recorded 121F an hour after midnight! Maybe we have someone at NOAA trying to write a bit of the old Wild West back into the temperature records. 8-)

this is truly the beginning of the end…much more at the link, including Met Office dissembling:

13 Oct: Daily Mail: David Rose: Global warming stopped 16 years ago, reveals Met Office report quietly released… and here is the chart to prove it
The figures reveal that from the beginning of 1997 until August 2012 there was no discernible rise in aggregate global temperatures
This means that the ‘pause’ in global warming has now lasted for about the same time as the previous period when temperatures rose, 1980 to 1996
The world stopped getting warmer almost 16 years ago, according to new data released last week…
The new data, compiled from more than 3,000 measuring points on land and sea, was issued quietly on the internet, without any media fanfare, and, until today, it has not been reported…
Some climate scientists, such as Professor Phil Jones, director of the Climatic Research Unit at the University of East Anglia, last week dismissed the significance of the plateau, saying that 15 or 16 years is too short a period from which to draw conclusions.
Others disagreed. Professor Judith Curry, who is the head of the climate science department at America’s prestigious Georgia Tech university, told The Mail on Sunday that it was clear that the computer models used to predict future warming were ‘deeply flawed’.
Even Prof Jones admitted that he and his colleagues did not understand the impact of ‘natural variability’ – factors such as long-term ocean temperature cycles and changes in the output of the sun. However, he said he was still convinced that the current decade would end up significantly warmer than the previous two…
The regular data collected on global temperature is called Hadcrut 4, as it is jointly issued by the Met Office’s Hadley Centre and Prof Jones’s Climatic Research Unit…
Not that there has been any coverage in the media, which usually reports climate issues assiduously, since the figures were quietly released online with no accompanying press release – unlike six months ago, when they showed a slight warming trend…
Like Prof Curry, Prof Jones also admitted that the climate models were imperfect: ‘We don’t fully understand how to input things like changes in the oceans, and because we don’t fully understand it you could say that natural variability is now working to suppress the warming. We don’t know what natural variability is doing.’…
http://www.dailymail.co.uk/sciencetech/article-2217286/Global-warming-stopped-16-years-ago-reveals-Met-Office-report-quietly-released–chart-prove-it.html

Brilliant work! D’you suppose, if I asked them very nicely, they’d send me a pack of their “special” algorithms? Then I could apply them in reverse to my gruelling uphill slog home from the shops, and make it an easy downhill cruise. Or do these dishonest fiddles (as most of us seem to think) just not apply to the real world? (i.e. they’re Al Gore-ithms.)

Already those adjustments are skewed from reality. Surely this couldn’t go on much longer; global temps are cooling and heading down, and they are still adjusting these temps up, totally against the trend.

I greatly admire Anthony for his tenacious efforts to demonstrate how the National Climate Data Center continues to use highly questionable data and processes to produce statistics that support the global warming agenda. Anthony, you are a work horse (in the finest sense) and hero of mine. However, when all is said, posted, reported, touted, argued and done, I fear it is all for naught. I give my perspective on temperature records in a new blog on my website. The blog can be reached directly at http://www.kusi.com/story/19778143/a-blog-about-temperatures
The final line of that lengthy blog is

“I conclude the temperature data does not prove global warming. The alarmists are wrong. But the temperature data is so unreliable and garbled that neither alarmists nor skeptics can use it to conclusively prove they are right.”

Two things:
1) there are at least 4 different phases here, all of which might need individual correction.
2) given the list of growing problems with these sites, how does one end up with a positive correction in the latter decades?

To be honest the only part of this that looks like it might be valid is the period up to 1940.

If Romney wins, that will be the time to write to your Congressman, Senator and to the White House, requesting that new people be put in charge of the NCDC.

The record has to be fixed now and we need real statisticians and people from national statistical agencies to take over. No climate scientists should work at the NCDC until the motivation to exaggerate the record is removed.

Like Prof Curry, Prof Jones also admitted that the climate models were imperfect: ‘We don’t fully understand how to input things like changes in the oceans, and because we don’t fully understand it you could say that natural variability is now working to suppress the warming. We don’t know what natural variability is doing.’…”
===============================================================
Just thought I’d point this out in case Bob Tisdale missed it.
Maybe Prof. Jones should seek his advice?

ntesdorf says
Here in Australia it just snowed in States from tropical Queensland to the wilderness above Adelaide in South Australia!
————
I heard about that too. I love snow; sorry I missed it.

Now where I am in the southern part of Australia it’s not cold at all. Outside it’s a brilliant blue sky and the sunlight is filtering through the sparkling greenery. Nice and toasty, a perfect spring day. How’s that for evidence of global warming?

in which he shows that of 5 cold stations and 1 warm station, the 5 were adjusted up to match the warm one! The very opposite of what should have happened. This is not science; these are people with a very destructive agenda. Are there any honest scientists at NCDC?

NCDC, GISS, NOAA, all need to be closed down, all staff fired, all jobs re-assessed, and re-advertised, with an especially careful vetting and recruitment process to completely start afresh with a clean slate.

The US government should take a Tabula Rasa approach to the clearly obvious corruption of science which is being increasingly practised at these US government and other establishments.

If it turns out that the Sun is going through a brief quiet period, after which temperatures storm on up to a repeat of the medieval optimum, then the CO2 scaremongers will have won and will bring about the destruction of the Western economy. At least it’ll be warm.

But if, as seems more likely, the Sun is entering an extended quiet period, then we have an interesting race on our hands: can the CAGW prophets and acolytes destroy the Western economies before seriously declining temperatures bring the politicians to their senses?