GIStemp, as its first act, takes several other data sets and glues them together. Here we will take a look at the most important one. The others are either a near subset (USHCN – the U.S. Historical Climate Network) or bits and pieces from odd places, like the bits from the Antarctic. In most contexts, the GHCN data are called “v2” in the code, short for “version 2”.

The “HCN” part stands for “Historical Climate Network” but really means the land-based weather stations – that is, not the satellites. “US” means inside the U.S.A. and “G” means the whole globe. Both GHCN and USHCN come from NOAA, but they have slightly different ‘correction’ histories. You may choose to download minimally corrected data or data with more corrections for things like “TOB” – Time of Observation Bias – or “UHI” – Urban Heat Island effect.

So, to understand GIStemp, we have to take a look at GHCN data. Here is the “README” file from the GIStemp download:

May 1997

This is a very brief description of GHCN version 2 temperature data and

So, there you go. Some pretty good pointers to where to get bits and what they mean. But what about these “read.inv.f” and “read.data.f” programs it mentions? Well, I didn’t see them. But I did see one named “v2.read.data.f” that seems to do the same thing.

The comment block from down in the guts of that program does a nice job of telling you what the fields are:

c ipop=population of the small town or urban area (needs to be multiplied
c      by 1,000). If rural, no analysis: -9.
c topo=general topography around the station: FL flat; HI hilly,
c      MT mountain top; MV mountainous valley or at least not on the top
c      of a mountain.
c stveg=general vegetation near the station based on Operational
c      Navigation Charts; MA marsh; FO forested; IC ice; DE desert;
c      CL clear or open;
c      not all stations have this information in which case: xx.
c stloc=station location based on 3 specific criteria:
c      Is the station on an island smaller than 100 km**2 or
c        narrower than 10 km in width at the point of the
c        station? IS;
c      Is the station within 30 km from the coast? CO;
c      Is the station next to a large (> 25 km**2) lake? LA;
c      A station may be all three but only labeled with one with
c        the priority IS, CO, then LA. If none of the above: no.
c iloc=if the station is CO, iloc is the distance in km to the coast.
c      If station is not coastal: -9.
c airstn=A if the station is at an airport; otherwise x
c itowndis=the distance in km from the airport to its associated
c      small town or urban center (not relevant for rural airports
c      or non airport stations in which case: -9)
c grveg=gridded vegetation for the 0.5×0.5 degree grid point closest
c      to the station from a gridded vegetation data base. 16 characters.
c A more complete description of these metadata is available in
c      other documentation
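The coded values above can be decoded mechanically. Here is a minimal Python sketch with the code tables transcribed straight from that comment block; it only decodes individual field values – the actual fixed-width column layout of a station record lives in the read format inside v2.read.data.f and is not assumed here:

```python
# Decode a few of the v2.temperature.inv metadata codes described in the
# comment block above. Code tables are transcribed from that block; the
# record's column layout is NOT assumed -- see v2.read.data.f for that.

TOPO = {"FL": "flat", "HI": "hilly", "MT": "mountain top",
        "MV": "mountainous valley (at least not on a mountain top)"}
STVEG = {"MA": "marsh", "FO": "forested", "IC": "ice",
         "DE": "desert", "CL": "clear or open", "xx": "unknown"}
STLOC = {"IS": "small island (< 100 km**2 or < 10 km wide)",
         "CO": "within 30 km of the coast",
         "LA": "next to a large (> 25 km**2) lake",
         "no": "none of the above"}

def decode_ipop(ipop):
    """Population is stored in thousands; -9 means rural / no analysis."""
    return None if ipop == -9 else ipop * 1000

def decode_airstn(flag):
    """'A' marks an airport station; 'x' marks everything else."""
    return flag == "A"

print(decode_ipop(107))    # -> 107000
print(decode_ipop(-9))     # -> None (rural)
print(TOPO["MV"])
print(decode_airstn("A"))  # -> True
```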

Unfortunately, it does not tell you just what that ‘other documentation’ might be nor where to find it…

The station data are in a file named “v2.temperature.inv” which has things like a station ID number, a name, latitude, longitude, kind of ground cover, etc. A significant part of GIStemp STEP0 is devoted to gluing together this station data with the temperature history (stored by ID number only).

In my opinion, it would be far better to load all the temperature and station data into a simple relational database than to jump through all the hoops that GIStemp does. That would eliminate much of the confusion and greatly simplify the code.
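As a sketch of what that relational approach would look like, here is a minimal SQLite example: station metadata and temperature readings go into two tables keyed by station ID, and a single JOIN replaces the ID-matching pass over flat files. The table layout, column names, and sample data here are illustrative, not GIStemp's:

```python
# Minimal sketch of the relational approach: station metadata and
# readings in two SQLite tables joined on station ID. Schema and sample
# data are illustrative only, not GIStemp's actual layout.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE station (id TEXT PRIMARY KEY, name TEXT, "
            "lat REAL, lon REAL)")
con.execute("CREATE TABLE reading (station_id TEXT, year INTEGER, "
            "month INTEGER, temp_c REAL)")

con.execute("INSERT INTO station VALUES "
            "('10160355000', 'SKIKDA', 36.93, 6.95)")
con.executemany("INSERT INTO reading VALUES (?, ?, ?, ?)",
                [("10160355000", 1988, m, 10.0 + m) for m in range(1, 13)])

# One JOIN replaces the ID-matching glue pass of STEP0:
rows = con.execute(
    "SELECT s.name, r.year, r.month, r.temp_c "
    "FROM reading r JOIN station s ON s.id = r.station_id "
    "WHERE r.month = 1").fetchall()
print(rows)  # -> [('SKIKDA', 1988, 1, 11.0)]
```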

The USHCN.v2 data are far more heavily “adjusted”, so there is more “warming of the historical trend” from re-writing the past to be colder; but at least now we are using more of the USA thermometers. Now all they need to do is put back the 85% or so of the thermometers in the rest of the world that were deleted from the record from 1990 or so to date (yet left in the baseline periods…)

I have heard a report that the GHCN data set will be “improved” with the same “adjustment” method put into USHCN.v2 and we will have to wait and see if this too increases the warming slope for the rest of the world…

6 Responses to GHCN – Global Historical Climate Network

As you will have seen from my postings on WUWT I have a particular interest in history, and, in the context of your blog here, the nonsense that is ‘global temperatures since 1850.’ I am aghast at the idea that so few stations – 102 in 1850, many known to be unreliable – are being used to chart our climate since that date. Since then the numbers have waxed and waned more often than the Moon!

I have put together a selection of national temperature databases (being more rational than a global figure) and have been looking for a good account of the number of stations in each year, their location, methodology, accuracy, changes of location, etc. Have you come across such an account? I guess it must exist somewhere, as it is the backbone of the AGW scenario.

If the sheer inconsistency of the network can be demonstrated – and the nonsense of parsing fractions of a degree back to 1850 illustrated – the whole hypothesis of AGW is shattered.

Many of the existing stations are accurate only to +/- 2 deg C at best. Who knows what the accuracy was in 1850? Yet these AGW people are proposing taxes in the trillions and gross regulation of our lives on the basis of a supposed temperature change (averaged over the whole globe!) of fractions of a degree.

Consider the difference between the adjusted and the raw temperature data in GHCN’s v2.mean databases. The difference (adjusted minus raw) shows sharp upswings in recent years – primarily but not exclusively for the northern hemisphere’s winter months. The difference between the adjusted and raw temperature data also shows other peculiarities. For example, the adjusted temperatures for the period 1700 through about 1825 are about 8 C higher than the raw values. I wonder about these peculiarities.

Also, if one looks only at the adjusted v2.mean database, it contains either 57 or 58 entries for the months of Jan – Aug of 2008. The raw database contains fewer than 1,500 entries for any month in 2008.

The number of entries hit a peak of about 9,100 in 1969–1971. Anybody have any idea why the number of entries dropped from 9,100 in those years to fewer than 1,500 at present?
