Collated GISS Versions Online

I’ve loaded R-tables for the dset=1 and dset=2 versions. The R-tables are lists, each 7364 items long; each item is a station time series. The files are about 8 MB in size. I have a variety of little scripts to retrieve and analyze things. The data is located here:

It would be easy to make a NetCDF file from this and I’ll post one up if someone sends me a conversion script. I thought about posting up an ASCII version but it’s about 4 times larger and NOBODY should be using Excel for this type of thing. Get with the R-program.
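Since a conversion script is asked for above, here is a hedged sketch of what one might look like, assuming each list item is a monthly ts object and using the ncdf4 package; the toy list, variable names, and dimensions are all illustrative, not anyone’s posted script. The code falls back to a message if ncdf4 is not installed.

```r
# Toy stand-in for the real list of 7364 station time series.
giss.dset1 <- list(
  ts(rnorm(24), start = c(1980, 1), frequency = 12),
  ts(rnorm(36), start = c(1980, 1), frequency = 12)
)

if (requireNamespace("ncdf4", quietly = TRUE)) {
  library(ncdf4)
  nmon      <- max(sapply(giss.dset1, length))
  d.time    <- ncdim_def("time", "months since 1980-01", 0:(nmon - 1))
  d.station <- ncdim_def("station", "index", seq_along(giss.dset1))
  v.temp    <- ncvar_def("temp", "deg_C", list(d.time, d.station),
                         missval = -9999)
  nc <- nc_create(file.path(tempdir(), "giss_dset1.nc"), v.temp)
  for (i in seq_along(giss.dset1)) {
    x <- as.numeric(giss.dset1[[i]])
    length(x) <- nmon              # pad short series with NA
    x[is.na(x)] <- -9999           # NetCDF missing value
    ncvar_put(nc, v.temp, x, start = c(1, i), count = c(nmon, 1))
  }
  nc_close(nc)
} else {
  message("install.packages('ncdf4') to run the conversion")
}
```

The same loop structure would work on the real 7364-item list; only the dimension sizes change.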

Willis (#1), my computer (a MacBook) seems to balk at the funny quotes in your R command:

con = url(paste(http://data.climateaudit.org/data/giss/giss.dset1.tab))

However, it works fine with SteveM’s command once you replace the “-” and “&lt;” with an “=” (or push the two characters together to form the assignment operator “&lt;-”):

con = url("http://data.climateaudit.org/data/giss/giss.dset1.tab")
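The full load-and-inspect pattern looks something like the sketch below, assuming the .tab file was written with save(). The download lines are commented out so the example runs offline; the toy list stands in for the real 7364-station object.

```r
# con <- url("http://data.climateaudit.org/data/giss/giss.dset1.tab")
# load(con)          # creates giss.dset1 in the workspace
# close(con)

# Toy stand-in: a short list of monthly station series.
giss.dset1 <- lapply(1:5, function(i)
  ts(rnorm(120), start = c(1950, 1), frequency = 12))

length(giss.dset1)                # 7364 in the real file; 5 here
x <- giss.dset1[[1]]              # one station's monthly series
window(x, start = c(1950, 1), end = c(1950, 12))  # first year
```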

I thought about posting up an ASCII version but it’s about 4 times larger and NOBODY should be using Excel for this type of thing. Get with the R-program.

There’s a comment that is getting as predictable as rain in Seattle. Unfortunately, after attempting some of these exercises with Excel, I have found that I have no choice but to follow Steve’s advice. I have had one false start on R, but I have noted that other posters here have evidently gotten with the program. I guess if I could remain content to learn from the information as presented by other users of R, I would not have to learn R. Unfortunately, I, like apparently others here, like to play with the data our own way.

Meanwhile, I anticipate some better understanding of the processes that go into the GISS data set from these exercises.

#12. Steve Mosher, I can’t figure out how to make an R-package. I’ve read some manuals but am stumped. If anyone knows of a template that was used for making a package, I’d be happy to work up a package. Actually there are about 10 packages that I could make.
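For what it’s worth, one template-free starting point is package.skeleton() in the utils package that ships with R: it writes the DESCRIPTION file and the R/ and man/ layout from objects already in the workspace. This is only a hedged sketch; the package name "gisstools" and the function are illustrative placeholders.

```r
# Placeholder function to package; any workspace objects can be listed.
giss.nstations <- function() 7364

# Generate the package directory layout under a temporary directory.
package.skeleton(name = "gisstools", list = "giss.nstations",
                 path = tempdir(), force = TRUE)

# Afterwards, edit DESCRIPTION and the man/ stubs, then from the shell:
#   R CMD build gisstools
#   R CMD INSTALL gisstools_1.0.tar.gz
```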

#11. Ken, I’m not saying this to be tiresome. There are benefits from a standing start: you can download and read data directly into a program; R functions are easier to manage than Excel macros; you can handle much, MUCH bigger data sets; and it goes on and on. You’ll be able to do anything that you can presently do in Excel in about 15 minutes.

I have been tackling R off and on for a few months. While I can’t describe exactly why, it *is* difficult. And heck, I used to be pretty much an expert in Fortran and semi-fluent in Delphi.

I have purchased a couple of books, but as soon as I need to do something slightly different and deviate from the examples provided, I am usually stuck for hours. The help files only occasionally *help*, as you almost need to be an expert to understand them.

Despite the learning curve, however, I’ve seen and learned enough to know that this is the last language I will ever need or want to know.

You are not being tiresome. I just had to comment, since when I saw a post that mentioned downloading to Excel, I noted to myself that your reply was coming shortly. The other day, I could not download an .nc file, so I searched (and searched) for a source with a text file that I could download. I found it and felt a little satisfied, but realized full well that if one has a hard head, one had better have a strong back.

USHCN is a collection of 1221 US stations in 4 flavors: raw, TOB-adjusted, adjusted, and urban-adjusted. USHCN is a subset of GHCN, which has 2 flavors, raw and adjusted. GISS uses GHCN and has two versions: raw and adjusted. Their raw version is sort of like GHCN adjusted, except when it’s GHCN raw. Many if not most GHCN series are not updated, and current data tends to come from the online WMO network at GHCN Daily.

Which source of data is ‘considered’ to be the authoritative source when describing US temperatures, GISS or NCDC? I guess that would mean either NASA or the National Climatic Data Center. Sorry if I seem a little thick, but I understand that NCDC still clings to the old saws about the last decade being the warmest years, and that they haven’t accepted Hansen’s corrections.

Worked for me on a tablet computer (quite a surprise), and also on XP 64-bit.

I suggest going back and reviewing the R documentation, picking the section dealing with grammar and syntax, and then watching for proper use of quotes, parentheses, and brackets, so you have a reference as to what Should Be Correct.

I think that NOAA uses USHCN with some sort of regional weighting. To add to the brew, there is USHCN version 1 and USHCN version 2 (scheduled for release in July 2007 but still not released, other than press-release results).

I’ve developed an R resource on http://www.stikir.com. It should be useful for those of you having trouble installing R, finding it difficult to climb the learning curve, or having problems accessing or using the large climate datasets.

It’s a wiki site where you can enter R scripts. The server will run them and return your results. Everyone can see and modify your analysis. With this approach the provenance of the results is clear: the data, the analysis, and the results are all fully transparent and auditable.

If you press the “edit” button on the page, it should make sense how the tool can be used.

The sample analyses are not particularly clever, but are provided to illustrate the functionality. I’ll happily lend a hand to people wanting to publish any more substantial analysis.

The GHCN v2 temperature and precipitation data has been made available, as well as some Energy Information Administration data on carbon emissions. Steve, please let me know if this collated GISS data is in the public domain. If so, I’ll upload it to the site, along with any other data sets this community might be interested in.