Saturday, July 30, 2011

I found the latitude/longitude info in CRUTEM3 to be generally better than in some earlier versions of GHCN. There seemed to be fewer cases of stations turning up in the sea - the KML files that I made make this rather obvious.

However, some stations were listed with missing lat/lon data. TempLS needs some lat/lon info - perfect accuracy isn't necessary, but for the weighting it needs to know the location within a degree or so. So I tracked some of them down. None of the stations are major data sources, so they could simply have been omitted. Many were in a group of Argentine stations from the early Smithsonian data set. But I tracked the lat/lon to an accuracy that was adequate for my needs - below the jump.

Thursday, July 28, 2011

An update on the previous post. I've realised that I can provide a lot more information by using folders in KML. I have made a file where every station is assigned to a folder according to the year in which its data ceased, or 2011 if it is still current. The years are then grouped into folders of a decade or so.
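For anyone curious how such a foldered KML might be put together, here is a minimal Python sketch (not the script I used - the station list, field names and function name are all hypothetical, and station names are assumed XML-safe):

```python
# Sketch: write KML with stations foldered by decade of last report,
# then by individual end year. Input data here is hypothetical.
from collections import defaultdict

def stations_to_kml(stations):
    """stations: list of (name, lat, lon, end_year) tuples."""
    by_decade = defaultdict(lambda: defaultdict(list))
    for name, lat, lon, end in stations:
        by_decade[end // 10 * 10][end].append((name, lat, lon))
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
    for decade in sorted(by_decade):
        parts.append('<Folder><name>%d-%d</name>' % (decade, decade + 9))
        for year in sorted(by_decade[decade]):
            parts.append('<Folder><name>%d</name>' % year)
            for name, lat, lon in by_decade[decade][year]:
                # KML coordinates are lon,lat order
                parts.append('<Placemark><name>%s</name>'
                             '<Point><coordinates>%.2f,%.2f</coordinates>'
                             '</Point></Placemark>' % (name, lon, lat))
            parts.append('</Folder>')
        parts.append('</Folder>')
    parts.append('</Document></kml>')
    return '\n'.join(parts)
```

Google Earth then shows the nested folders with visibility checkboxes, which is all the toggling described below relies on.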

The use of this is that in Google Earth, you can make the folders and subfolders invisible (hierarchically). On the left, near the top, there is a box called Places. If you have brought up the station KML file, there will be a folder called Temporary Places. If you open that you'll see the filename (crutem_end.kml), and below that you can expand to show the folder tree. Beside each entry is a little checkbox that you can click to toggle visibility.

So if you want to see which stations disappeared in the '80s, just blot out all the other top folders. If you want to see what happened in 1988, go down to that level. If you want to see all current stations, blot out everything except 2011. And so on.

There is another file (crutem_start.kml) foldered according to start year, which you can treat similarly. It is an interesting historical exercise to click through as the US, say, gets populated with stations in the 1850's.

I'll put up similar GHCN files to replace the less efficient ones from last year.
I've replaced the contents of the crutem_all.zip file with these.

I've made KML files for the CRUTEM3 stations. They are on the document store as crutem_kml.zip. If you just extract the files, you'll find two kml files - crutem_all.kml and crutem_recent.kml. The first shows all stations that have reported since 1900, and the second stations that have reported since 2000.

They show up as yellow pushpins - the size indicates the total years in the record. CRUTEM3 stations have longer durations than GHCN - so the largest size is over 90 years, and the smallest under 50. There are, in the inventory, 1943 stations over 90 years, 1198 under 50, out of a total of 5112. This is the count of lines in the record - it overstates the length because there are some years entered in the file with no data. (Update - it seems I was wrong about that - it's not overstated).
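The size classes come from a simple tally of year-lines per station. A sketch of that bucketing (the `records` mapping and function name are hypothetical):

```python
# Sketch: bucket stations by record length (years of data), as used to
# size the pushpins. `records` maps station id -> number of year-lines.
def duration_counts(records, cuts=(50, 90)):
    """Counts of stations with duration under cuts[0], in between,
    and at or over cuts[1]."""
    lo = sum(1 for n in records.values() if n < cuts[0])
    hi = sum(1 for n in records.values() if n >= cuts[1])
    return lo, len(records) - lo - hi, hi
```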

If you click on a station a flag pops up with a few more details. Here is a sample view of stations that have reported since 2000:

CRUTEM station data has been released as noted here. I'll be doing a run of TempLS to check what sort of global averages and trends it produces, with comparisons. And I'll try to produce some Google Earth files. But firstly, just some station-count data and maps, below the jump.

The general layout of the file is easy enough to work with, though they mix the inventory data with the temperatures, which takes some sorting out. For some reason, they give negative longitudes.
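The sorting out is mostly a matter of splitting the file back into per-station blocks. A rough Python sketch of the idea, assuming a hypothetical layout in which each station begins with a "Number=" line, metadata lines contain "=", and the rest are data rows (the real file differs in detail, and the longitude sign convention may need flipping, as noted above):

```python
# Sketch: separate inventory metadata from temperature rows in a combined
# file. Layout assumptions are hypothetical, not the exact CRUTEM3 format.
def split_stations(lines):
    stations = []
    for line in lines:
        line = line.strip()
        if line.startswith("Number="):
            stations.append({"meta": {}, "data": []})
        if not stations:
            continue  # skip any preamble before the first station
        if "=" in line:
            key, _, val = line.partition("=")
            stations[-1]["meta"][key.strip()] = val.strip()
        else:
            stations[-1]["data"].append(line)
    return stations
```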

Saturday, July 23, 2011

I've been experimenting with Steven Mosher's R package RghcnV3. It has many useful features, and I think the R package format serves it well. I'm not as enthusiastic as Steven about the zoo and raster structures, at least for this purpose, but still, I may yet be convinced.

Anyway, it certainly gets some of the messy initial data structuring out of the way. So since I am currently working on Ver 2.2, I thought I would put together a very simple version using some of the more recent ideas, in the RghcnV3 style. Update - I've added a version of the code with detailed comments at the end.

Friday, July 22, 2011

An ability to model global temperatures with monthly resolution, rather than annual. You can also ask for a selection of months - say, summer.

The graphics are internally reformatted so that each plot is generated by a function from a data list, which is available after the run. That means that plots can be tweaked, merged etc.
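The idea is just a closure over the plot data: the run stores a function per plot, and you can call it later with style overrides. A small Python sketch of the pattern (TempLS itself is R; the names here are hypothetical):

```python
# Sketch: each plot is a function built from a data list, callable after
# the run with tweaks, rather than a one-shot drawing command.
def make_plotter(data):
    """data: dict with 'x', 'y' and optional 'title'. Returns a function
    that rebuilds a plot spec, accepting style overrides."""
    def plot(**overrides):
        spec = {"x": data["x"], "y": data["y"],
                "title": data.get("title", ""), "color": "black"}
        spec.update(overrides)   # tweak colour, title, etc. after the run
        return spec              # hand this to whatever backend draws it
    return plot
```

The payoff is that merging or restyling plots needs no re-run of the main calculation - the data list is still sitting there.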

Instead of providing just one data set and inventory, you can provide several, and they will be merged. In practice, at the moment, that means that land sets (eg GHCN2, GHCN3, GSOD) and SST sets (HADSST2, HADSST3, ERSST) can be combined as desired. That will allow comparisons (a future post). It's also easier to mix in, say, USHCN.

A new weighting function. Previously there were uniform weighting, 5x5 lat/lon cells, and the adaptive methods (triangular mesh and Voronoi). The adaptive ones are good, with one cell for each station in each month, but time-consuming. So I made a new one, which divides latitude bands into near-square cells, with the added capability that the weight of empty cells is distributed among neighboring cells that do have stations (those stations are upweighted). This removes the bias caused by empty cells.
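To make the scheme concrete, here is a simplified Python sketch (not the TempLS code): latitude bands are split into roughly cos(lat)-proportional numbers of longitude cells so the cells are near-square, and each empty cell hands its area to the nearest occupied cell, whose stations share it. Function names and the nearest-cell assignment are my own simplifications.

```python
import math

def build_cells(nbands=18):
    """Near-square cells: latitude bands, each split into about
    cos(lat)-many longitude cells of the same angular width as the band."""
    cells = []
    dlat = 180.0 / nbands
    for b in range(nbands):
        latc = -90 + (b + 0.5) * dlat
        nlon = max(1, round(360 * math.cos(math.radians(latc)) / dlat))
        dlon = 360.0 / nlon
        area = dlon * dlat * math.cos(math.radians(latc))  # ~ true area
        for k in range(nlon):
            cells.append({"lat": latc, "lon": -180 + (k + 0.5) * dlon,
                          "area": area, "stations": []})
    return cells

def station_weights(stations, nbands=18):
    """stations: list of (lat, lon). Each occupied cell's area is shared
    by its stations; an empty cell's area goes to the nearest occupied
    cell, upweighting its stations, so no area is simply dropped."""
    cells = build_cells(nbands)
    for i, (lat, lon) in enumerate(stations):
        # crude nearest-center assignment, good enough for a sketch
        c = min(cells, key=lambda c: (c["lat"] - lat) ** 2
                                     + (c["lon"] - lon) ** 2)
        c["stations"].append(i)
    occupied = [c for c in cells if c["stations"]]
    weights = [0.0] * len(stations)
    for c in cells:
        target = c if c["stations"] else min(
            occupied, key=lambda o: (o["lat"] - c["lat"]) ** 2
                                    + (o["lon"] - c["lon"]) ** 2)
        share = c["area"] / len(target["stations"])
        for i in target["stations"]:
            weights[i] += share
    return weights
```

Because every cell's area ends up assigned to some station, the weights sum to the total grid area, which is the property that removes the empty-cell bias.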

A lot to write about, but for the moment, I'm exploiting the ability of the monthly scheme to report the current month. Fixing bugs kept me from getting ahead of the majors this month, but maybe next. I've added TempLS to the regularly monitored set.

The spatial methods work, too, and I'll show a comparison of June with Gistemp below the jump. But here's the most recent time series plot:

Sunday, July 17, 2011

The NOAA June global surface temperature anomaly is out now - up from 0.497°C to 0.579°C. That's quite high.

I don't normally post for each monthly temp, but I wanted to mention this one, because it's the first new monthly reading since I put up the monitoring post. And yes, it shows up in bold as I hoped it would. I'll try for colors next.

I was also watching carefully because I'm about to post a new version of TempLS (V2.2) which among other things will allow TempLS to produce real-time readings, probably quicker than NOAA. But for that I have to use ERSST data, which is prompter than HADSST2. I'm less familiar with it, and I'm finding a big dip around 2006, when there was a changeover in the use of satellite data. Anyway, more recent output seems OK. So I'll check a bit more, but if I haven't made a mistake (that I can find) I'll probably post tomorrow.

Thursday, July 14, 2011

Global reconstructions with the new HADSST3

There is a new version of sea surface temperature records: HADSST3. It is reported here, commended here, and criticised here.

The most noted change relative to HADSST2 is in a post-WW2 period when there was an issue with bucket and engine intake readings. HADSST2 was said to have a spurious dip in this period, which HADSST3 corrects. Consequently, trends of SST calculated over the last fifty years, say, have been slightly reduced.

I have not yet seen it used in any global land/sea index calculations. I have seen the effect inferred, but not measured.

Well, this is something TempLS V2.1 can do. So I downloaded HADSST3 from the Met Office site. Currently, it only goes as far as 2006. I compared it with using HADSST2, both in association with land measurements from GHCN v2.

Since the effect on trend has been a point of argument, I've made a plot of the trend calculated from each start year to 2006 (the last year of HADSST3 currently), with start years going back to before about 1990. Update - I've added a table of trend values at the end of the post.
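The calculation behind such a plot is just an ordinary least-squares slope recomputed for each start year. A Python sketch with hypothetical names and data (the real runs use TempLS output):

```python
# Sketch: OLS trend from each start year up to a fixed end year,
# as in a "trend measured back from 2006" plot.
def trend(years, vals):
    """Least-squares slope of vals against years."""
    n = len(years)
    my, mv = sum(years) / n, sum(vals) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, vals))
    den = sum((y - my) ** 2 for y in years)
    return num / den

def trends_back_from(series, end=2006, first=1950):
    """series: dict of year -> anomaly. Returns start year -> trend
    over [start, end], for starts from `first` up to end-2."""
    out = {}
    for s in range(first, end - 1):
        pairs = [(y, t) for y, t in sorted(series.items()) if s <= y <= end]
        ys, ts = zip(*pairs)
        out[s] = trend(ys, ts)
    return out
```

Comparing two such dictionaries (HADSST2-based vs HADSST3-based) year by year gives the absolute trend change, or a percentage change as in the replotted figures.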

Update - since a lot of blog discussion talks of the % change in trend in using new HADSST3, I've replotted the last two plots of this post (qv) in % terms - change in the trend measured back from 2006, for both the SST average and for global land/sea. Here it is:

Wednesday, July 13, 2011

Since my recent experiments with JavaScript, I've been teaching myself various tricks at W3Schools. This has sorted out all kinds of modern mysteries for me. They cover JavaScript, HTML and its variants, XML, PHP, SQL, CSS and all manner of things I hadn't heard of. And what I really like is that each command has a try-it page where you can tinker with the text and see the results immediately in another part of the window.

Anyway, that encouraged me to do something that I tried last year - putting up the latest sea ice and global temperature numbers. Back then I did it manually and it got to be a drag. Now I'm hoping I can fully automate it, using R as well. So here's a start. (Though Neven has the full ice story.)

The automation mainly consists of an embedded HTML window in which the numbers will appear. I'm planning to run the script at least once a day, at about 1.30pm Japan time, when the JAXA ice numbers appear. It will check for the latest Had/UAH etc numbers. New numbers will appear bolded or colored. And the graphs will be updated.
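The "bold the new numbers" part amounts to comparing this run's values with the last run's and wrapping anything that changed. A minimal sketch of that step (the actual script is in R; labels, values and the function name here are made up):

```python
# Sketch: regenerate an HTML table, bolding any value that has changed
# since the previous run of the script.
def render_table(rows, previous):
    """rows, previous: dicts of label -> latest value (as strings)."""
    out = ["<table>"]
    for label, val in sorted(rows.items()):
        cell = "<b>%s</b>" % val if previous.get(label) != val else val
        out.append("<tr><td>%s</td><td>%s</td></tr>" % (label, cell))
    out.append("</table>")
    return "\n".join(out)
```

A daily job would load the previous values from a saved file, fetch the new ones, and write the rendered table into the embedded window's HTML.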

So below the jump are firstly the numbers - small tables of recent results immediately visible, but larger tables if you scroll down. Then come the graphs.

Sunday, July 10, 2011

This is another in the series of enhanced proxy temperature reconstruction plots with hopefully enhanced clarity. This one catches up with a suggestion earlier from Eli Rabett for using Javascript so users can choose which plot to emphasise by mouse rollover. I thought that would be hard, but TheFordPrefect showed me that it isn't so hard, and did a demonstration case. I've adapted that slightly here.

I've also included a plot of Craig Loehle's reconstruction, which uses non-treering proxies, and also the original MBH98, which I got from a CA archive. That was for the MM05 E&E paper. I was hoping to include the centred plot from Fig 1c of that paper - there are numbers for that in the archive, but I haven't been able to convince myself that they are really from a centred reconstruction. They seem awfully close to what is supposed to be a non-centred modified Gaspe reconstruction.

Update - Craig Loehle's paper had a correction. I've referred to the updated paper, but had plotted the original data. I've now replaced it with the updated data - not a big change. However, the update only goes to 1935, so I've had to change the anomaly base period to 1906-1935. I have updated the zipfile.

I've posted, on the document repository, a zip file called javaproxy.zip. It includes the R files and the data, and a readme.txt file. Despite the name, it also includes the file for the animated gif as well.

I've updated the data notes on the previous post to add the new datasets. Again, all sets have been set to an anomaly base period of 1906-35 (changed from 1935-1964).

Update. I've added a facility that enables TheFordPrefect's capability of comparing datasets. You'll see now at the end of the legend a block of four rectangles. Think of these as representing a stack of datasets. You can add new sets at the left end by clicking on their names in the legend. When you add a new one, the others move to the right.

So if you want to compare two sets, just click on them, and then roll the pointer over the two left rectangles. You'll see them show up. You can of course select up to four at a time and cycle through them.

Here is the 2000 year plot. Just roll the mouse pointer over the names in the legend to enhance each individual curve.

Further update: I had an update here warning that Firefox 5 (but not IE) seemed not to free the image memory (about 30Kb) used in this javascript. But I now think it does eventually - the garbage collection is just slow.