NIWA’s analysis of measured temperatures uses internationally accepted techniques, including making adjustments for changes such as movement of measurement sites. For example, in Wellington, early temperature measurements were made near sea level, but in 1928 the measurement site was moved from Thorndon (3 metres above sea level) to Kelburn (125 m above sea level). The Kelburn site is on average 0.8°C cooler than Thorndon, because of the extra height above sea level.

I’m not too impressed, especially when you see where the weather station for National Institute of Water and Atmosphere (NIWA) is, right on the rooftop next to the air conditioners:

Note also the anemometer mast, identifying the weather station.

The NIWA climate controversy took a new twist tonight with the release of new data from the government-run climate agency.

Reeling from claims that it has massaged data to show a 150 year warming trend where there isn’t one, NIWA’s chief climate scientist David Wratt, an IPCC vice-chair on the 2007 AR4 report, issued a news release stating adjustments had been made to compensate for changes in sensor locations over the years.

While such an adjustment is valid, it needs to be fully explained so other scientists can test the reasonableness of the adjustment.

Wratt is refusing to release data his organisation claims to have justifying adjustments on other weather stations, meaning the science cannot be reviewed. However, he has released information relating to Wellington temperature readings, and they make for interesting reading.

Here’s the rub. Up until 1927, temperatures for Wellington had been taken at Thorndon, an inner-city suburb only 3 m above sea level. That station closed and, as I suspected in my earlier post, there is no overlap data allowing a comparison between Thorndon and Kelburn, where the gauge moved, at an altitude of 125 metres.

With no overlap of continuous temperature readings from both sites, there is no way to truly know how temperatures should be properly adjusted to compensate for the location shift.

Wratt told Investigate earlier there was international agreement on how to make temperature adjustments, and in the news release tonight he elaborates on that:

“Thus, if one measurement station is closed (or data missing for a period), it is acceptable to replace it with another nearby site provided an adjustment is made to the average temperature difference between the sites.”

Except, except, it all hinges on the quality of the reasoning that goes into making that adjustment. If it were me, I would have slung up a temperature station in the disused location again and worked out over a year the average offset between Thorndon and Kelburn. It’s not perfect, after all we are talking about a switch in 1928, but it would be something. But NIWA didn’t do that.

Instead, as their news release records, they simply guessed that the readings taken at Wellington Airport would be similar to Thorndon, simply because both sites are only a few metres above sea level.

The Airport records temperatures about 0.79°C above Kelburn on average, so NIWA simply said to themselves, “that’ll do” and made the Airport/Kelburn offset the official offset for Thorndon/Kelburn as well, even though no comparison study of the latter scenario has ever been done.

Here’s the raw data, from NIWA tonight, illustrating temp readings at their three Wellington locations since 1900:

What’s interesting is that if you leave Kelburn out of the equation, Thorndon in 1910 is not far below Airport 2010. Perhaps that gave NIWA some confidence that the two locations were equivalent, but I’m betting Thorndon a hundred years ago was very different from an international airport now.

Nonetheless, NIWA took its one-size-fits-all “adjustment” and altered Thorndon and the Airport to match Kelburn for the sake of the data on their website and for official climate purposes.

In their own words, NIWA describe their logic thus:

Where there is an overlap in time between two records (such as Wellington Airport and Kelburn), it is a simple matter to calculate the average offset and adjust one site relative to the other.

Wellington Airport is +0.79°C warmer than Kelburn, which matches well with measurements in many parts of the world for how rapidly temperature decreases with altitude.

Thorndon (closed 31 Dec 1927) has no overlap with Kelburn (opened 1 Jan 1928). For the purpose of illustration, we have applied the same offset to Thorndon as was calculated for the Airport.

The final “adjusted” temperature curve is used to draw inferences about Wellington temperature change over the 20th century. The records must be adjusted for the change to a different Wellington location.
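Incidentally, the 0.79°C figure in NIWA’s release is easy to sanity-check against the commonly quoted environmental lapse rate of roughly 6.5°C per kilometre. A quick sketch (this is a rule-of-thumb check only, not NIWA’s actual method; real near-surface lapse rates vary with weather):

```python
# Rough sanity check of the altitude offset NIWA quotes, using the
# commonly cited environmental lapse rate of ~6.5 °C per km.
# (Rule-of-thumb only; actual near-surface lapse rates vary with weather.)
thorndon_alt_m = 3.0    # Thorndon elevation, metres above sea level
kelburn_alt_m = 125.0   # Kelburn elevation, metres above sea level
lapse_c_per_km = 6.5

offset_c = (kelburn_alt_m - thorndon_alt_m) / 1000.0 * lapse_c_per_km
print(f"Expected Kelburn cooling vs Thorndon: {offset_c:.2f} °C")  # 0.79 °C
```

That the Airport/Kelburn difference of 0.79°C happens to match the textbook lapse rate is presumably why NIWA describe it as matching “measurements in many parts of the world”.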

Now, it may be that there was a good and obvious reason to adjust Wellington temps. My question remains, however: is applying a temperature offset from a site 15 km away, in a different climate zone, a valid way of rearranging historical data?

And my other question to David Wratt also remains: we’d all like to see the methodology and reasoning behind the adjustments on all the other sites as well.

When there is overlap, is it possible that weather stations tend to be moved due to a change of land use at their location – and that they are therefore more likely to be in UHIs at the time of their switch-off? They would then tend to be warmer than their replacements because they were warming up prior to being turned off. Is there any general tendency for stations with overlaps to be warmer than their replacements?

The best thing to do would be to treat a new location as a new location. No attempt should be made to fake up the new location as if it were the same. Likewise, when parking lots are added or trees cut down, the location should again be treated as a completely new data set. That would be a much more honest and accurate accounting of the data.

The question is, did the skeptics treat the new locations as such? If they didn’t, then they need to.

I have always maintained that the real danger from the Alarmist crowd is that because of their wolf-crying over ‘global warming,’ no one will listen when (and if) the world is ever faced with a REAL man-made environmental disaster.

Following an extensive theoretical analysis, two German physicists have determined that the term greenhouse gas is a misnomer and that the greenhouse effect appears to violate basic laws of physics.

“The point discussed here was to answer the question, whether the supposed atmospheric effect has a physical basis. This is not the case. In summary, there is no atmospheric greenhouse effect, in particular CO2-greenhouse effect, in theoretical physics and engineering thermodynamics. Thus it is illegitimate to deduce predictions which provide a consulting solution for economics and intergovernmental policy”.
This is what Hans Schreuder has been saying for years now: http://www.ilovemycarbondioxide.com

Quite honestly, as a total Climate amateur, I could have done a better job of fudging data than these hacks seem to have done (in both the US and New Zealand!). There isn’t even a veneer of scientific rationale present in any of their “adjustments”. As I’ve said several times over the years, I’m thankful that Cancer Research Scientists are slightly more ethical than Climate Scientists! A “cure” for Cancer would reap much more in $$ than “Global Warming Catastrophe” payouts, so I’m amazed that they share data and protocols much more readily. Can any Climate Research actually be replicated?

This is a prime example of the utter disregard for the well-known climate zone theory. Climate scientists seem to not understand this phenomenon at all! Yet zoning of climate has been part of agricultural history for decades longer than many sensors have been in operation. Climate zones are understood by organic topography and altitude, as well as by non-organic topography (land use, urbanization, etc.), in relationship to weather factors (weather data including extremes, jet stream location, typical pressure gradients, typical cloud formations, and storm type and history). To fill in missing data, or to adjust the data of a new location by altitude alone, is so incredibly uninformed that it escapes my understanding of why these people would call themselves scientists.

It reminds me of tree ring interpretations by a scientist who has never cleared or planted trees, nor logged and graded them in his entire life.

What are the odds that these ‘internationally accepted techniques’ are those from GISS, designed by the wonderful James Hansen. The man who has the most to lose if the AGW hypothesis is discredited. The man who has stated under oath in court that it is acceptable to break the law in furtherance of the AGW alarm agenda.

I was under the impression that the IPCC/Hadley/CRU approach was to take a mean for each station over some time period and to track offsets to show either warming or cooling trends. If this is done, there should be no need for any such “adjustments”. Is this correct?

I’m becoming more convinced that the mere act of opening and closing stations can introduce issues. Neglecting to ever run an overlap period, microsite issues, UHI – all serious. But I don’t see any correction for the full amount of “Station Closure Anomaly Adjustment” in the code we’ve got.

If you’re just averaging, you’ll get zero. If you’re calculating gridcells, it is obviously more complex. Pretend for a second that this is one dimensional, and that these seven cells are the only ones used. So, in this simplified gridcell arrangement, you also get zero.

But what happens when, say, Missoula goes offline?

There’s the direct effect: the average is going to shift upwards by (+4/7).
But there’s also the gridding effect: we still have seven cells, but we have no reading for Missoula. How can we get a reading for Missoula? Well, the preferred technique is to find the two nearest known values (Spokane and Bismarck) and interpolate. That would make an estimate of 4°C for Missoula.
So the effect of dropping a local minimum isn’t (4/7), it will be (8/7). The numbers I’ve used are made up – but what happens when you drop the only station in a section of the Rockies?
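To make the arithmetic concrete, here is a toy version of that seven-cell example (the anomaly values are invented, chosen only so that the full set averages to zero):

```python
# Toy illustration of the "station closure" effect described above.
# Seven one-dimensional gridcells; anomalies in °C are made up so that
# the full set averages to zero (a perfectly flat record).
cells = {"Spokane": 4.0, "Missoula": -4.0, "Bismarck": 4.0,
         "cell4": -1.0, "cell5": -1.0, "cell6": -1.0, "cell7": -1.0}

baseline = sum(cells.values()) / len(cells)
print(baseline)  # 0.0 -- flat, by construction

# Missoula goes offline; infill its gridcell by interpolating the two
# nearest known values (Spokane and Bismarck), as described above.
infilled = dict(cells)
infilled["Missoula"] = (cells["Spokane"] + cells["Bismarck"]) / 2  # 4.0

after = sum(infilled.values()) / len(infilled)
print(after - baseline)  # 8/7, i.e. ~1.14 °C of "warming" from one closure
```

Dropping the cold station directly removes −4 from the average, and the interpolation then stamps a +4 estimate into its cell, which is how the total effect reaches 8/7 rather than 4/7.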

Fundamentally, this is saying that opening and closing stations directly affects the calibration of the instrument you’re using to determine “US Surface Temperatures.” You can have perfect siting and measurement across North America where every individual thermometer shows a completely flat century trend – and yet have an overall substantial increase or decrease based entirely on dropping (for whatever reason) the correct stations.

The very approach of calculating “US Average Surface Temperatures” from ground based thermometers has issues as implemented.

If they had a reasonable explanation for what they were doing, those reasons would have been produced before, not only after, they were caught diddling the numbers according to ‘internationally accepted standards’.
Lodging the weather station next to the AC blowers only demonstrates the utter lack of seriousness with which they treat their responsibility for accuracy and honesty.
Anyone involved in the leadership of the AGW movement, particularly those closest to the IPCC, will under scrutiny prove to be a bad player by the time this is fully exposed.

Rob,
Simply being honest, and showing that data series have windows of accuracy, would be far too honest an approach for climate scientists.
AGW: The more you know, the more you know you have been had.
AGW: the theory that does for climate science what Lysenkoism did for biology!

Correct me if I’m wrong here, but wouldn’t more thermometers actually give a more accurate reading of what is happening?

For example: instead of moving the temperature station, wouldn’t leaving it where it was and adding another one actually improve accuracy? After all, we are talking about “global” warming. So maybe I just don’t fully understand, but it seems to me the more thermometers they have, the more chance they have of actually being accurate.

There is a lot more to go on this story yet. On 27/11/09 Minority party leader, and coalition government Minister, Rodney Hide, requested full details of all data, code, etc from climate minister, Dr. Nick Smith. You can see the full letter at local blog site http://whaleoil.gotcha.co.nz/. Acknowledgement to local blogger, Whaleoil for this post.
Rodney Hide is not a man who is easily fobbed off. He is well known for his effective ‘crusading’ skills. He is also well educated and very articulate. Now that this request is in the public domain it will not easily be dismissed.

I know this is a little OT but I’m really worried.
Has anybody noticed google has removed any search suggestions relating to the AGW scandal? I actually noticed this today because I now have to type in my entire search string. I found someone who actually posted on this.

“Correct me if I’m wrong here, but wouldn’t more thermometers actually give a more accurate reading of what is happening?”

All the science and scientist nervousness indicates straight readings of thermometer records do not give them the warming they want. Thus the resort to adjustments, treemometers and other supposed proxies for temperature change.

To believe NIWA’s excuse of correcting for measurement station movement, you’d have to believe that it was just an amazing coincidence that all the stations moved in just such a way over the decades so as to cancel out all the warming that was simultaneously taking place and leave an amazingly flat record in the uncorrected raw data.

Just call me stupid, but one thing I don’t understand: if stations 1 and 3 are equal (NIWA’s words) and you left out station 2 (just fill in 999 for every measurement :) ), then you would see no warming. Instead, they made the adjustments like an upward-sloping line.

In addition to the ‘weather geek’ station surveys (visible & IR photos, Lat/Lon/Altitude, interviews and such), I think it prudent to start asking about temperature sensor calibration. How frequently calibrated? Gain & offset values vs time? Method of calibration (e.g. ice-bath & boiling water method)? External test equipment used in the calibration? Calibration of the External Test Equipment (as we in the colonies call ‘NIST Traceability’ … I should be able to construct a paper-trail from the 1-deg C measured here in Arizona all the way back to the NIST 1-deg C in Washington DC, etc)

Oh! I thought that the internationally accepted technique to adjust temperatures was to use tree rings and readjust the temperature of real thermometers. So, what they really should have done is cut a tree at Kelburn and another at Thorndon and do the Mann trick… Et voilà!


Rodney Hide is also scientifically trained. He has a bachelor degree in zoology and a masters (MSc) in resource management.

“The Kelburn site is on average 0.8°C cooler than Thorndon, because of the extra height above sea level.”

This isn’t science. There is no “average” based solely on altitude; an average of locations within an air column in a specified time span would not necessarily be the same average of those same points in a different time span. Especially when accuracies and concerns are measured in increments of tenths of a degree, this is pseudoscience, no better than guessing.

Why not just have one station for the world, and determine temperatures for every location on the earth, by an average lapse rate! Were there only one, “the best we could do” would not make it scientific or reality. This adjustment stuff can only be carried so far.

There is in fact a perfectly standard way to handle this problem. Use a third station which overlaps both the old and new stations, and adjust the old so it has the same difference from the third station as the new.

Doing it in this order may not at first seem logical but it avoids having to continuously adjust the new station.

Given that long-term relative temperatures do not vary much over distances of tens, or even hundreds, of kilometres, this method is perfectly sound. If the distance is of the order of hundreds of kilometres then it should be done separately for each calendar month.

In this PDF document, on page 40, you can see the view that the BBC gives about ‘climate change’ and their decision not to give equal airtime to sceptics. Interestingly:
Roger Mosey, Director of Sport, said that in his former job as head of TV News, he had been lobbied by scientists ‘about what they thought was a disproportionate number of people denying climate change getting on our airwaves and being part of a balanced discussion – because they believe, absolutely sincerely, that climate change is now scientific fact.

It would also be interesting to discover which ‘scientists’ they consulted to come to their decision (I bet I can guess a few!). Unfortunately, on page 27 it mentions that their 1996 charter requires them to be impartial in regards to ‘CONTROVERSIAL’ subjects. So I think they have a get-out clause against prosecution? I’m still sure that this hoax, once fully brought into the public domain, will deal a hammer-blow from which this Leftist organisation will never recover. Better late than never! http://www.newsnow.co.uk – this is a nice little news aggregator in which to type ‘climategate’ when you feel the need.

..”NIWA’s chief climate scientist David Wratt, an IPCC vice-chair on the 2007 AR4 report, issued a news release stating adjustments had been made to compensate for changes in sensor ..”

So the organisation that is responsible for the basic data is also the one taking a leading position at the IPCC, where the deductions to be made from that data are being made… and these morons expect us to believe that it is all truthful and above board.

From the point of view of testing the hypothesis that NZ is warmer now than 150 years ago, the siting of stations is an uncontrollable factor – the stations were never set up to show this hypothesis because they long predate it. However, the data could still be useful to test the hypothesis.

If the change in an uncontrollable factor (station siting) biases against the hypothesis (warming), then the scientists just have to live with that rather than adjusting data in the direction of their hypothesis. Were they claiming a cooling, some adjustment might be acceptable, but in any other scientific discipline it is not generally considered good, rigorous science to correct for an uncontrollable factor when the correction is made in the same direction as the effect you are claiming.

On this Thanksgiving Day, let us give thanks that the two greatest all-purpose pretexts for government regulation of every single aspect of your life – “health care” and “the environment” – have now converged. Forget the global warming, global cooling, all the phoney-baloney tree-ring stuff – who can keep track of all that “settled science”? And fortunately we no longer need it, because we have a new rationale for the massive multitrillion-dollar Copenhagen shakendownen. Drumroll, please!

But slashing carbon dioxide emissions also could save millions of lives, mostly by reducing preventable deaths from heart and lung diseases, according to studies published this week in the British medical journal The Lancet.

Government regulation of health care justifies government regulation of the environment: Ingenious!

Their adjustment can be tested simply and fairly quickly, by comparing measurements simultaneously at each site. Was that ever done? Or was the change in elevation just assumed to require an adjustment?

There is in fact a perfectly standard way to handle this problem. Use a third station which overlaps both the old and new stations, and adjust the old so it has the same difference from the third station as the new.

Doing it in this order may not at first seem logical but it avoids having to continuously adjust the new station.

Given that long-term relative temperatures do not vary much over distances of tens, or even hundreds, of kilometres, this method is perfectly sound.

But that isn’t true. Many stations only 10 miles apart and at the same elevation have long-term average temperature differences that exceed a degree.

“I have always maintained that the real danger from the Alarmist crowd is that because of their wolf-crying over ‘global warming,’ no one will listen when (and if) the world is ever faced with a REAL man-made environmental disaster.”

Absolutely! I have already read one article beginning with the sentence, “I’ll never believe anything atmospheric science has to say again…” Never mind that meteorologists, in general, have been the leading skeptics — we’ll be the ones to take the hit in credibility from all this.

The CRU scandal has already ensnared Britain’s leading climate “scientist” Phil Jones (whom one principled leftie says has only “a few days left in which to make an honourable exit”) and his American counterpart Michael Mann (as in “Mann-made global warming”).

Given that these two men and their respective institutions are the leading warm-mongers on the planet, and the guys who dominate the IPCC, Copenhagen et al, it would be most unlikely if the widespread data-raping were confined only to the United Kingdom and the United States. Here’s an interesting snippet from my colleagues at Investigate magazine in New Zealand re their National Institute of Water and Atmospheric Research:

NIWA’s David Wratt has told Investigate magazine this afternoon his organization denies faking temperature data and he claims NIWA has a good explanation for adjusting the temperature data upward. Wratt says NIWA is drafting a media response for release later this afternoon which will explain why they altered the raw data.

At least they only “altered” the data, unlike the CRU, which managed to lose it.

Upon examination of said “raw data”, it seems that the country’s temperature increased 0.06° over a century – ie, nada. But by the time Dr James Salinger (a big cheese at NIWA, the CRU and the IPCC) had “adjusted” the data, New Zealand was showing an increase of 0.92° – ie, some 15 times greater than the raw data showed. Why?

It might be that “climate change” is an organized criminal conspiracy to defraud the entire developed world. Or there might be a “good explanation”. I’d be interested to hear it. Fortunately for NIWA et al, among the massed ranks of “environmental correspondents”, plus ça climate change, plus c’est la même chose.

In light of what we’ve learned from the emails about the inner workings of these nests of gov’t scientists, I think it’s reasonable to think that there is nothing sinister going on here. They honestly attempted to make adjustments; they tasked it out to someone in the Institute who did do it, but that person no longer works there and never documented a rationale for the adjustment. It makes what they have hollow, so of course they want people to take what they have at face value.

Anthony – the Khyber Pass station is a good example of a non-compliant climate station. Was a survey done at Kelburn that could be used to evaluate whether we might be looking at similar UHI / microsite effects?

Perhaps this is one of those questions that will seem obvious to some, but with all the problems with temperature datasets, is there any push (especially by the skeptic camp) to come up with proper datasets that are both open to the public and well documented? Would this not be a project both camps could agree on?

The exact location of the Wellington airport weather station could be interesting. A look at Google Earth will show that the airport, while close into town, actually sticks out into the sea at both ends. You wouldn’t want to overshoot, especially on the north end because the runway finishes in a fifty-odd foot high ramp. A bit like a giant tank trap.

Wellington isn’t called ‘windy Wellington’ for nothing and the airport is very exposed to the southerly in particular. Landing there is one of NZ’s major adventure tourism attractions, especially in a small plane in a Nor’wester. So location and airflow mean that any UHI effect could be pretty sporadic. Also, although Wellington calls itself an International Airport (lots of NZ airports like to do that – it makes them feel good) almost all flights are local and rarely does anything bigger than a 737 land or take off.

That having been said, I’m really looking forward to the rest of the excuses for the rest of the stations, and if Rodney (Dancing With The Stars) Hide is on the case, then look out NIWA!

Glen, you are of course right, you can have big differences, but that is not important. Let us say that Station A has been replaced by Station B with no overlap, but that both have an overlap of some years with Station C. You compare the difference in averages between Stations A and B and Station C. Let us say that Station A is 0.8°C warmer than Station C and Station B is 0.3°C cooler than Station C. This means that Station A is 1.1 degrees warmer than Station B. That gives you your correction.
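In code, the bridging arithmetic is a single subtraction (the 0.8 and −0.3 are of course just the made-up numbers from the comment above):

```python
# Offset between non-overlapping stations A and B, estimated via a third
# station C that overlaps both. Inputs are average differences from C.
def offset_via_third_station(a_minus_c: float, b_minus_c: float) -> float:
    """How much warmer Station A runs than Station B."""
    return a_minus_c - b_minus_c

# Made-up numbers from the comment: A is 0.8 °C warmer than C,
# B is 0.3 °C cooler than C.
adj = offset_via_third_station(a_minus_c=0.8, b_minus_c=-0.3)
print(adj)  # 1.1 -- subtract this from A's record to splice it onto B's
```

The quality of the result, of course, depends entirely on how representative Station C’s overlaps with A and B are, which is the point being argued back and forth in this thread.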

How does the computer code correct tree ring data for those trees planted near some sort of thermal mass or heat source? Those elms on the lawn at Old Main must benefit from the local change in growing season, no?

Even if one could completely trust the temperature sensors as providing “hard data” – a challenging proposition at best, due to localized heat-island effects like being positioned next to heat sources such as building air-conditioning equipment, site changes, and issues with the devices’ accuracy and calibration – the “soft” data manipulations and adjustments seem to be a major source of human-induced error or misdirection. As we’ve seen from the Climategate affair, with the alleged scientists involved (are they still scientists if they don’t follow an ethical and verifiable scientific method?), “misdirection” is now not just a very real possibility but a high probability, especially when those involved have deep connections to the Climategate Fellows of Deception.

If I understand what is being said, the entire alleged human-caused global-warming climate change for New Zealand comes from the “adjustments” to the temperature sensors, and without said adjustments the trend is as close to zero as one could expect of a stable natural system?

Can the raw data readings be counted upon as reliable?

How many sites are located within heat islands?

Surely if there are a preponderance of sites in heat islands then they’d have to make adjustments downwards? Wouldn’t that indicate a cooling trend in later years as heat island effects would increase with larger cities and more machinery near sites?

How are the adjustments justified?

Is this the case for the Mann graphs too? Flat trendlines without the adjustments to the data?

If the adjustments to the data are subjective then the entire conclusions are also subjective, and thus subject to human error and foibles.

Imagine if the raw data collected in a clinical trial showed that the study drug had no effect. But the data, after adjusted for problems with the protocol and the measurement instruments showed that the drug was effective. Would the FDA consider approving the drug? Unlikely, but let’s assume there was a slim chance (actually a new trial would be demanded).

But the analogy here goes a bit further. The alarmists are effectively asking that the drug be approved based on the altered data, but they don’t think they need to show an audit trail of the changes, nor explain the methodology for the adjustments. Which would result in the FDA not only laughing the New Drug Application out of existence, but in debarring everyone associated with the research program.

Over at realclimate.org, Gavin said that the reason CRU had not released the global temperature dataset is that 1% of the entities who own the data have not agreed to the release. I have 2 suggestions for CRU: (1) release the 99% of the data that is not under that restraint, and (2) let the world know who these entities are so that the public can petition them to put the data into the public domain.

Jim Steele (12:33:11) : Their adjustment can be tested simply and fairly quickly, by comparing measurements simultaneously at each site.

Simply, maybe, but not quickly. Here in the SF Bay area, with its many microclimates, close locations can have vastly different temperatures, with the sign of the difference depending on the time of year. The Oakland hills will generally be cooler than the other side of the hills, except in winter, when the bay moderates the Oakland side, leading to higher temperatures (at least higher lows). Thus, would an overlap need to run for at least a year to determine the full difference pattern?

That’s the only half-sensible way to do it, it seems to me, Ron. (Doesn’t help with poor maintenance and gradual encroachment of urbanisation and plain stupidity – asphalt, AC, etc.) Perhaps Mr. Watt or the New Zealanders could do something fairly quickly with that.

The NZ situation is an interesting contrast to the Central England Temperature record. This has been kept since 1659 and shows that there is nothing unusual in variations of temperatures over decades and that there is no significant trend over the last 350 years.

It was taken over by the Met Office who have changed the location of the stations to warmer places, but made no adjustments to take this into account. As a result CET shows spurious warming over the last ten years. The Met has deliberately debased an historic record so that they can say that temperatures in England are higher than ever.

“When contacted by the Mail, the weatherman said he was not allowed to comment and asked us to speak to the BBC press office.
A BBC spokesperson said: “Paul wrote a blog for the BBC website on October 9 entitled Whatever Happened To Global Warming. There was a big reaction to the article – not just here but around the world. Among those who responded were Professor Michael E Mann and Stephen Schneider whose e-mails were among a small handful forwarded to Paul on October 12.
“Although of interest, Paul wanted to consider the e-mails as part of a wider piece, following up his original blog piece.
“Last week, Paul spotted these few e-mails were among thousands published on the Internet following the alleged hacking of the UEA computer system.
“Paul passed this information on to colleagues at the BBC, who ran with the story, and then linked to the e-mails on his blog this Monday.”

How about +/- error on the temperature measurements?
That error should then be carried through to the climate models which are back-cast tuned to match the temperature record.
And then the model error needs to be incorporated into the model output at the year 2100.
No, the only error range that is considered at the year 2100 is the difference between all 21 general circulation models.

And then, the gall to say we are 90% certain the temperature will rise x degrees by the year 2100.
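For what it’s worth, the propagation being asked for here is routine for independent error sources: they combine in quadrature. A minimal sketch, with purely illustrative numbers of my own (nothing below comes from any actual dataset or model):

```python
import math

def combined_uncertainty(*sigmas):
    """Combine independent 1-sigma uncertainties in quadrature."""
    return math.sqrt(sum(s ** 2 for s in sigmas))

# Illustrative only: 0.3 C of measurement error and 0.4 C of model-tuning error
sigma_total = combined_uncertainty(0.3, 0.4)  # 0.5 C combined
```

If the error sources are positively correlated, as measurement and back-cast tuning errors may well be, the true combined uncertainty would be larger still.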

Looking at the charts in the ‘global warming nz2.pdf’ from ‘The New Zealand Climate Science Coalition’, I see the problem! Apart from the Auckland chart, which from 1860–1960 is fairly self-consistent, it’s all a Donkey’s breakfast. So many ‘difference events’ between the unadjusted and adjusted data, on a seemingly 5-yearly basis? I assume all measuring stations would have come under Government control at the same time, with standards being laid down. Therefore why is Auckland so consistent for the hundred-year period when the other stations are not? Can’t be equipment changes? You can obviously see when a station was moved by the vertical adjustment line, but some appear to have been moved many times, and there are lesser adjustments too numerous to mention. I think many will never be satisfactorily explained.

“internationally accepted techniques”…again with the arrogance.
We’re in the middle of discussing how, potentially, an international body of scientists used their collective “muscle” to bury the opposition, and this guy tells us that, in effect, he has what amounts to their approval to do what he did, and we should all just shut up and go away.

I’m pretty sure that I can get international acceptance on a new technique which REQUIRES YOU TO SHOW YOUR DATA/METHODOLOGY.

When will these ass-hats realize that this is what the whole issue is about?

The fact that the temperature in Thorndon is being read off a roof is effectively meaningless, because in effect NIWA are using the Airport readings now, and the Thorndon series is just a proxy for earlier Airport readings.

But the clever bit of all this is that the movement over time of the Thorndon thermometer and the UHI effects mean that it gets a small positive gradient.

So when it is used as a proxy to link the old Kelburn and new Airport values together, it gets to add a quite possibly entirely artificial slope.

Surely the solution is to compare thermometers near the old Thorndon site, the old Kelburn site (before it was on the roof), the current Kelburn site and the airport for an extensive period and sort out the actual relationship. The cost should be peanuts really.

The situation for the Wellington and Christchurch NIWA temps is more complicated when you look at GISTEMP.
Here the data for the last 20 years or so are quite clearly the airport temps at both cities.
The downtown map coordinates are attached to the airport file in error, and I think this happened somewhere in NOAA; i.e. the Wellington Airport data has the coordinates for Kelburn Gardens.
Dr Wratt knows of the error because I told him. He responded that he did not have the resources to do an audit.
On the other hand, I keep my emails.
What I don’t know is whether GISS increased the Wellington (??) last-20-years temps because of the false elevation of 300 feet.
PS: NIWA station data are freely available by download.

Perhaps NIWA have just been responding to all the complaints from Wellingtonians over the years that the Kelburn site, being up on an exposed hill, is not indicative of the temperatures experienced down at sea level amongst the concrete/steel/glass towers of Lambton Quay (a real UHI for sure).

I suspect that the Kelburn site was chosen only because it was on crown land. The Met site was positioned next to the old Dominion Observatory, complete with transit circle, and the Carter Observatory (formerly the National Observatory). Wellington is renowned not only for the strength of its wind, but also for its hilly nature, similar to San Francisco by all accounts. Flat land is scarce and at a premium.

As to the siting of data stations relating to this ‘upward trend’ of the past 50 years: surely we could look at the data from stations that a.) haven’t been moved, and b.) aren’t affected by UHI, even if they don’t go all the way back into the 19th century with the data.

Our local station in Palmerston North would be a good one to try out. Situated at what was the D.S.I.R. (Dept. of Scientific & Industrial Research, near Massey University) in the ’80s, it was in a paddock about 30 metres away from sealed roads and concrete. It has been moved even further away from anything of that nature since then, ostensibly still the same environment.

“Over at realclimate.org, Gavin said that the reason CRU had not released the global temperature dataset is due to the 1% of the entities who own the data has not agreed to the release. I have 2 suggestions to CRU: (1) release the 99% of the data that is not under the restraint, and (2) let the world know who these entities are so that the public can petition these entities to put the data into public domain.”

Or, better yet, BUY the data from that recalcitrant 1%. There’s enough money in the Climate Research budget to pay for it a million times over. (Just look for spare change under the cushions at the IPCC.)

Patrick Hadley (13:39:43) : have changed the location of the stations to warmer places, but made no adjustments to take this into account. As a result CET shows spurious warming over the last ten years. The Met has deliberately debased an historic record so that they can say that temperatures in England are higher than ever.

Forget the station movements!! THEY CRANKED ALL THE DATA DOWN BY AT LEAST 0.5 DEGREES CELSIUS RIGHT FROM THE START OF THE DATASET (apart from Dunedin). It’s so outrageous my mind couldn’t even grasp it!!

RK (13:27:11) :
the reason CRU had not released the global temperature dataset is due to the 1% of the entities who own the data has not agreed to the release. I have 2 suggestions to CRU: (1) release the 99% of the data that is not under the restraint,

“Glen, you are of course right, you can have big differences but that is not important. Let us say that Station A has been replaced by Station B with no overlap, but that both have an overlap of some years with Station C. You compare the difference in average between each of Stations A and B and Station C. Let us say that Station A is 0.8 C warmer than Station C and Station B is 0.3 C cooler than Station C. This means that Station A is 1.1 degrees warmer than Station B. That gives you your correction.”

Not really OT, since the article above says there is no overlap data.

I suspect your method could well introduce a larger bias than with just A and B having overlapping data. Suppose
Station A is 10 miles from B. Station C is 5000 miles from Stations A and B. So technically there is overlap data. Your method would get a correction, would you depend on it? Who gets to choose which station 5000 miles away is the one used, and on what basis would it be chosen, and how would the result be tested?

Each station has a unique weather-condition history, which cannot be interpolated from another’s unless several complete historical factors are known for each station; these do not exist in early historical records (or are not sufficiently accurate). This applies even to stations that are 10 miles from one another.
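For concreteness, the bridging arithmetic quoted a few comments up (comparing non-overlapping stations A and B through a shared reference C) can be written out in a few lines. The function name is mine; the figures are the quoted comment’s:

```python
def bridge_offset(a_minus_c, b_minus_c):
    """Estimate the A-minus-B offset when A and B never overlap,
    using each station's mean difference from a shared reference C."""
    return a_minus_c - b_minus_c

# A is 0.8 C warmer than C; B is 0.3 C cooler than C (i.e. -0.3)
offset = bridge_offset(0.8, -0.3)  # A estimated 1.1 C warmer than B
```

As the objections above point out, the result is only as trustworthy as station C: its distance, exposure history and record quality all feed straight into the estimated correction.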

For those interested, I had an email discussion with Dr. David Wratt, Chief Climate Scientist for NIWA, about this issue and he felt there were no issues with the station location. You can read more here:

The offset, at worst, would cause a step offset in the data starting at Jan 1, 1928, either up or down depending on whether they under or over estimated the effect of the altitude change. The “corrections” caused a rising temperature slope over time.

“Am I wrong in seeing this as a red herring?
The offset, at worst, would cause a step offset in the data starting at Jan 1, 1928, either up or down depending on whether they under- or over-estimated the effect of the altitude change. The “corrections” caused a rising temperature slope over time.
As I see it this station change is irrelevant.”

Others don’t:
“Warming over New Zealand through the past century is unequivocal.”
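One reason people do care about the station change: a step correction is not trend-neutral over the whole record. Lowering one segment of an otherwise flat series by a constant produces a positive fitted slope for the combined series. A synthetic sketch (my own illustrative numbers, not NIWA data):

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1910, 1950))
flat = [14.0] * len(years)  # a record with no real trend at all
# Subtract a 0.8 C "site correction" from the pre-1928 segment only
adjusted = [t - 0.8 if y < 1928 else t for y, t in zip(years, flat)]

slope_flat = ols_slope(years, flat)      # zero: no warming
slope_adj = ols_slope(years, adjusted)   # positive: warming from the step alone
```

So the size and sign of the step directly shape the long-term trend, which is why the adjustment needs to be independently checkable.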

Look, I spend plenty of time in both Kelburn and the airport. I don’t think the airport is warmer than Kelburn as such, it’s down by the south-facing sea (i.e. cold). The Met Service building up in Kelburn is up at the top edge of the Botanical Gardens, prominent to look at from below. It’s just in a cold windy spot, probably the coldest place in Kelburn. But on a sunny non-windy day, the flat roof of a building is a terrifically warm microclimate. Anyway, it looks like NIWA is being systematic about the temperature differentials, but it’s a shame to have to be suspicious they are cooking the books in some fashion. An agenda is a terrible thing, in science.

Well, Mark Steyn (often a great read!) caught part, but not all, of it. Just a few days ago the director (I believe that was her position) of our USA Nat’l Institute of Env. Health was on about this same issue. She was saying in a televised interview that reducing CO2 levels actually will not just help the enviro, but save lives – because reducing CO2 reduces heart disease, reduces respiratory disease, even reduces some types of cancer.

Ya, I’m buying all of that, aren’t all of you? I’m just sure they’ve had some good well designed controlled studies of people in low v high CO2 levels, ones that did a fine job of addressing all possible confounding factors too, right?

Ah, here it is, I’d posted (vented about) it to the WUWT tips & notes section on the 25th:

Tonight’s Jim Lehrer news report just did an interview with, I think, Laura Burnbaum, director of the Nat’l Institute of Environmental Health. She’s going on about how addressing climate change could not only help us economically and environmentally, but could save many, many lives, because reducing CO2 and methane, etc., will, get this, reduce heart disease, respiratory disease, even certain types of cancers!

Oh! I thought that the internationally accepted technique to adjust temperatures was to use tree rings and readjust the temperature of real thermometers. So, what they really should have done is cut a tree at Kelburn and another at Thorndon and do the Mann trick… Et voilà!

Ow man, I was just spraying coffee after this remark :)
——

Good! Evaporative cooling!!! Very good against global warming!

::::::::::::::::::::::::::::::::::::::::::::::::::

Oh, gosh darn, I’m so VERY sorry everyone!!! We just missed a chance at some major global cooling from evaporative effects – I didn’t happen to have a mouthful of coffee at the moment I read the last reply. Blame global warming on me, my bad.

I don’t suppose anyone has an idea of how large an effect on global temps this could have? I mean, with all the averaging, gridding, filling in of missing data, massaging, etc., etc., just how much actual global area is NZ winding up covering/affecting for the last half of the century and the start of this one?

(p.s., that’s what I was referring to in my previous post in terms of how a couple of evaporative events from coffee spews separated by significant distance might account for global effects, based on the way it appears GIStemps & CRU data is ‘mann-handled’ if you’ll forgive the expression)

I support Wratt. Keep all the data secret. Bury it in the backyard. No, hide it in East Anglia, UK. No, that won’t work. Eat it. That’s what secret agents do. Print the data on edible paper and eat it before the dudes with the badges come knocking.

I just took a quick gander at global CO2 levels versus life expectancy across 100-year and 200-year timescales. I think Burnbaum must have her series upside down – or something.

As for the New Zealand data, I’m not sure if anyone has pointed this out but the graph they provided to explain the station change and the adjustment is missing an entire year. 1927 is just gone. Poof. Never happened, even though they opened the new station on Jan 1, the day after the old station closed (Dec 31).

I only noticed it because I assumed the gap was real, and I was frustrated that an entire year was missing.

I’d say they cut it out of the graph to make an obvious, visible gap that would make people think “Gee. The curve could’ve gone down like they said, because there’s room in the gap for a year’s worth of decline.”

whaleoil (14:15:23) :
The building that the station is sitting on was built in the 1970s, so if the station was moved there in 1927, how did it get on top of a 1970s building?
Or is there another move that NIWA aren’t telling us about?

The post is misleading. The building pictured is not Kelburn, in Wellington, but Khyber Pass, in Auckland.

Rational Debate (17:28:17) : I don’t suppose anyone has an idea of how large an effect on global temps this could have? I mean with all the averaging, griding, filling in of missing data, massaging, etc., etc., just how much actual global area is NZ winding up covering/affecting for the last half of the century & start of this?

I suspect quite a large effect. There is not much land in the southern hemisphere, and most of it is Antarctica, with few thermometers.

We have a large number of thermometers and so, I imagine, would punch well above our weight. Especially if our “analysis” of the readings takes us from 0 warming to 1.9 C.

On the Team side, this may be of interest. Apparently problems have arisen, someone has exposed himself, and Mossman might have had Stevenson screens.

“With the SH around 1910s there is the issue of exposure problems in Australia – see Neville’s paper. This shouldn’t be an issue in NZ – except maybe before 1880, but could be in southern South America. New work in Spain suggests screens got renewed about 1900, so maybe this happened in Chile and Argentina, but Mossmann was head of the Argentine NMS so he may have got them to use Stevenson screens early. Neville has never been successful getting any OZ funding to sort out pre-1910 temps everywhere except Qld. Here’s a paper in CC on European exposure problems. There is also one on Spanish series.”

No temperature records exist from the Thorndon and Kelburn stations in the same year, so NIWA has estimated the amount of adjustment that needed to be made.

Their reasoning is faulty. Thorndon 90 years ago must have been pretty rural. What NIWA must do is put up a weather station in a rural area very close to Thorndon and see what the temperature difference is over a year between the Airport and Thorndon (which should be cooler), and subtract that difference from the adjustment they have taken.

“No temperature records exist from the Thorndon and Kelburn stations in the same year, so NIWA has estimated the amount of adjustment that needed to be made.

Their reasoning is faulty. Thorndon 90 years ago must have been pretty rural. What NIWA must do is put up a weather station in a rural area very close to Thorndon and see what the temperature difference is over a year between the Airport and Thorndon (which should be cooler), and subtract that difference from the adjustment they have taken.”

Knowing what the weather is today doesn’t give us accurate information as to the weather pre-1929. Today the average over a year might be 0.5 C, but in 1922 it may have been 0.3 C or 0.7 C. This averaging of an average against an average is pseudoscience.

Nick Stokes (19:08:00) : This post is further misleading. Not only is the building pictured in Auckland rather than Wellington, but the linked documentation describes an air quality monitoring station, not meteorological

The post is not misleading, though the picture may be. It is not Wellington, but it gives an idea of the quality of the station.

“..Thorndon 90 years ago must have been pretty rural. What NIWA must do is put up a weather station in a rural area very close to Thorndon and see what the temperature difference is over a year between the Airport and Thorndon (which should be cooler), and subtract that difference from the adjustment they have taken.”

Knowing what the weather is today doesn’t give us accurate information as to the weather pre-1929. Today the average over a year might be 0.5 C, but in 1922 it may have been 0.3 C or 0.7 C. This averaging of an average against an average is pseudoscience.

Not quite pseudoscience. I don’t know why you are mentioning “averages”. Where do averages come in? What we are interested in is the difference between a station like Thorndon 150 to 90 years ago and the Kelburn of that time. Since we can’t have this difference, this is the next best thing.

BY THE WAY: NIWA HAS REMOVED THAT PRESS RELEASE FROM THEIR SITE! IT’S NO LONGER AVAILABLE, AND EVEN A SEARCH DOES NOT TURN IT UP!

brnn8r (11:53:31) : says “In New Zealand we have a version of the FOIA it’s called the official information act 1982 ……. Would it be worthwhile me testing the waters to see if I can get the data released under the act?”

Yes indeed, please make the request. What happens too often is that everyone assumes that someone else has made the request, and nothing ends up going in.

The head post says “Wratt is refusing to release data his organisation claims to have justifying adjustments on other weather stations, meaning the science cannot be reviewed.”

Perhaps the reason is that they no longer know why it was done. What they should be able to release is all raw data and the metadata/station history.

Nick Stokes (19:47:47) : Richard (19:26:05) :
Yes, they measure meteorological data. So do I. But it’s an air quality station – there’s no indication that the met data is used for any met or climate purpose

Well, if you treat the station as three different measurements, and just go from the start to the end of each series, Thorndon goes up by 0.05C, Kelburn down by 0.4, and the airport up by 0.15. When I add those together I get -0.2C.
Their final graph has it going +1.25C.

That’s not how things are done, of course, but it does highlight what a big adjustment they’ve made, possibly bringing new dangers to New Zealand’s farm economies throughout WW-I and the roaring twenties. *gasp*
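The back-of-envelope splice check described above amounts to summing each station’s start-to-end change within its own segment, so that no join adjustment enters. The segment figures are the ones read off the graphs in the comment, so they are approximate:

```python
# Start-to-end change within each station's own record (degC, from the comment)
segment_change = {
    "Thorndon": +0.05,
    "Kelburn": -0.40,
    "Airport": +0.15,
}

within_segments = sum(segment_change.values())     # about -0.2 C overall
adjusted_series = 1.25                             # what the final graph shows
from_splicing = adjusted_series - within_segments  # ~1.45 C introduced at the joins
```

As the comment says, this is not how homogenisation is actually done, but it does isolate how much of the composite trend comes from the splice adjustments rather than from within-station change.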

You know, if a stock analyst kept applying adjustments to companies’ past performance numbers, somebody might take serious issue with it, to say the least.

Much of this discussion has been around how to get accuracy at a single station. The main published records use around 5000 records. Ideally, for quality control, each station should go through a rigorous process. That is difficult enough in New Zealand, which has good metadata, but almost impossible in countries which have had civil wars and/or major upheavals.

RK (13:27:11) :
the reason CRU had not released the global temperature dataset is due to the 1% of the entities who own the data has not agreed to the release. I have 2 suggestions to CRU: (1) release the 99% of the data that is not under the restraint,
———-
[response by bill]:
“This has already been released by giss for many years”
==========

I thought that CRU had released its data only confidentially to a university in Georgia, and was still stonewalling requests for its data on the grounds of ownership by others. It may be that a statement that the data CRU is holding back is the same as that GISS uses would be sufficient to proceed with an evaluation of the CRU dataset. But until that’s said, or CRU releases the data, critics can’t “get a handle” on what they’re attacking.

NIWA fudging data is not a new issue. A couple of years ago I tried to reconstruct their temperature record after I saw them ignore a local, well-sited, long-term record for Christchurch in favour of a station moved three times in a small, fast-growing town far out of Christchurch. The graph they showed in the MfE report was used as evidence of global warming, but I noticed they had also truncated this station to hide the warmer-than-today temps in the 1940s. This led me to do a total reconstruction for NZ. Using fudged GISS data I got a very small warming trend, much less than theirs; using their raw data I got no trend; and when I adjusted the raw data for shifts, as they claim they do, I still got virtually no warming. They don’t have a leg to stand on! PS: I have pics of several well-sited NIWA stations. You can access their raw data for free from CliFlo on their website.

The layman tends to be torn between one point of view and another opposing view when reading conflicting reports.
The initial report on the NZ temp graph showed how the raw data showed no growth while the adjusted data showed a growth.
The approach taken was plausible, understandable and believable.
Quite clearly the raw data showed no growth but the adjusted data did and the conclusion was that this was deliberate and an attempt at deception in the name of AGW.
The rebuttal has been strong.
It has not been quite so moderate in its language and it has attacked the debunkers and their qualifications.
But again, it seems to be robust, i.e. sound. The rebuttal explains why many such temperature adjustments are necessary, attempts to suggest how the corrections are arrived at, and illustrates this with a weather-station move.
However, while I initially tended to think “Oops! Not proven after all,” I took another look.
It troubled me that so many weather stations were at airports. Since the 1950s there has been significant expansion in air traffic and improvements in aerodrome facilities: new hardstandings, more car parks, more buildings and air conditioners, and so on.
One wonders if what has been measured doesn’t bear a far greater correlation with the growth and advances in air traffic than it does with climate change or CO2.
Then we have the questions about why the data for rainfall begin in the 1950s when reliable data from much earlier are available. It is this truncating of the available data that is also a concern, and we have seen this used to hide the decline, for example.
But in the end, I revert to the original raw data.
This seems to show, despite station moves etc., no significant increase.
It would seem to me that if the bulk of the data were robust and showed an increase, then interpolating data at other sites (which I find hard to understand: if you don’t have measurements, inferring what those measurements would be if you did have sensors seems fraught with risk) might suggest that the adjustment should include an increase.
However, where the bulk of the data shows no such increase, how can you justify including one?
Finally, why should we adjust data?
Why should we need to infer data for any particular site?
I can see a justification for an individual site if you want to try and understand some local climatic or weather situation at that site.
If I want to know how the weather at a location has changed over the years and there is no weather station at that precise location, then I can see the validity of interpreting or inferring what the history might have been from the history at surrounding sites.
However, when we are trying to deduce global climatic conditions over time and project the future, I cannot be so confident of the rightness of including and manipulating such data. I am especially concerned where the manipulations are of a magnitude greater than the global projections.
Take two stations, one reading 4 degrees higher than the next nearest, and suppose we want to interpolate the data for a virtual station in between.
We could simply average the data and say that it is 2 degrees higher than one and 2 degrees lower than the other. If we now average the three sites, we find no difference from averaging the two real sites.
Thus there is no need for the virtual site.
However, if the correction is more complex and the virtual station data is assumed to be 1 degree lower than one of the real stations and 3 degrees higher than the other, then we have added a distortion. The average of the three stations, one virtual, compared to the average of the two real stations, is now higher. This raises very serious questions about the appropriateness of the corrections.
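The arithmetic in the two interpolation cases above is easy to verify directly. The station values below are placeholders; only the 4-degree separation comes from the example:

```python
y = 10.0        # cooler real station (placeholder value)
x = y + 4.0     # warmer real station, 4 degrees higher

avg_real = (x + y) / 2                # average of the two real sites: 12.0

v_mid = (x + y) / 2                   # virtual site: 2 below x, 2 above y
avg_with_mid = (x + y + v_mid) / 3    # still 12.0: midpoint adds nothing

v_skew = x - 1.0                      # virtual site: 1 below x, 3 above y
avg_with_skew = (x + y + v_skew) / 3  # ~12.33: the regional average is now higher
```

The skewed virtual site raises the three-station average by a third of a degree even though no real reading changed, which is exactly the distortion being described.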
I think I would far rather see methodology evolve that treats with the real data only but which addresses the missing stations in a different manner than currently used.
It seems to me it has been too easy to take raw, real data showing no temperature growth and manipulate it such that year-on-year growth has occurred, with no justification other than the temperature-adjustment algorithms used. In such a case it is easier (so says Occam) to believe that the corrections are wrong than to believe the Earth has entered a runaway warming cycle. I should therefore like to see an independent verification of the change by some indirect method, i.e. one that uses a related parameter which is not temperature: because A is responsive to B just as C is responsive to B, where we have missing data in A, or anomalies in the A data, we can correct them using the C data as a reference.
Hmm.
So what about rainfall?
Lots of claims that Australia, for example, shows serious rainfall disruption since 1950. But that 1950 cut-off date is claimed to be a false one; reliable data extend far earlier. So while the post-1950 data might suggest some correlation with temperature growth, the full data set does not.
On balance, I tend to agree with the initial report that NZ temperature records do not justifiably show any evidence of global warming. It is an artifact of the correction and nothing else.

I see there is still no retraction of the claim that the apparatus pictured in this post is a NIWA weather station, rather than part of an air quality monitoring station.

REPLY: tell me Nick, if you wanted to get accurate meteorological data, to use in conjunction with a science study of air quality sampling, would your first choice be meteorological data from the top of a building near a/c heat exchangers and vents? if so please describe then how it is OK to use data gathered in such a questionable environment to support a science study of air quality. That aside, are you certain the data from this station is not used for anything else? Say maybe for a FILNET like algorithm for missing data from other stations? The point being made here is that its a bad place for a weather station, period. – A

“Glenn. If the third station (Station C in post above) with an overlap was 5000 miles away it would not work. But in New Zealand stations are never 5000 miles apart.”

Then how close is “good enough for government work” to apply a standard temperature/altitude rule? The “adjustment” was 0.79 C:

“Wellington Airport is +0.79°C warmer than Kelburn, which matches well with measurements in many parts of the world for how rapidly temperature decreases with altitude.”
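That quoted figure is close to what the standard environmental lapse rate, roughly 6.5°C per 1000 m, would predict. Applying it to the Thorndon and Kelburn elevations given in the head post (3 m and 125 m):

```python
LAPSE_RATE = 0.0065  # degC per metre, the standard-atmosphere value

kelburn_m = 125      # elevations from the head post
thorndon_m = 3
predicted = LAPSE_RATE * (kelburn_m - thorndon_m)  # ~0.79 C cooler at Kelburn
```

This checks only the plausible magnitude of the offset; it says nothing about how the offset should be estimated in the absence of overlapping records, which is the point in dispute.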

NIWA claims “Thorndon (closed 31 Dec 1927) has no overlap with Kelburn (opened 1 Jan 1928)”. Yet the start point for Kelburn and the end point for Thorndon show only about a 0.2 C difference on the first graph:

I suspect something is very wrong here. Were Thorndon adjusted down by 0.2 C to 0.5 C, there would be no upward trend from 1910 to the present. Thorndon shows no trend up or down for a 15-year span ending in 1929; Kelburn shows no trend up or down for 15 years starting in 1929 and ending around 1945.

There was a drop in temperature. Even by NIWA’s second chart, temps quickly dropped 0.3 C or 0.4 C right after Kelburn took over, and stayed low till around 1945, when temps increased from 1945–1960 (approx.) back up to around 0.4 C above pre-1929 levels.

Kelburn shows no increased trend from 1960 to the present (end of charted date).

It appears to me, using *real* data (although admittedly only data inferred from the charts) that the area is the same temp today as it was in 1910, no overall upward trend at all, only a temporary drop in temp in the 30s – 40s.

Anthony,
Air quality measurement (for CO etc) is necessarily the measurement of an artificial environment. If you take other measurements of the sampled air (temperature etc) you want it to be representative of the sample. It is not meant to represent the climate. There is no indication that NIWA has ever presented the data as representative of the climate.

That said, it’s not clear to me that the box depicted is even a met measuring device. What is your evidence?

Anthony,
No, you’re ducking. You said in your post: “the weather station for National Institute of Water and Atmosphere (NIWA) is, right on the rooftop next to the air conditioners”.
It isn’t a weather station, and it isn’t NIWA. It is run by, and the data belongs to, the Auckland Regional Council, which has nothing to do with meteorology. They measure peak pollutant levels.

And there is still no proof that the box you’ve circled isn’t just another aircon.

REPLY: The volunteer surveyor who took the photo circled the box, and if you zoom in you can see the gill IR shield on the pole, there are two others on poles also. Bear in mind the surveyor interviewed there at the NIWA building.

From the PDF survey form by the volunteer who took the photographs:

Address
National Institute of Water and Atmosphere
269 Khyber Pass Rd
Newmarket, Auckland

And…right here it says the same thing, same address. It’s NIWA, read it on this link:

Marina Baldi, author #4 has an address there.
4 National Institute of Water and Atmospheric Research Ltd., 269 Khyber Pass Rd, Newmarket, Auckland, New Zealand

Seems plausible that if they are studying the output of 143 weather stations in the Pacific Basin and writing papers about it originating from that address, that they might have a weather station there at their office? That NIWA office?

So contrary to your claim of “..and it isn’t NIWA”, it is in fact NIWA, and it is in fact a weather station, complete with 3 gill IR shields for measuring temperature and/or humidity/DP, and an anemometer/wind vane on a post. Yes, they also measure air quality there. But they also measure meteorological data.

And you still have not answered my question, do you think it is a good idea to measure meteorological data (for ANY purpose) on a rooftop next to a/c heat exchangers and exhaust vents?

Nick Stokes (14:45:40) : ..It isn’t a weather station, and it isn’t NIWA. It is run, and the data belongs to, the Auckland Regional Council, which has nothing to do with meteorology. They measure peak pollutant levels.

And there is still no proof that the box you’ve circled isn’t just another aircon

Nick Stokes (19:47:47) : Richard (19:26:05) : Yes, they measure meteorological data. So do I. But it’s an air quality station – there’s no indication that the met data is used for any met or climate purpose

“The New Zealand Climate Science Coalition welcomes the commitment yesterday from Dr David Wratt, Chief Scientist (Climate) at NIWA, to, in his words, provide “robust information to help all New Zealanders make good decisions,” said Coalition secretary, Terry Dunleavy.

“Our coalition would be happy to assist NIWA in making this information widely available,” said Mr Dunleavy. “Nothing released by NIWA so far allows their methodology to be replicated easily. So in order to not waste valuable NIWA staff time, we suggest NIWA release the procedures and allow others to replicate the work. The coalition is quite happy to post the procedure and its replication online to allow public audit.

“In the spirit of his commitment yesterday, we call on Dr Wratt to answer the following three questions.

“1. The graph of the New Zealand temperature record on the NIWA website is based on just seven weather stations. What, precisely, gives NIWA confidence that they are representative of the whole country?

“2. What, precisely, are the adjustments made to the temperature readings at each of those seven stations and when were they each made? We request access to the raw data involved in the making of these adjustments.

“3. What, precisely, are the reasons each station was so adjusted?”

The Coalition says Dr Wratt’s release mentioned specifically that NIWA climate scientists had previously explained to members of the Coalition why such corrections are made. Mr Dunleavy comments: “We disagree. We have no record of receiving an explanation. NIWA has in fact refused numerous requests over the years to disclose the corrections. The most recent one was a written request to Dr James Renwick – over a month ago – still unanswered. So we would be grateful to hear what Dr Wratt is referring to, when the information was sent and to whom.

[This was specifically posted on their website. Since removed? Bloody Liars!]

“We further note that access to this information is necessary for others to examine and replicate NIWA’s results, but in any case, it was gathered on behalf of New Zealanders and we are all entitled to see it,” Mr Dunleavy concluded.”

Here it is from their website “NIWA climate scientists have previously explained to members of the Coalition why such corrections must be made. NIWA’s Chief Climate Scientist, Dr David Wratt, says he’s very disappointed that the Coalition continue to ignore such advice and therefore to present misleading analyses.”

Anthony,
If you’re interested in weather measurements, you won’t take readings at a site designed to measure peak air pollution (in busy traffic). The latter is what the ARC monitors, and I suspect the location pictured is the best they could get on the site, to be near the pollution measurement. But to answer your question, if you want measurements that are representative of Auckland’s weather, no, it isn’t a good idea. That isn’t ARC’s role.

Yes, it’s a NIWA building. But it is Auckland Regional Council equipment, and as your linked document unambiguously says, the data is owned by ARC, a municipal body. There are very good reasons why NIWA would not have a weather station there. It’s in a location of high traffic density, with no open space nearby. Good for pollution measurement, bad for weather.

REPLY: And all I’m saying is that it’s not a good idea to measure meteorological parameters on top of a building. In this case it is the NIWA building. It would seem that if they are doing studies for air pollution, using that meteorological data, it would suffer from the same sort of issues. -A

Richard (15:52:43) :
The site I (and you) linked lists measuring stations by location. And yes, this site is located at the NIWA building. But you’ll see the data listed is only of pollutants.

My point in saying “so do I” was to indicate that anyone can say they measure met variables. It doesn’t indicate that I (or the ARC) operate a weather station. I measure temperature and pressure for my own curiosity.

Nick Stokes (17:03:23) : Richard (15:52:43) : My point in saying “so do I” was to indicate that anyone can say they measure met variables. It doesn’t indicate that I (or the ARC) operate a weather station. I measure temperature and pressure for my own curiosity

I would just like to point out that the temperature readings used by NIWA don’t come from their building – the photo above does show their building and weather station, but the official data comes from the Kelburn weather station which is situated alone in an enclosure at the top of Kelburn near the MetService Building – it is not mounted on a roof or near exhaust from air conditioners.

Glenn,
Sorry about that faulty link. Yes, it is the one you listed. And just above the map:
Data Owner ARC,
************************

This is nitpicky, since it doesn’t matter in “the real scheme of things”, but that isn’t quite unambiguous, Nick.
My doctor is the data owner of my medical record.
The meaning of the term in NZ, and on that webpage, may not mean what you think it means. It could mean that ARC has the responsibility and authority to take and keep official measurements for NIWA. Regardless, ARC is obviously providing NIWA with temperature data, or has been since the 7th of Nov, according to the data availability page. Look up the data yourself.

jfrater (18:02:05) : I would just like to point out that the temperature readings used by NIWA don’t come from their building – the photo above does show their building and weather station, but the official data comes from the Kelburn weather station which is situated alone in an enclosure at the top of Kelburn near the MetService Building – it is not mounted on a roof or near exhaust from air conditioners

jfrater – are you somewhat challenged in English comprehension? Is English not your first language? Just read the correspondence above and try to understand it. Take your time. Use Google translation if necessary.

Glenn,
Well, we could go round and round with technicalities. But the site is clearly placed to suit ARC pollution monitoring requirements, which constrains their choice of places for measuring temp etc. I think they did the best they could.

Glenn,
Well, we could go round and round with technicalities. But the site is clearly placed to suit ARC pollution monitoring requirements, which constrains their choice of places for measuring temp etc. I think they did the best they could.
******************

I did the best I could when I rolled a truck down the mountain. I think you hit the nail on the head, you’re a round and round guy. I seriously doubt you have any idea why the ARC building site was chosen or whether it’s the “best” site for monitoring air pollution, or whether they even know that is true. But that wouldn’t constrain either ARC or NIWA’s choice of places for measuring temps.

Quit while you’re behind. Almost everything you have said on the subject has been flatly wrong, and much more misleading than a possible criticism of Anthony’s choice of language. I for one did not interpret the picture header as implying that NIWA only had one station in New Zealand. And it was apparent to me that he was reacting to NIWA’s claim “NIWA’s analysis of measured temperatures uses internationally accepted techniques, including making adjustments for changes such as movement of measurement sites.”

jfrater (18:02:05) : I would just like to point out that the temperature readings used by NIWA don’t come from their building – the photo above does show their building and weather station, but the official data comes from the Kelburn weather station which is situated alone in an enclosure at the top of Kelburn near the MetService Building – it is not mounted on a roof or near exhaust from air conditioners

“jfrater – are you somewhat challenged in English comprehension? Is English not your first language? Just read the correspondence above and try to understand it. Take your time. Use Google translation if necessary.”

That’s not nice and not fair. jfrater politely explained where the Kelburn temp readings come from, providing meaningful information to the group.

Well I apologise to jfrater for being impolite and thank him for meaningful information on Kelburn temp readings. But he does so in reference to the NIWA Khyber pass building in Auckland and in the midst of discussion about this building, so his information is a little incongruous.

Richard (17:58:31) :
Instead of wasting everyone’s time on a non-issue, do you have anything to say on Richard (16:17:35) ?
Yes. See Steve McI. 2007. He goes through all the adjustments for Wellington, including informative correspondence from Dr Wratt. This information has long been available.

Concerning this “Internationally accepted technique” of adjusting station temperature by altitude:

“NIWA’s analysis of measured temperatures uses internationally accepted techniques, including making adjustments for changes such as movement of measurement sites. For example, in Wellington, early temperature measurements were made near sea level, but in 1928 the measurement site was moved from Thorndon (3 metres above sea level) to Kelburn (125 m above sea level). The Kelburn site is on average 0.8°C cooler than Thorndon, because of the extra height above sea level.”

Looks to be the same attitude – or worse – as Jones et al… I call it “stuff and staff”:

“There’s been a whole lot of work behind this in terms of things like having overlaps between particular stations when they’ve moved. There’s a whole methodology, internationally accepted, where you actually work out how to correct for these sorts of site changes and so on.”

“But you’ll be providing all that shortly?”

“Well, we’re not going to run around in circles just because somebody has put out a press release. We will continue to put out what is reasonable to provide.”

“Wouldn’t it be important –“

“No!”

“…for people to see the comparison studies between both sites?”

“Look, we’re talking about scientific studies here. I’ve told you we’ll put out information about Wellington. Basically it’s not up to us to justify ourselves to a whole lot of people that come out with truly unfounded allegations. We work through the scientific process, we publish stuff through the literature, that’s the way that we deal with this stuff and I can’t have my staff running around in circles over something which is not a justified allegation. The fact that the Climate Science Coalition are making allegations about my staff who have the utmost integrity really really pisses me off.”

Richard (17:58:31) :
Instead of wasting everyone’s time on a non-issue, do you have anything to say on Richard (16:17:35) ?
Yes. See Steve McI. 2007. He goes through all the adjustments for Wellington, including informative correspondence from Dr Wratt. This information has long been available.
*********************************

Oh no. Steve’s interest was in Hansen’s adjustments and records. He did NOT “go through all the adjustments” in the sense of determining the *specific* changes and reasons for those changes with regards to Thorndon, Kelburn and the airport that were made by NIWA.
This can be seen in Steve’s article by the absence of any agreement, argument, or even acknowledgement of the quoted claims made by NIWA’s David Wratt specifically with regard to the reasons for the adjustment. It’s worth repeating:

“It is time to put on record once again what some contributors here already know: Fitting a linear trend to the entire unadjusted NASA GISS plot for Wellington is physically meaningless. This is because as time progresses this unadjusted plot uses measurements from different sites at different altitudes.

Much of the data used by GISS comes from the site at Kelburn, which is in a park at about 125m above sea level. This data set does not start until January 1928. Before this, the Wellington measurements were taken much closer to sea level. For example from July 1912 until December 1927 they were from a site in Thorndon, at 3m above sea level. As you know, there is a drop off of temperature with height which occurs (on average) in the atmosphere.
We have made this point already to some people who continue to publicly misinterpret the unadjusted NASA GISS plot for Wellington in this way.

I should also mention here that there is a further data set from measurements 4 m above sea level at Wellington Airport, commencing in January 1962.

To find the real trend, you must make appropriate homogeneity adjustments which take account of the differences in height between the various measurement series and also use the time when the Kelburn and Wellington Airport temperature series overlap. When you do that, you will find a long-term warming trend for Wellington through the 20th century.”

Wratt makes a similar claim there as has been given recently, defending the adjustment of Thorndon and the airport based on an “average” altitude difference in temperature included as one of the “internationally recognized techniques” used. Steve did not specifically address this issue, agree or disagree with it.
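The overlap method Wratt describes can be sketched roughly as follows: estimate the mean offset between two stations over the years both report, then shift one record by that offset before splicing. This is a generic illustration with invented numbers, not NIWA's actual procedure or data:

```python
# Hypothetical illustration of an overlap-based homogeneity adjustment.
# All figures below are invented for illustration only.

def overlap_offset(ref, other):
    """Mean (ref - other) over the years present in both records."""
    common = sorted(set(ref) & set(other))
    if not common:
        raise ValueError("no overlap: offset cannot be estimated")
    return sum(ref[y] - other[y] for y in common) / len(common)

def splice(ref, other):
    """Shift 'other' onto 'ref' by the overlap offset, then merge,
    preferring 'ref' wherever both records have a value."""
    off = overlap_offset(ref, other)
    merged = {y: t + off for y, t in other.items()}
    merged.update(ref)
    return merged

# invented annual means (deg C): a hill site and a sea-level site
kelburn = {1962: 12.6, 1963: 12.4, 1964: 12.7}
airport = {1962: 13.4, 1963: 13.2, 1964: 13.5, 1965: 13.6}

off = overlap_offset(kelburn, airport)   # about -0.8: airport runs warmer
series = splice(kelburn, airport)        # 1965 value shifted to Kelburn terms
```

Note that this only works at all when an overlap exists; with Thorndon closing before Kelburn opened, as discussed above, no such direct offset can be computed between those two sites.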

The Climate Coalition requested *specific* reasons. Perhaps you don’t, but I consider adjusting station temp based solely on average altitude differences to be highly absurd, and easily proven a faulty technique. NIWA has said that no adjustments for UHI were needed. I assume they exclude all location influences, not only the urban growth they mention as not being a factor. Perhaps the Climate Coalition think there is just enough wiggle room in NIWA’s statements to indicate altitude is not the only reason why Thorndon and the airport were lowered .79C, as do I.
Prodding NIWA into releasing files that would pin them down on precisely how and why this adjustment was made would appear to be Climate Coalition’s goal with respect to this area of New Zealand.

Glenn,
Of course, SM did not agree with the adjustment process – that would be too much to expect. My point is that he had access to the adjusted and unadjusted information, plus the station history story, augmented by the email from Wratt.
This was an issue in Richard’s post – a claim that the unadjusted data was withheld, and the adjustment could not be determined. And this post and its predecessor present the unadjusted data as a discovery. But Steve had it, and analysed it. It’s all on GISS here.

Nick Stokes (11:12:26) :
Richard (17:58:31) : .. do you have anything to say on Richard (16:17:35) ?
Yes. See Steve McI. 2007. He goes through all the adjustments for Wellington, including informative correspondence from Dr Wratt. This information has long been available

Well that’s a revelation to me and shows that I shouldn’t accept everything at face value, no matter where it comes from. It seems Dr Wratt did have some justification for his claims after all. Though supplying some information to CA is not the same thing as supplying the coalition with the information it asked for.

I don’t agree with him not supplying the details of the adjustments made to the temperature readings at each of those seven stations, and not making available the raw data involved in the making of these adjustments.

I think it is very arrogant and wrong for him not to do so, to say the least.

And, as I have pointed out earlier, Wellington airport today is not the same thing as Thorndon a century ago. They would need to select a site as similar as possible to the Thorndon of a century ago and calibrate the difference between that site and Wellington airport as a further adjustment.

Glenn,
Of course, SM did not agree with the adjustment process – that would be too much to expect. My point is that he had access to the adjusted and unadjusted information, plus the station history story, augmented by the email from Wratt.
This was an issue in Richard’s post – a claim that the unadjusted data was withheld, and the adjustment could not be determined. And this post and its predecessor present the unadjusted data as a discovery. But Steve had it, and analysed it. It’s all on GISS here.
***************************

It isn’t clear that Steve McIntyre ever gained access to actual raw data from any station in that area. He had graphs of data, but that isn’t raw data, nor necessarily drawn from raw data. The issue Richard seems concerned with is the method of adjustment of the data. Steve didn’t “have” that, and your GISS link doesn’t list any of the stations by name (Thorndon, Kelburn or the airport).

Steve provided a link to NIWA (the same as I provided to you) that Warwick Hughes had given him, but apparently discovered the data is only available to registered users, although free, via an inquiry form he described as possibly needing “scraping”. Steve put scare quotes around “raw” to describe NIWA data. It seems he may only have had access to graphs.

It may be that the NIWA database does not hold records for some older stations no longer reporting; they may have archived some of the data or only provide “analyses” graphs. I haven’t checked as I’m not a New Zealander and do not wish to register. But it isn’t clear to me that data for any particular station or all stations are available, or easily available without special request. And you yourself said “No actual data that I could find” after I provided a page that had a link to “data availability” right at the top of the page.

Glenn,
The digital GHCN data is at this NCDC site. It includes unadjusted and adjusted. I think Steve M’s conclusion was that NIWA’s adjusted data matched, although he found issues. The station identification is on the .inv files. It’s brief, but enough to identify the site. For example, Wellington is at 128m, which suggests Kelburn (there’s lat and long too).

“To believe NIWA’s excuse of correcting for measurement station movement, you’d have to believe that it was just an amazing coincidence that all the stations moved in just such a way over the decades so as to cancel out all the warming that was simultaneously taking place and leave an amazingly flat record in the uncorrected raw data.”

Not really. There are, what, 7 stations total in New Zealand? It’s a small place, and the uncorrected data is not exactly amazingly flat…

“Why not simply accept that some stations have data for a certain range of years and that other stations have data for other ranges of years?”

Someone else has already pointed this out – because it makes it difficult to do any sane sort of averaging across all of the stations if you only have measurements for some of them.
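The point about gappy records can be made concrete: when a cool station drops out, a naive average of absolute temperatures jumps even though nothing warmed, while averaging anomalies from each station's own baseline does not. A minimal sketch, with invented numbers:

```python
# Why naive averaging over stations with gaps misleads: when a cool
# (e.g. high-altitude) station stops reporting, the raw average jumps
# even though no station warmed. Anomalies from each station's own
# baseline avoid the artefact. All numbers are invented.

def naive_average(records, year):
    vals = [r[year] for r in records if year in r]
    return sum(vals) / len(vals)

def anomaly_average(records, year):
    anns = []
    for r in records:
        if year in r:
            base = sum(r.values()) / len(r)   # station's own mean
            anns.append(r[year] - base)
    return sum(anns) / len(anns)

warm = {1: 15.0, 2: 15.0, 3: 15.0}
cool = {1: 11.0, 2: 11.0}          # station closes after year 2

# raw average jumps from 13.0 to 15.0 in year 3: spurious "warming"
jump = naive_average([warm, cool], 3) - naive_average([warm, cool], 2)
# anomaly average stays flat
flat = anomaly_average([warm, cool], 3) - anomaly_average([warm, cool], 2)
```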

BTW. the station belongs to the MetService and not NIWA (as does the AWS at the airport).

“Instead, as their news release records, they simply guessed that the readings taken at Wellington Airport would be similar to Thorndon, simply because both sites are only a few metres above sea level.”
No, they looked at the period of the airport record that overlaps the Kelburn record, as the press release you link to shows.

“is applying a temperature example from 15km away in a different climate zone a valid way of rearranging historical data?”
Do you have any evidence that they actually are in different “climate zones”?

I registered for access to the NIWA database today, curious to see if temp records were available for Thorndon and Kelburn around the time one closed and the other began (as seen in the NIWA graph), in order to verify NIWA’s claimed “internationally accepted technique” station adjustment of .79C.

A search of all other stations within the 5 km radius results in NO comparable stations with regard to same time and altitude difference, for ANY period of time.

NO historical data for any station close to Thorndon for periods prior to or during the time Thorndon was open, at any altitude.

In short, I can’t even verify that Thorndon ever took temperature measurements, have no access to any that may exist, and no access to any temps from any other station from the same time period to make any analysis or comparison between stations such as Thorndon and Kelburn (or any others) of temperature by altitude or any other criteria.
This lack of data produced by the database queries may be the result of a lack of user permission, or archived/unavailable records. NIWA does not provide specific reasons for a lack of data on the website.

It’s not unreasonable to assume it true that NIWA has not released raw data or the specific methodology for the adjustments seen here:

What can be inferred from the graph is that Kelburn was only around .2C cooler than Thorndon when it closed, not .79C. If using the average temperature lapse rate is the sole method NIWA used to lower Thorndon .79C, it is easily shown that this technique does not hold even approximately for many stations separated by a short distance and an altitude difference of 125 metres.
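For reference, the arithmetic behind a pure lapse-rate adjustment is trivial to check: the standard atmosphere's environmental lapse rate of about 6.5 °C per km, over the 122 m height difference between the two sites, gives almost exactly the disputed figure. This only shows the number is consistent with such a calculation, not that this is how NIWA actually derived it:

```python
# Back-of-envelope check of the .79C figure against the standard
# environmental lapse rate. Consistency with this arithmetic does not
# confirm it was NIWA's method.

LAPSE_RATE = 6.5 / 1000.0     # deg C per metre (ICAO standard atmosphere)

thorndon_alt = 3.0            # metres above sea level
kelburn_alt = 125.0

delta_h = kelburn_alt - thorndon_alt          # 122 m
expected_cooling = LAPSE_RATE * delta_h       # roughly 0.79 deg C
```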

To quote from Richard (16:17:35) :

“In the spirit of his commitment yesterday, we call on Dr Wratt to answer the following three questions.

“1. The graph of the New Zealand temperature record on the NIWA website is based on just seven weather stations. What, precisely, gives NIWA confidence that they are representative of the whole country?

“2. What, precisely, are the adjustments made to the temperature readings at each of those seven stations and when were they each made? We request access to the raw data involved in the making of these adjustments.

“One would have thought that as well as adjustment for elevation there would have had to have been an adjustment for the heat island effect at some stage in the Kelburn data.”

To my eye, an increasing trend depends on the last “Airport” piece of the graph. Without that increase, the rest of the Airport record (approx. 1960–1990) and Thorndon show no increase.

I haven’t identified the stations these records are supposed to come from.
But adjusting for UHI and local effects would not be easy, scientifically. Many researchers dismiss or do not account for UHI, and I have found little literature or testing on the subject. Getting lucky and finding stations close together to compare may carry some weight, but in the absence of that there must be many factors that would need to be taken into account to assign an adjustment value. Of course it’s real and needs to be taken into account; no one denies the reality of UHI etc. So ignoring it to show a trend is IMO pseudoscientific garbage.

One thing I’d like to see is a timeline of jets in the area of a tmp station at an airport, compared to minute by minute temperature readings. Of course, every airport could be different, so no average value could be used for heat adjustments unless a lot of these experiments could be done, and certain parameters defined.

The image at the start of the page refers to an alleged weather station on top of the NIWA buildings – which are in Wellington and not in Auckland. And the Khyber Pass station is owned by the council (as the station info shows).
How can the author of this article claim that whatever it is you can see in the image is used to determine temperature trends in NZ?
But never mind, it’s good enough to cast doubt and prove how stupid or -better- malicious these scientist bastards are.

“Unless this data, which appears as raw data, is already adjusted, there is nowhere a difference of .79C between these stations. Have a look”

Funny that. I get an average difference of 0.81 and 0.9 degrees for the maximum and minimum temperatures.
BTW anybody who wants to replicate, be aware that the data for 20091010 is missing for one station.
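Anyone replicating that comparison has to handle the missing day mentioned above; a common approach is to average the difference only over dates present in both records. A sketch with invented readings (the values and date keys are placeholders, not the actual station data):

```python
# Sketch of the replication step described above: average the daily
# temperature difference between two stations, skipping any date
# missing from either record. All readings are invented.

def mean_difference(a, b):
    common = [d for d in a if d in b]          # dates present in both
    diffs = [a[d] - b[d] for d in common]
    return sum(diffs) / len(diffs)

lower = {"20091008": 17.0, "20091009": 16.5, "20091010": 18.0, "20091011": 17.5}
upper = {"20091008": 16.2, "20091009": 15.7,                    "20091011": 16.6}

avg_diff = mean_difference(lower, upper)   # 20091010 is silently excluded
```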

As for the questions of jets and temperature sensors.
You do realise that these stations are used by weather services, don’t you? That is particularly true for the Wellington airport AWS which belongs to the MetService (not NIWA). These weather services depend on high quality measurements to create their forecasts and thus revenue. They would know if jets (or air conditioning exhausts) would screw up the signal. But still they might also be as incompetent as these climate scientists, so why don’t you write to the Metservice (www.metservice.com) and ask them for the AWS data from the airport and do the study yourself?

“The image at the start of the page refers to an alleged weather station on top of the NIWA buildings – which are in Wellington and not in Auckland. And the Khyber Pass station is owned by the council (as the station info shows).
How can the author of this article claim that whatever it is you can see in the image is used to determine temperature trends in NZ?
But never mind, it’s good enough to cast doubt and prove how stupid or -better- malicious these scientist bastards are.”

It isn’t an “alleged” station. NIWA collects data from a variety of sources, including MetService and ARC. The author did not claim that “whatever it is you can see on the image” is used to determine temp trends, but the picture *is* a good example of incompetence.

“Unless this data, which appears as raw data, is already adjusted, there is nowhere a difference of .79C between these stations. Have a look”

“Funny that. I get an average difference of 0.81 and 0.9 degrees for the maximum and minimum temperatures.
BTW anybody who wants to replicate, be aware that the data for 20091010 is missing for one station.”

A difference of .1C is a very significant difference, and the data posted is only for a year, and not for the period of the early 20th century. It is necessary in science to be accurate, whether results fit your expectations or not.

As for the questions of jets and temperature sensors.
You do realise that these stations are used by weather services, don’t you? That is particularly true for the Wellington airport AWS which belongs to the MetService (not NIWA). These weather services depend on high quality measurements to create their forecasts and thus revenue. They would know if jets (or air conditioning exhausts) would screw up the signal. But still they might also be as incompetent as these climate scientists, so why don’t you write to the Metservice (www.metservice.com) and ask them for the AWS data from the airport and do the study yourself?

No claim was made about Wellington airport, but you are correct that airports need accurate temperatures as they exist in the immediate location of take-off and landing. Of course, a warmer-than-actual reading will err on the safe side.
The AWS operators in this case may indeed know jets “screw up the signal”, but that does not make them incompetent. Using the data for other purposes might, depending on the purpose.

An interesting bit of information, a site that apparently monitors and analyzes some of NZ weather stations claims there is a .4F difference between two stations at the airport, .3 miles apart:

“It isn’t an “alleged” station,”
So show me where it says what the thingummy on the roof measures.

“the picture *is* a good example of incompetence. ”
To claim this you’ll have to know what it actually is that one sees on the image and what it is used for.

“A difference of .1C is a very significant difference, ”
Depends on what you want to show.
You were saying that the difference is “nowhere near 0.79”, which is rather wrong.
Besides, you are looking at daily maximum and minimum values. So to be able to actually compare the values from two different stations you have to assume that they are pretty similar (i.e. have their min. and max. temperatures at the same time).
Given the short time span and the fact that you are looking at min/max temperatures, I would say that the data fits quite well.

“and the data posted is only for a year, and not for the period of the early 20th century.”
The data you posted is for a month and not a year.

“airports needing accurate temperatures”
I said that the weather services need accurate measurements.

“Frank, please provide me with evidence (not claim) of which stations NIWA labeled “Airport”:”
I don’t know which of the airport stations NIWA has used in their analysis, but here’s their list of Wellington airport stations:
(search for wellington on http://cliflo.niwa.co.nz/pls/niwp/wstn.get_stn_nodt)

“It isn’t an “alleged” station,”
So show me where it says what the thingummy on the roof measures.

Reply: I already provided that information once to you in Glenn (17:24:08)

“the picture *is* a good example of incompetence. ”
To claim this you’ll have to know what it actually is that one sees on the image and what it is used for.

Reply: Yes, and Anthony explained what it actually is in Nick Stokes (14:45:40)

“A difference of .1C is a very significant difference, ”
Depends on what you want to show.
You were saying that the difference is “nowhere near 0.79”, which is rather wrong.
Besides, you are looking at daily maximum and minimum values. So to be able to actually compare the values from two different stations you have to assume that they are pretty similar (i.e. have their min. and max. temperatures at the same time).
Given the short time span and the fact that you are looking at min/max temperatures, I would say that the data fits quite well.

Reply: How can you say “nowhere near” is “rather” wrong? In that short time, by your own figures, it is off by a very significant amount in terms of tenths of a degree, which is the scale of temperature trends. Unlike you, I assume little.

“and the data posted is only for a year, and not for the period of the early 20th century.”
The data you posted is for a month and not a year.

Reply: yes, I took a year from the db but only posted a month. Shoot me. Funny you didn’t reply to the fact that the period was not for the early 20th century.

“airports needing accurate temperatures”
I said that the weather services need accurate measurements.

Reply: Well goody for you.

“Frank, please provide me with evidence (not claim) of which stations NIWA labeled “Airport”:”
I don’t know which of the airport stations NIWA has used in their analysis

You didn’t have anything to say in this post, you made no real arguments, you avoided and snipped important points. That’s “unequivocal”.

NIWA MUST REVEAL EXACTLY HOW THEY GET A WARMING OF 1.9 C FROM THE RAW DATA THAT SHOWS NO WARMING. IT IS NOT ENOUGH TO GIVE THE WELLINGTON EXAMPLE. THE WELLINGTON EXAMPLE GIVEN BY NIWA SUFFERS FROM THE FLAW I HAVE POINTED OUT.

1. It was noticed in the raw data, which was available, that in 150 years there was NO warming whatsoever.

2. The ADJUSTED temperatures show a huge warming of 1.9 C over this period.

3. None of the details of this adjustment is available.

4. But when challenged – NIWA puts up a reply saying

a) We have already explained this to you – irrelevant – please send the details again, we don’t have them.

b) Look, in Wellington we had to adjust upwards to the tune of 0.79 C (some merit in this, but flawed, as the station used today as a proxy for the one discarded in the 1920s should be a lot warmer – why? because temperatures go up due to the heat island effect, and not only when you come down from altitude). Thus the final adjustment will probably be less than 0.79 C.

5. You have made an average adjustment of 1.9 C over all the stations. You have pointed out one station where you adjusted 0.79 C. But for the average to come up to 1.9 C the adjustments at the remaining stations have to be much larger.

This raises doubts. They didn’t choose a typical station where they had to adjust up by say 2.5 C, but one where the adjustment was less than half the average. Were all the stations adjusted upwards? None downwards?
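The arithmetic in point 5 can be written out explicitly. The 1.9 C and 0.79 C figures are the ones claimed above; the per-station breakdown for the other six stations has not been published, so this only shows what their mean adjustment would have to be:

```python
# If the seven-station mean adjustment is 1.9 C and Wellington's is
# 0.79 C, the remaining six must average considerably more. Figures
# are as claimed in the discussion above; the six-station breakdown
# is not public.

n_stations = 7
mean_adjustment = 1.9        # C, claimed average over all stations
wellington = 0.79            # C, the one published example

others_mean = (n_stations * mean_adjustment - wellington) / (n_stations - 1)
# roughly 2.09 C each, on average, for the six undisclosed stations
```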

The bottom line:

NIWA must reveal the adjustments made to the temperature readings at all the stations that they have done so, the raw data involved in the making of these adjustments and the reasoning and logic behind each of these adjustments.

Instead of waffling about inane side issues, they should come up with a good reason why they shouldn’t do so, to remove our legitimate doubts.

“Dr Jim Salinger has identified from the NIWA climate archive a set of 11 stations with long records where there have been no significant site changes. When the annual temperatures from all of these sites are averaged to form a temperature series for New Zealand, the best-fit linear trend is a warming of 1°C from 1931 to 2008. We will be placing more information about this on the web later this week.”

Question: How come the complete raw data shows NO temperature rise if “measurements from climate stations which have never been shifted” show a rise of 1 C?

Surely these could not be all the stations?

In fact if there is no temperature rise in the raw data then there must be other stations (more than the 11 handpicked) that show a decline to balance out the rise in these stations.

Glenn (10:23:41) : … Has Salinger actually released the adjusted data used in the original reconstruction NIWA claims he holds? By released I mean made public

No I think he (NIWA) needs to do that.

There seems to be some justification in their claim that they did point out to members of the NZ Climate Science Coalition that they had adjusted some temperatures and pointed out the elevation change in Wellington in 2006.

The NZ Climate Science Coalition weakens its case by making wrong allegations in this regard (maybe bad communication among the members, the left hand not knowing what the right is doing – but whatever the cause, it appears a bit shoddy).

What NIWA needs to do now is post exactly how they have arrived at their graph. (Make that public, just like their raw data). The complete adjustments, methodology, reasons and logic for anyone to analyse.

Then if anyone can point out any legitimate errors or concerns, these should be addressed by NIWA.

If not we will accept NIWA as being correct, but not until we have examined the adjustments.

NIWA owes proof that their work is correct to those who pay for it, once it has been questioned. They have given a case for Wellington, but only Wellington. And Wellington was not selected by an independent party. They picked it.

No reasonable person would accept their work based only on data for one station. They need to provide their explanation of their adjustments for the other 10 stations. Any reasonable person would be suspicious that they have not done so after several days. After much longer, any reasonable person would assume that they are lying.

As a skeptic of AGW alarmism, I find the original attack on NIWA’s adjustment of Wellington data, and this comment section, very discouraging. I feel a lot better when those on my side use valid criticism, especially when alleging intentional scientific misconduct.

The adjustments to the Wellington data are in fact exactly what I would do if the only information I had were the coordinates and the altitude. The adjustment uses the dry adiabatic lapse rate of the atmosphere, which is a pretty good estimate for height differences. That lapse rate is used daily by meteorologists all over the world, and is a strongly established atmospheric constant (it’s a direct result of simple thermodynamics).
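The claim that the dry adiabatic rate follows from simple thermodynamics can be checked in two lines: it is just Γ_d = g/c_p. A quick sketch, using standard textbook values for g and the specific heat of dry air:

```python
# The dry adiabatic lapse rate follows from hydrostatic balance and the
# first law of thermodynamics: Gamma_d = g / c_p.
g = 9.81      # m/s^2, gravitational acceleration (standard value)
c_p = 1004.0  # J/(kg K), specific heat of dry air at constant pressure

gamma_d = g / c_p * 1000  # convert K/m to K per km
print(round(gamma_d, 2))  # ~9.77 K/km, conventionally quoted as 9.8
```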

While all of the other comments about questionable sites, etc, are valid, the initial attack on the adjustment as being somehow evidence of improper behavior is simply and clearly wrong. The best that can be said is that a better time series could perhaps be achieved by considering other factors, such as local effects.

Glenn makes a valid point (which I haven’t verified, but would be interested in) in claiming to find stations which are close together but differ in altitude where the lapse rate does not explain the variations, but that wasn’t part of the original attack. If that is a normal case, then the simple adiabatic adjustment has higher error bars. However, if the average of the errors is small (the errors have different signs, among other things), then the impact on the larger issues of wide area temperature measurement may be small (the errors would integrate out). Obviously, those making political recommendations based on this data must shoulder a huge burden of proof, which they have failed to do.

I think a more important line of attack in general, rather than questioning the motives of the NIWA scientists, is to challenge the use of surface temperature data in general as a proxy for the warming of the earth. The valuable volunteer “open science” project to analyze the official sites in various countries is helpful in this regard. However, even then, too many people, including skeptics, are clinging to the idea that the Earth’s surface temperature is the right thing to measure. A far better proxy is ocean energy storage, which we are just starting to measure.

The dry adiabatic lapse rate (9.8C/km) is not the lapse rate of dry air – it is the lapse rate of unsaturated air. It might better just be called the adiabatic lapse rate. There is another lapse rate – the wet pseudo-adiabatic lapse rate.

The difference is this:

The (dry) adiabatic lapse rate is that which you would get by lifting a parcel of air where no condensation occurs. The wet (saturated) lapse rate (~5C/km) is what you get when lifting a parcel of air during condensation. It has “pseudo” attached to it because lifting air which is condensing is not adiabatic – energy leaves the system in terms of the latent heat of condensation.

The dry adiabatic lapse rate is appropriate under the following conditions:

1) no condensation (by definition)

2) the air is adequately mixed (not stratified, no thermal inversion, no differential movement (advection) of air into the column at different altitudes in the area of interest).

Needless to say, these criteria may or may not be met in actual meteorological conditions, and criticism of the adjustment may be appropriate based on those issues (and others – such as the fact that the air is measured 2 m above the ground and thus is very sensitive to micro-climate issues such as evapotranspiration, nearby asphalt, etc, or even the intermittent presence of condensation).

John Moore (13:12:11) :
“Glenn makes a valid point (which I haven’t verified, but would be interested in) in claiming to find stations which are close together but differ in altitude where the lapse rate does not explain the variations, but that wasn’t part of the original attack.”

It was included in Anthony’s post. I disagree with your regard for adjusting station temps using lapse rates. If there were no change in the weather, an adiabatic rule could be used to adjust temps to different altitudes. But then whether the weather has changed is what we’re trying to find out. I haven’t checked the actual temp data out, but here’s a post from the more recent WUWT thread:

“Richard (22:32:56) :
… the CliFlo database reveals that Hokitika seems to be made up of Hokitika South from 1866 to 1965, followed by Hokitika Aero until the present. There was a decent 14- month overlap during the closing of South and the opening of Aero.
..
The interesting thing to note is that South is 0.3ºC cooler than Aero. Yet South is at the lower altitude of 4 m and Aero is higher, at 39 m. According to NIWA’s Wellington Thorndon explanation, the higher station should be cooler, based on the expected environmental lapse rate.”

You can look at station details, open this and click “get station list”

Glenn, that is an interesting anomaly. As I said, I haven’t studied the data in general, just the specific critique. There is a reason to raise the temperatures from a higher altitude, absent contrary information. Put another way… if you do that on the average for a lot of stations, you will get better information (by far) than just joining the data without adjustment, and somewhat better information (i.e. >0) than throwing away the earlier data altogether.
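The Hokitika figures quoted above are easy to check against the lapse-rate expectation. A quick sketch (altitudes and the 0.3°C overlap figure are from the quoted comment; the 6.5°C/km average environmental rate is an assumption):

```python
# Hokitika check: South at 4 m, Aero at 39 m. With the average
# environmental lapse rate, Aero should be ~0.23 C cooler than South,
# yet the reported 14-month overlap shows the opposite sign.
lapse = 6.5 / 1000.0   # degC per metre (assumed average environmental rate)
dz = 39 - 4            # metres of altitude difference

expected = -lapse * dz  # expected (Aero minus South) temperature difference
observed = +0.3         # Aero reportedly ran 0.3 C *warmer* than South
print(round(expected, 2), observed)
```

The sign reversal is the anomaly under discussion: local effects evidently dominated the small altitude difference at this site.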

Nick,
The lapse rate is the same all the way from dry to saturated. Only at saturation does it change, and then the new lapse rate is the same until the air mass becomes unsaturated. So it isn’t a little of this and a little of that. I am not saying (see above) that the lapse rate gives you perfect information, just that it is better than doing nothing at all.

As for homogenising different temperature series, the reason is obvious: to get more information than if you didn’t. It is normal scientific practice in many disciplines, and we do get valuable information from it. Again, that doesn’t mean that this particular homogenization was correct – just that the procedure was an appropriate first attempt (lacking other information, like data from other stations, meteorological information – say, if the second station is often above the cloud bases, etc).

“Glenn, that is an interesting anomaly. As I said, I haven’t studied the data in general, just the specific critique. There is a reason to raise the temperatures from a higher altitude, absent contrary information. Put another way… if you do that on the average for a lot of stations, you will get better information (by far) than just joining the data without adjustment, and somewhat better information (i.e. >0) than throwing away the earlier data altogether.”

There is no reason why you would get better information, nor is there any way to test the accuracy of applying an arbitrary value absent a comparable concurrent record. And I’m not sure the Hokitika example should be regarded as an anomaly. It is an example of a similar location near the ocean in the same region.
Your argument here would apply to a situation where only one station measured actual temps, and all other locations were guessed at using temp lapse rates. I hope you can see from this example why applying an arbitrary value over a larger area would only invite less reliable data. Temps are different even between stations very close together at the same altitude! The two stations at the Airport, same altitude, a third of a mile apart, are reported to average 0.4°F apart.
You may get better information than without adjusting stations, but then again science isn’t about playing horseshoes, and “close enough” doesn’t cut it. In the Wellington example, we’re looking at a few tenths of a degree in a hundred years making the difference in a trend, and that difference could be, and in this case was, spread over multiple stations. The result could very well be worse than doing nothing at all.


You get better information because, *on average* the unsaturated adiabatic lapse rate is a pretty good estimate – because the atmosphere tends to be mixed (much more so farther above ground than the 1M thermometers, unfortunately). That does not mean that, for one station, you get better information. It means that, in a large average, you should. Not correcting for altitude, on average, is guaranteed to get you worse data, which is my whole point. But caveat… on any particular case, such as you raise, it may not be good, and may in some cases even be in the wrong direction.

And science is indeed about getting better information. It is only when it is used for policy issues that we have to draw the line and say that better isn’t enough. When we are talking climate policy, the “better information” that I advocate is probably nowhere near good enough.

In fact, I’m of the opinion that surface atmospheric temperature data in general is a lousy way to establish global temperature trends. I would much rather see ocean energy measurement and satellite sounder data than the very noisy and difficult surface thermometry, and before that, dendroclimatological data.

“So if you put all the records into the pot, didn’t join them up, can you still get an average increase in temperature?”

I have no idea. Averaging all the records, without adjusting, would be a silly exercise.

The basic idea that these guys are doing is not unreasonable. It is trying to squeeze as much as they can out of poor data… and scientists have to do that all the time. Likewise, the use of the dry lapse rate is not an unreasonable approximation, if you have no better data.

This leaves two issues:
(1) Are they doing the job right?

(2) Even with it done right, how useful is the data?

The answer to 1 is: I don’t know, but I have seen a couple of suspicious examples.

The answer to 2 is: better than not doing it. Not nearly good enough, as far as I can tell, to support the AGW hypothesis.

NIWA says, rather blandly, that there is no overlap between the Thorndon data and the Kelburn data. The fact is, overlap data was collected in 1927. So why does Dr Wratt not explain what happened to the data?
As an historical aside a biography of Dr Sir Edward Kidson, director of the NZ meteorological service at the relevant period, alludes to the poor quality of records prior to his arrival from Australia in 1927 (the previous director DC Bates having been sacked). What weight should be put on any pre-1927 data if it is unreliable?

What concerns me is that different temperature records are manipulated to get a temperature record for a grid point, which is then further manipulated, and so on. Each stage moves you farther from reality.

So let’s look at some of the things that are done.

Do they really need to be done?

1. Missing data. New values are substituted.

2. Different temperature records from separate sites are merged to make one.

3. Some sites are dropped.

4. Sites are adjusted up and down. (mainly up it seems)

Of these I can see that there is an argument for 3. You could drop bad sites based on some sort of criteria. Like being urban or not conforming to the standards for measurement.

Time of observation adjustments are relevant.

Adjustments that bias a site are more difficult. I can’t see any argument for adjusting a site that meets the quality criteria. For those that don’t they need to be dropped.

I see no reason for selecting one of two or more candidate sites just because they are close, subject to the other criteria.

I think averaging a month’s worth of data into one value, and then further processing the results, is wrong too.

If we have a site that changes instruments, or moves, treat it as a second series, i.e. no joining the dots, and no distortions.

Then the deal is that you come up with a delta temperature on a day-by-day basis by averaging all included sites’ deltas.

It’s important that the delta is averaged, and not the temperature. It means you can combine different temperature records without resorting to joining the dots.
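A minimal sketch of the “average the deltas, not the temperatures” idea, with invented readings; each site contributes only its day-to-day changes, so series with different absolute levels (e.g. different altitudes) combine without any splicing:

```python
# Day-to-day deltas per site, then average the deltas across sites.
def deltas(series):
    """Successive differences of a temperature series."""
    return [b - a for a, b in zip(series, series[1:])]

site_a = [14.0, 14.2, 14.1, 14.5]   # hypothetical low-altitude record
site_b = [13.2, 13.5, 13.3, 13.6]   # hypothetical high-altitude record

mean_deltas = [(da + db) / 2 for da, db in zip(deltas(site_a), deltas(site_b))]
print([round(d, 2) for d in mean_deltas])
```

Note the ~0.8°C offset between the two sites never enters the result; only the common day-to-day movement survives.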

I think for the most part what you suggest is equivalent to what is being done.

Averaging the deltas is no different than averaging the temperature, and subtracting the mean. Where there are gaps, you also have gaps in the deltas, and you have to weight accordingly.

As for averaging to compute a grid value – that’s an attempt to account for the temperature of a volume of air (that in the grid). Then the global temp is the sum of all of those divided by the number of grids (or area in grids, or whatever).

In other words, creating grids is one way of dealing with the geographic sparsity and inhomogeneity of data. It seems like a relatively primitive approach – I’m sure there are much better ways of computing the average of a two-dimensional area with non-evenly spaced data points than just putting them in grids.
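A toy version of the gridding just described, with invented station values, showing why it matters: binning stations into cells stops a cluster of nearby stations from dominating the regional mean the way a flat average would.

```python
# Stations binned into 5-degree lat/lon cells; each cell is averaged,
# then the cell means are averaged.
stations = [  # (lat, lon, temp) -- invented values for illustration
    (-41.2, 174.8, 12.9), (-41.3, 174.7, 13.1),  # two Wellington-area stations
    (-36.8, 174.7, 15.5),                         # one Auckland-area station
]

cells = {}
for lat, lon, t in stations:
    key = (int(lat // 5), int(lon // 5))  # 5-degree grid cell index
    cells.setdefault(key, []).append(t)

cell_means = [sum(v) / len(v) for v in cells.values()]
grid_avg = sum(cell_means) / len(cell_means)
flat_avg = sum(t for _, _, t in stations) / len(stations)
print(round(grid_avg, 2), round(flat_avg, 2))  # 14.25 vs 13.83
```

The grid average weights the lone northern cell equally with the two clustered southern stations, which is the whole point of the exercise.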

My guess is the gridding is done for two reasons (and this really is a guess):

1) simplicity – especially before a lot of computing power became available

2) to feed and match models. GCMs use grids also, so having gridded temperature data lets you match them up.

As for trying to create a time series at one point, rather than using the disparate time series as they appear in the actual record… I would guess that’s also a simplicity sort of thing. Again, I’d bet a good inferential statistician would have a more general way of doing it. However, even then, you have to adjust the individual station records where possible for actual events (changing an instrument, nearby environment changes, etc), and also for longer term stuff like UHI.

So really, what they are doing is a simplistic approach to achieving both station temperature series (which are useful for local climatological data – which is used directly in weather forecasting, among other things) and global averages.

There is an advantage in the simplistic approach: it makes the operation more visible, which is useful for quality control. If you take all the series, feed them into some sophisticated statistical number cruncher, it’s hard to know whether what you got out was right or had artifacts or bugs in it.

Not being familiar with the literature, I would hope there are papers out there using more sophisticated techniques also.

“You get better information because, *on average* the unsaturated adiabatic lapse rate is a pretty good estimate – because the atmosphere tends to be mixed (much more so farther above ground than the 1M thermometers, unfortunately). That does not mean that, for one station, you get better information. It means that, in a large average, you should. Not correcting for altitude, on average, is guaranteed to get you worse data, which is my whole point.”

Lapse rate is influenced by more than one factor, including temperature itself, and in some instances does not apply or is reversed. Lapse rate does not take into account air currents, geographical influences, etc.
There is no such thing as an “average” lapse rate, but if there were it would probably be seen to change as climate changes. Average implies “some more, some less”. But even were an “average” rate applied to a “large” number of stations there is no reason to assume that data would be accurate to any degree, unless the average rate of the “large” area was already known. And that comes back to the simple example of, if an average is known, then one station could suffice for the temperature of a whole region, or the whole earth.
Not correcting for altitude guarantees the best data that can be drawn from records.
Trying to string apples and oranges together doesn’t guarantee anything but garbage.

Sorry, Glenn, but that’s just nonsense. There is such a thing as an average lapse rate. It would *not* change as climate changes, because it is an atmospheric constant that is not sensitive to CO2 changes (or much of anything else).

The average IS known. It is 9.8C/km. You don’t need to go around measuring things to know that, because it is a result of SIMPLE THERMODYNAMICS.

Hence applying that average will improve the average. Not applying it means you have one of two choices:

1) using uncorrected data, which is almost certainly wrong

2) not using the data at all, which reduces the information you have.

Again… we are talking about two things:

1) using averages to improve the signal in averages (in other words, I’m not claiming that using the average works for every case)

2) the thermodynamically derived constant which is the unsaturated adiabatic lapse rate of lifted air parcels.

“Sorry, Glenn, but that’s just nonsense. There is such a thing as an average lapse rate. It would *not* change as climate changes, because it is an atmospheric constant that is not sensitive to CO2 changes (or much of anything else).
The average IS known. It is 9.8C/km. You don’t need to go around measuring things to know that, because it is a result of SIMPLE THERMODYNAMICS.”

The average adiabatic lapse rate is NOT the dry lapse rate, and varies considerably. The dry lapse rate is about 10C/1000m, the wet lapse rate is about 5C/1000m. The average lapse rate is 6.5C/1000m.

But this average is not an average lapse rate of the entire atmosphere of the earth, or of any area in any given period of time, say New Zealand 1912 to 1928. This is what I meant by there being NO “average” lapse rate and that actual rates change with climatic changes.

The claim that an average lapse rate is the dry lapse rate, and that either rate isn’t “sensitive to” other factors, indicates a profound misunderstanding of the subject.
These terms “simple thermodynamic” and “atmospheric constant” you use actually reinforce that inference. There are MANY factors, variables if you wish, that determine and affect actual lapse rates. These factors are the descriptors of a specific climate, or weather conditions. And the climate changes.

Thorndon was lowered by 0.79°C, a value that corresponds to an average lapse rate of 6.5°C/1000 m over the 121 m difference. Using the different rates:
Dry: 9.8°C/1000 m = 1.19°C / 121 m
Average: 6.5°C/1000 m = 0.79°C / 121 m
Wet: 5°C/1000 m = 0.61°C / 121 m
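The three candidate adjustments are straightforward to compute; a quick sketch using the 121 m altitude difference discussed above:

```python
# Candidate adjustments for a 121 m altitude difference under the
# three lapse rates quoted in this thread.
dz = 121  # metres between Thorndon and Kelburn (figure used above)
rates = {"dry": 9.8, "average": 6.5, "wet": 5.0}  # degC per 1000 m

adjustments = {name: r * dz / 1000 for name, r in rates.items()}
print(round(adjustments["average"], 2))  # 0.79 -- the value NIWA applied
```

Only the average (ICAO-style) rate reproduces the 0.79°C actually used, which is the observation driving this part of the argument.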

I believe WIki is correct:
“As an average, the International Civil Aviation Organization (ICAO) defines an international standard atmosphere (ISA) with a temperature lapse rate of 6.49 K(°C)/1,000 m”
“The standard atmosphere contains no moisture.”

No moisture? Of course it’s a wrong assumption, and the problem is that the assumption has to be based on the location concerned.

You would have to adjust on a day by day basis using the observed humidity to adjust for the lapse rate.

However they haven’t done that.

However, I still think it’s the wrong approach. I see no need to stitch up temperature records from different sites and different equipment to make one temperature record for a particular site (process and adjust the data), and then use this record and process it further.

I haven’t argued for stitching these records together as they have. The need to determine a trend is legitimate, but that must not affect scientific discipline. I disagree completely with what John has said so far, but I don’t think the concept of adjusting temp by altitude is completely invalid. Likely not very accurate, and to be taken with a grain of salt when the adjustment is equal or close to the increasing trend it produced.
NIWA could have looked at rainfall, sunshine data that existed concurrently with temp data and interpolated a difference in elevation. What strikes me though is the number they used (.79C for 121 meters, EXACTLY the ICAO rate), which leads me to suspect they “didn’t need no stinkin’ rainfall records”.

ie. Start with the simple cases and make them more complex and ask yourself, what’s a reasonable and justified approach to working out a trend?

There are lots of ways to make a trend. One does not have to stitch the stations together. However, one does have to adjust for differences in station conditions, except for where there is perfect overlap of time and nothing has changed.

One of those conditions, in the case of interest, where they don’t overlap, is altitude. So the answer to:

“Can you do this without a lapse rate?”

Is: no, not in this case, unless you have better information from some other source.

Take two stations A and B. A is near B. A is lower than B. We have an old record for A, we have a newer record for B. That’s the example in question.

What you are saying is that even though we don’t have a recent record for A, we can reconstruct it by subtracting the appropriate temperature from the record for B. (Standard environmental lapse rate * altitude) You could of course do it the other way round.
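The reconstruction being described takes only a few lines; the lapse rate, altitude difference, and readings below are all assumptions for illustration:

```python
# Extend station A's record using nearby, higher station B, warmed by
# (lapse rate x altitude difference) -- the adjustment under dispute.
lapse = 6.5 / 1000.0   # degC per metre, assumed average environmental rate
dz = 121               # B is 121 m higher than A (hypothetical)

b_record = [12.1, 12.3, 12.0]           # hypothetical annual means at B
a_reconstructed = [t + lapse * dz for t in b_record]
print([round(t, 2) for t in a_reconstructed])
```

Every objection in the thread amounts to questioning the single constant `lapse`: it is fixed here, while the real environmental lapse rate varies with humidity and weather.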

1. It’s wrong because lapse rate depends on humidity. The 0.79C is for completely dry air.

2. The lapse rate isn’t constant, its a variable since it depends on humidity and that varies over time.

3. It’s the wrong question. You’re saying we have to answer the question, what was the absolute temperature in the past. Why not just ask the question, has it warmed?

So take two sites.

A with a record from 1900 to 1960

B with a record from 1950 to 2000.

If A warms 1900 to 1960, and B warms 1950 to 2000, we can safely assume that 1900 to 2000 things have warmed. Note that there is no need what so ever to assume lapse rates.
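This no-splicing approach can be sketched directly: fit each station's own trend over its own years, and the lapse rate never enters. All records below are invented:

```python
# Least-squares slope per station; series are never joined, so no
# altitude adjustment is required.
def trend(years, temps):
    """Least-squares slope in degC per year."""
    n = len(years)
    my, mt = sum(years) / n, sum(temps) / n
    num = sum((y - my) * (t - mt) for y, t in zip(years, temps))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical records: A (low site, 1900-1960), B (high site, 1950-2000)
a_years, a_temps = [1900, 1930, 1960], [12.0, 12.2, 12.4]
b_years, b_temps = [1950, 1975, 2000], [11.3, 11.5, 11.7]

print(trend(a_years, a_temps), trend(b_years, b_temps))
# Both slopes positive -> warming over 1900-2000, no lapse rate assumed.
```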

The unsaturated adiabatic lapse rate is correct under the conditions of perfect mixing – i.e. it is the lapse rate you get from non-condensing convection. A real atmosphere has a variable lapse rate that is usually somewhat less than the dry adiabat trajectory. This is because the temperature is not determined solely by convective processes. See below. I goofed on which lapse rate to use, but not on the overall question (at least by this criticism).

————-
I wrote “The average IS known. It is 9.8C/km. You don’t need to go around measuring things to know that, because it is a result of SIMPLE THERMODYNAMICS.”

Glenn wrote:

“The average adiabatic lapse rate is NOT the dry lapse rate, and varies considerably. The dry lapse rate is about 10C/1000m, the wet lapse rate is about 5C/1000m. The average lapse rate is 6.5C/1000m.”

Glenn is correct. The *dry* lapse rate (9.8C) is what you get from purely convective processes – I had confused the average with the thermodynamic ideal, because I use the thermodynamic lapse rate more often – for computing convective instability, where it is appropriate.

I would again re-iterate, however, that the dry lapse rate is NOT dependent on the humidity. The reason it was incorrect to use is not because the air is not dry, but because processes other than pure non-condensing lifting actually go into making up an atmospheric column’s temperature trajectory.

Notice, however, that the average lapse rate gives you approximately their adjustment, which is the point. In the absence of better information (comparable stations, specific weather information such as which days had cloud bases below 135 metres but above sea level, etc), adjusting by an average lapse rate is a reasonable thing to do.

Nick,
Adiabatic applies to the air which rises (through convective processes) to the level of the station. This sets the environmental temperature at that level in the general area. If the air at the station “ignores” that, it will rise or sink until the air at that station is at the appropriate temperature. The fact that the air right at the station is in contact with the ground would only be the determining factor if that air was isolated from all the air around it. The real atmosphere doesn’t work that way.

Now, as was properly pointed out above, the dry adiabatic is not the correct lapse rate to use (contrary to my original assertion), because the environmental lapse rate is normally less than that. The correct lapse rate to use is the exact environmental lapse rate at the point at that time, which, of course, we don’t actually know (and if we narrow it to that point, is irrelevant anyway because of the ground contact).

However, if we want to make a time series from two nearby stations which are not at the same altitude, we have to do something to adjust for the fact that, normally, the higher station will be cooler. So what can we do other than make some estimate of the lapse rate?

There are a number of answers to this:

1) don’t join the two series. Just use the differences within the individual series. That, however, leaves a gap around the time when the two stations didn’t have concurrent records. For global temperatures, that might still be okay, though, since not all stations would have gaps at the same time.

2) use the earth’s average environmental lapse rate

3) try to get an even better estimate of an average lapse rate

4) use other stations that overlap the gap and thus give us a clue about the temperature trends at the time of the gap. This is part of the approach used by the GHCN.
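Option 4 can be sketched with a hypothetical neighbouring station C whose record spans the gap; all numbers below are invented:

```python
# Use neighbour C, which overlaps both eras, to estimate the offset
# between old station A and new station B.
a_vs_c = [0.5, 0.6, 0.5]      # A minus C, during A's era (hypothetical)
b_vs_c = [-0.3, -0.2, -0.3]   # B minus C, during B's era (hypothetical)

# A ran ~0.53 above C; B runs ~0.27 below C. Splice B onto A by adding
# the difference of the two mean offsets to B's record.
offset = sum(a_vs_c) / len(a_vs_c) - sum(b_vs_c) / len(b_vs_c)
print(round(offset, 2))  # add this to B's readings to continue A's series
```

The attraction of this approach is that the offset is estimated from actual measurements rather than from an assumed lapse rate.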

I am well aware of katabatic and anabatic processes. Katabatic winds are rather well known if you study meteorology in the US intermountain region where I live. My home is about 400ft above the NWS station in the Phoenix area, in steep hills, and I have felt the katabatic winds in the early evening (it’s really pretty neat – it blows down this canyon near my house, crossing the road in a stream only about 3 meters across, at about 2 m/s). My home is sometimes warmer than the NWS station in the evenings, and cooler during the day.

There are other effects also. I have never said that adjusting station data is either easy or exact. Rather, I said that using a lapse rate in this case is better than doing nothing at all, and more importantly, is certainly not a sign of bad faith data meddling.

“A constant lapse rate leads to an over estimation of the temperature record.”

I have no idea what you mean by that. Do you mean that 6.5C gives an overestimation? Or 10C does? or 1C does? Will it do this for all stations?

Here is my challenge to you:

What would YOU do with the temperature record, given two stations, separated by about 130 meters in altitude, near each other, with no other information, and with an adjoining but not overlapping temperature time series?

“Notice, however, that the average lapse rate gives you approximately their adjustment, however, which is the point. In the absence of better information (comparable stations, specific weather information such as which days had clouds bases below 135 meters but above sea level, etc), adjusting by an average lapse rate is a reasonable thing to do.”

Why is that a point? What evidence is there that the difference between Kelburn and the Airport is a result of adiabatic lapse rate? I’ll remind you of stations nearby at the same altitude with substantial (relative to the trend) average temperature differences, and that stations close by at different altitudes often have temperatures reversed.

I don’t see anything reasonable about blindly adjusting a hundred year old station without taking into consideration at least some of the “etceteras”, or doing so at all if not. And I fail to see why you don’t understand that changes in weather can affect local temperatures and behaviors.

“What would YOU do with the temperature record, given two stations, separated by about 130 meters in altitude, near each other, with no other information, and with an adjoining but not overlapping temperature time series?”

Wanting something doesn’t justify the use of any trick when you have no information. What would you do if all you had is that the one station opened the same day the other closed, and the temp was .2C different?

At the risk of being repetitive, this is not correct. There was overlap – data was collected from Kelburn in 1927. I have no idea what happened to the data, but the commentary from NIWA on this point is deceptive. Is it really likely that it was lost?

“I don’t see anything reasonable about blindly adjusting a hundred year old station without taking into consideration at least some of the “etceteras”, or doing so at all if not.”

If you know the etceteras, you take them into account. I don’t know any etceteras other than the altitude change. If there are others, bring them out and we can add them to the discussion.

“And I fail to see why you don’t understand that changes in weather can affect local temperatures and behaviors.”

As I have stated over and over again, lapse rate is not the only difference – I even gave an example from my own house of how station data is affected by other factors. Hence I can only conclude that you are not paying attention to what I have been saying.

Glenn… you didn’t answer my question, but rather responded with one of your own.

As for the .2C delta-T – they weren’t on the same day as far as I can tell, so I don’t know how we could use that information. Do you?

As for some trick “when you got no information” – they did have information. They had the altitude difference. That’s not very good information, but it’s not zero information and it is definitely relevant. When you use crappy data, you have to be satisfied with “not very good.”

Notice I am not defending the use of this sort of data for making extravagant global warming claims.

So let me be absolutely clear: I think surface temperature data is noisy, messy stuff. It isn’t very good data. This means that the massaged result isn’t very good either, although aggregates can be better than individual station records for statistical reasons.

It does not mean that an attempt to join two station records is some sort of scientific misconduct or fraud when they use a reasonable temperature adjustment for the differences in altitude – even given everything we know about how that adjustment may be wrong. Nothing I have seen on this thread is evidence of misconduct or fraud – unless I am missing some critical point – which is always possible given limited time and the amount of information flying around.

I am a skeptic of AGW. I just don’t believe in crying wolf when we are really seeing a puppy! It’s more like the behavior of the alarmists.

“As for the .2C delta-T – they weren’t on the same day as far as I can tell, so I don’t know how we could use that information. Do you?”

Well I don’t know about you but if the thermometers were .2C different tomorrow from what it was today, and some guy came along and said no, because of the average lapse rate, the thermometers were really .6C different (and in a different direction), I’d be wondering whether to trust the thermometers or the guy.

Had temps overlapped by one day, what would you think could be done with that?
A week? A month? Temps do not show a divergence of .8C for 4 years, 2 either way, according to either the adjusted or unadjusted graph.

I think we are talking past each other. I have no information that contains an overlap (see: Bob (22:27:21) : ).

Nick:
My question was: ““What would YOU do with the temperature record, given two stations, separated by about 130 meters in altitude, near each other, with no other information, and with an adjoining but not overlapping temperature time series?” Hence your response is not to my question. I have no difficulty with your method for the case that you assert – it is obviously the reasonable way to do things.

However, with no overlap data between Thorndon and Kelburn, we can’t use it.

Glenn, you write “if the thermometers were .2C different tomorrow from what it was today, and some guy came along and said no, because of the average lapse rate, the thermometers were really .6C different” – yes, if that were the case. But we are dealing with a .2C difference between two thermometers, at different locations, on different days. I don’t see any way to use that information.

—

All that being said, I find the statement “Warming over New Zealand through the past century is unequivocal.” to contain way too much certainty and they shouldn’t have said it.


But you do see a way to use a lapse rate that doesn’t work even with some known stations in the area? I’m bafflegasted. Yes, my example has two thermometers at different locations and altitude, with temperatures compared from Dec31 and Jan1. If the lapse rate held water, these two should be separated by .8C instead of .2C. And despite the fact that they are different days, as I said, the actual temp data of the two stations did not diverge by .8C for four years, two earlier years of seasons and days, and two later years of seasons and days.
This does not occur on the graph between Kelburn and the “Airport”, and at no point are they different by only .2C. The closest is 1970, when they were about .4C apart. But look at the adjusted graph; the end of Thorndon and start of Kelburn are .6C apart, much less aligned than in 35 years between Kelburn and Airport.

First:
“I’m bafflegasted. Yes, my example has two thermometers at different locations and altitude, with temperatures compared from Dec31 and Jan1. If the lapse rate held water, these two should be separated by .8C instead of .2C.”

How do you arrive at that? Without using additional information, you can’t say with accuracy of even a few degrees about how the temperature on Dec31 and the temp on Jan 1 are related. What am I missing?

Nick writes:

“Instead, keep it very simple. Go and look directly at the change and ignore the absolute temperatures. Use as many sites that meet a quality threshold. If desperate, introduce a proper UHI adjustment.”

If we are only talking about Wellington, which is the subject of the (IMHO) improper allegation of misconduct, then my #1 argues against “look directly at the change.” If we are talking about lots of stations, then I agree that using deltas is fine (and probably more reasonable), for aggregates, than trying to construct continuous station time series – ASSUMING that nothing changes in any series over which you take the aggregate, such as time of observation, etc. That is a huge assumption, as the discussion on Darwin Zero demonstrates. However, if we are talking aggregates, then the assertion of misconduct in Wellington fails, because obviously Wellington is an example of constructing a single time series, and as far as I can tell, a quite reasonable one.

” the actual temp data of the two stations did not diverge by .8C for four years, two earlier years of seasons and days, and two later years of seasons and days.”

If the two stations have no overlap, how can you say this?

“This does not occur on the graph between Kelburn and the “Airport”, and at no point are they different by only .2C. The closest is 1970, when they were about .4C apart. But look at the adjusted graph; the end of Thorndon and start of Kelburn are .6C apart, much less aligned than in 35 years between Kelburn and Airport.”

We were discussing Thorndon and Kelburn in the lapse rate discussion, not Kelburn and airport.

Let’s step back for a moment and look at their explanation:

Kelburn is 120 m higher than Airport, and, no surprise, is about .8C cooler during the time their records overlap. This .8C is, by definition, the environmental lapse rate between those two sites, and is consistent with the earth’s average environmental lapse rate of 6.8C per 1000 m (which gives a .816C delta over 120 m). Hence we have a measurement of the lapse rate, and a reasonableness check on it (the lapse rate calculation).
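For what it’s worth, the arithmetic in that reasonableness check is easy to reproduce. A minimal sketch (the function name is mine; the 6.8 and 6.49 C per 1000 m figures are the lapse rates quoted in this thread, not authoritative values):

```python
# Arithmetic behind the reasonableness check above. The function name is
# mine; 6.8 and 6.49 C per 1000 m are the lapse-rate figures quoted in
# this thread, not authoritative values.

def lapse_delta(height_m, lapse_c_per_km):
    """Expected temperature difference (C) across a height difference (m)."""
    return height_m / 1000.0 * lapse_c_per_km

print(round(lapse_delta(120, 6.8), 3))   # 0.816, the delta quoted above
print(round(lapse_delta(121, 6.49), 3))  # 0.785, close to the .79C figure
```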

Now, we go to Thorndon. Thorndon does not overlap either time series in the data provided. So what are we to do with it? We can’t just toss it in unadjusted as the front of the time series – that would bias the trend downward when the Kelburn vs Airport delta, and the lapse rate delta, both argue for decreasing the Thorndon temp by around .8C. And voila – that’s what the NIWA did. So what in the world is the crime?

If you didn’t have Airport, you would be stuck with two disjoint series, that are from different places and importantly, at different altitude. There are two ways to use those:

1) compute the trends separately – which gives you a trend from 1913 or so to 1927, and a separate trend from 1927 to the present. The problem with this is it doesn’t tell you ANYTHING about the trend from 1913 to the present. In this particular case, that isn’t a huge deal, since we are only losing about 14 years, but in other cases, it could be really significant.

2) combine the two records, with the adjustment. Those 14 years only count if Thorndon is adjusted; splicing it in unadjusted would add data that is pretty clearly way too warm. While we cannot know with accuracy what that adjustment should be, we have a reasonable guess, based on two factors: the Airport/Kelburn delta and the average environmental lapse rate.
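Option 2 can be sketched in a few lines. This is purely illustrative: the station values below are made up, and the -0.8C adjustment is the lapse-rate estimate discussed above, not NIWA’s actual method:

```python
# Illustrative splice of two disjoint annual series: lower the low-altitude
# record by an estimated lapse-rate delta, then merge. Station values are
# invented; -0.8C is the adjustment figure discussed in the thread.

thorndon = {1925: 13.9, 1926: 14.1, 1927: 14.0}   # low site, ends 1927
kelburn = {1928: 12.8, 1929: 12.9, 1930: 13.0}    # high site, starts 1928
adjustment = -0.8

spliced = {year: temp + adjustment for year, temp in thorndon.items()}
spliced.update(kelburn)   # no overlapping years, so nothing collides

for year in sorted(spliced):
    print(year, round(spliced[year], 1))
```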

Wellington’s newspaper The Dominion, 15 Dec 1927, states: “Kelburn.—‘We are now getting records from both stations,’ said Dr. Kidson (former director of the NZ meteorological service), ‘so that some idea of the difference in conditions may be ascertained.’ From the beginning of the year (1928) the station at Thorndon will be abandoned. This change is being made because the Railway Department requires the site at Thorndon.”

First:
“I’m bafflegasted. Yes, my example has two thermometers at different locations and altitude, with temperatures compared from Dec31 and Jan1. If the lapse rate held water, these two should be separated by .8C instead of .2C.”

How do you arrive at that? Without using additional information, you can’t say with accuracy of even a few degrees about how the temperature on Dec31 and the temp on Jan 1 are related. What am I missing?”

**************
Actually, with the data extracted from the unadjusted graph, we can see that Kelburn was nowhere near .8C lower than Thorndon for a period of YEARS. The chart shows averages, to answer your question. The average for two years before Thorndon closed is steady at about 13.0C. Kelburn opened at about 12.8C, and wasn’t cooler by .79C (12.2C) till over a year later.

But you didn’t comment on the argument about any point on the graph of Kelburn and Airport being always around .79C different. Why is the point where Thorndon closes and Kelburn begins not at or close to .79C?
***************

“Kelburn is 120 m higher than Airport, and, no surprise, is about .8C cooler during the time their records overlap. This .8C is, by definition, the environmental lapse rate between those two sites, and is consistent with the earth’s average environmental lapse rate of 6.8C per 1000 m (which gives a .816C delta over 120 m). Hence we have a measurement of the lapse rate, and a reasonableness check on it (the lapse rate calculation).”

**********
Why is it reasonable to assume the .79C difference in stations is a result of lapse rate?
I’d suspect that temps taken around airports would be warmer than in the middle of the Botanical Gardens. It only seems “consistent” because that is the value NIWA used, and it happened to match fairly well.

The Wiki article you quoted before shows a lapse rate of 6.49C/1000m, not 6.8C.
This works out to be almost exactly the figure needed for 121 metres: the “Airport” station was lowered by .79C, not .81C. In addition, the “Airport” record is a merge of multiple stations, the most recent of which I suspect is the one that is hotter by .4F than the other airport location.

“Kelburn opened at about 12.8C, and wasn’t cooler by .79C (12.2C) till over a year later.”

But – it *was* cooler a year later, and a lot of other years. Why wasn’t it cooler right on that day, by the lapse rate? Lots of reasons. The lapse rate is an average. Maybe that time of year there are frequently low clouds, which would reduce the lapse rate. Maybe… lots of maybes

“I’d suspect that temps taken around airports would be warmer than in the middle of the Botanical Gardens.”

That depends on lots of factors, just like everything else in this data. Airports in 1927 may have been pretty green and cool. When I was learning to fly, I used a number of grass airstrips.

Airports these days are bigger and more likely to have a lot of concrete and asphalt. So looking back at where those two series almost connect, the airport is not likely to have yet been experiencing UHI effect and most likely didn’t have asphalt runways.

Also, I was unaware that “Airport” was a bunch of stations. Does that mean one station moved around and changed, or a bunch operating at once? Normally, the “airport” temperature is that of some official gauge at an airport, not a bunch of stations, although it changes from time to time (as evidenced from the official reasons for the Darwin station adjustments).

But if you can point me to more details about Wellington AP, I’d look.

Here is my take… I wandered into this discussion after seeing some claims that the Wellington data were clear evidence of malfeasance and bad-faith adjustments. I looked at the posts, especially the graph that is so prominent, and quite the contrary, they looked pretty darned reasonable, from what was presented.

They still look reasonable. So if there’s some reason they are unreasonable, it must be in data beyond that graph and the little goodie put out by NIWA explaining it all, or we are simply in disagreement about how to interpret all this.

So… are you making your observations based on the same information I’m looking at, or is there more information you are using? If the latter, I’d love to know about it.

“But – it *was* cooler a year later, and a lot of other years. Why wasn’t it cooler right on that day, by the lapse rate? Lots of reasons. The lapse rate is an average. Maybe that time of year there are frequently low clouds, which would reduce the lapse rate. Maybe… lots of maybes”

All I can tell you is that virtually no two points between Kelburn and Airport show more than a .2C relative difference, in over 500 monthly averages. These points do not represent a single day; the chart resolution allows at best a “point” of perhaps a month’s averaged temperature. You’re hung up on a “day”; I don’t know why. I provided the specific open and close dates to show there should be no data gap.

Test it – look at the graph: there are virtually no “points” at Kelburn in one month of more than .2C difference to a “point” at Airport in the next month. This is easier to see in the adjusted graph because the two records are aligned (not that this means the adjustment is valid). Check it yourself!

Another way to look at this is to realize that a line linking the end of the adjusted record of Thorndon to the start of Kelburn would be a *vertical line*, (instantly) raising temps .6C! Keep in mind that the graph is made from an average of temps; day-to-day anomalies are not shown. The closest comparison by my reckoning is the Airport dropping about .9C in 1962, over a *year*. But it would have been .6C cooler in an instant before Kelburn opened and stayed that way long enough to show in an averaged graph? Possible, but not probable, and not seen anywhere in the graph.
A weather fluke in 1928? Possible, but not probable to affect an averaged dataset, and not seen anywhere else in the graph.
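The arithmetic behind “a one-day fluke can’t show up in an annual average” is worth spelling out. A back-of-envelope sketch with invented numbers:

```python
# Back-of-envelope check of the point above: a one-day weather fluke cannot
# produce a .6C step in an annual mean; the offset has to be sustained.
# All numbers here are invented for illustration.

baseline = [13.0] * 365            # a flat year of daily mean temps
one_fluke = baseline.copy()
one_fluke[0] += 10.0               # even a wild 10C one-day spike...

shift = sum(one_fluke) / 365 - sum(baseline) / 365
print(round(shift, 3))             # ~0.027C: invisible in a yearly average

# To move the annual mean by 0.6C, the offset must persist all year:
sustained = [t + 0.6 for t in baseline]
print(round(sum(sustained) / 365 - 13.0, 3))
```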

Also Glenn (15:28:56)
I found a website that lists two temperature station locations at the airport, the site claiming a .4F different average between the two, station WMO Id 93439 Wellington Aero Aws and WMO Id 93436 Wellington Airport.

There appears to be at least three stations involved, with data merged in the database. That surely doesn’t mean the graph was plotted from merged data, even if the only indication was the lack of coverage ’92-’95. No temp data for Thorndon is available from the NIWA website, and the first 3 years of temp data for Kelburn is missing. Apparently NIWA has access to data not in the database, despite their claim that all the data is available to anyone in the database.

In these two threads there are lots of links and information. I’ve made some minor mistakes. But ask and I’ll try to help if you want more.

“You’re hung up on a “day”; I don’t know why.” What I am hung up on is that we do not have any overlap between Thorndon and Kelburn. The “day” is the amount of gap between the two and nothing more (Dec 31-Jan 1). If we had a few years of overlap, then all of this surmising would be much less of an issue. It is because there is no overlap that the amount of adjustment becomes such an issue – the overlap would let us calibrate (as it does between Airport and Kelburn); without it, we have to go ad hoc and use something like the average lapse rate (which at least is sort-of verified by the Kelburn-Airport delta).

Okay, I think that without the monthly (or daily) data, we aren’t going to get much farther. To me, that .6C discontinuity does not look visually like a problem. Why should a .6C difference in year-to-year temperature be surprising?

Try this experiment: make the lines the same color, and draw line from the end of Thorndon to the start of Kelburn. I suspect things would not look at all shocking. If I had the data, I’d plug this stuff into Excel and sort this all out and do the graph.

I apologize for not following your analysis of “there are virtually no “points” at Kelburn in one month of more than .2C difference to a “point” at Airport in the next month.” but I can’t read the graph to a monthly level, and if those are yearly averages, we have to understand the averaging process in order to know how they handled the ends of the two time series (running averages have a problem with ends).

The lack of data that you mention at the end is annoying – it may prevent us from being able to analyze this adequately. It does make one wonder how they can draw the graphs with no missing years, when they have no Thorndon data and are missing the first 3 years of Kelburn. That would be a good question to ask them, but I’d bet that at the moment, they are being bombarded with questions about all of this.

I hope that, rather than just shut everyone out, they take whatever they have, in whatever form, and put it out on the web. Otherwise this will just keep on simmering. Ultimately, one good that will come out of this is that people with climate data will be required to make it *all* available, all the time. If nothing else, so they won’t have to respond to a blizzard of requests.

If you can find better data than that darned graph, I’d love to get it.

“Okay, I think that without the monthly (or daily) data, we aren’t going to get much farther. To me, that .6C discontinuity does not look visually like a problem. Why should a .6C difference in year-to-year temperature be surprising?”

Not year to year, if we assume there was no break in coverage. The .6C could not be a one-day anomaly: if the adjusted graph accurately depicts temps, Kelburn at open would have *instantly*, and for an extended period of time, experienced and maintained that amount of increase. It would have had to, to show up in an averaged dataset. It’s too much to expect. And there is no comparable occurrence seen at any time in the 90+ year record of Kelburn. The average drops fast a few times, over year-to-year periods, but not in one day! How the data was averaged has no bearing on this. If anything, the problem is only magnified by the scale of the graph.

Were the adjustment accurate, I’d expect to see the end and beginning of the plot lines connect, were there no gap in coverage. Can you explain why they would not?

The whole business is frustrating. In the time NIWA took to respond and accuse interested parties (no doubt similarly frustrated), they likely could have just made the actual data available online, with a simple explanation of what data was used, when and where it came from, and some methodology. Instead, they issue what really looks to me like a trick, at least with the Wellington reconstruction. The “Airport” is dropped to match Kelburn, so that it looks like an altitude adjustment is valid. They then drop Thorndon by the same amount because Kelburn and the Airport match so well with the altitude adjustment, and provide no other justification that I’ve seen. They leave a little gap of about a year between Thorndon and Kelburn, with no reason, but state the dates of close and open, which are a day, not a year, apart.
It conveniently produces a nice upward trend, though. IMO this all invites skepticism.

“Were the adjustment accurate, I’d expect to see the end and beginning of the plot lines connect, were there no gap in coverage. Can you explain why they would not?”

Yes, I can. If those are annual averages (Jan 1-Dec 31), then the graph is saying that the last year of Thorndon was .6C cooler than the first year of Kelburn, which is completely reasonable. And the graph is clearly of averages, because the curve is way too smooth for daily data.

“Instead, they issue what really looks to me like a trick, at least with the Wellington reconstruction.”

I don’t think it’s a trick. Unfortunately, because I don’t have anything better to go on than that graph and their explanation, I can’t judge it very well.

“They leave a little gap of about a year between Thorndon and Kelburn, with no reason, but state the dates of close and open, which is a day, not a year apart.”

Yeah, that makes no sense at all.

I just found their temp data site. Back after I’ve done some plotting.

“Auckland is another example where the assumption is unsupported. Albert Park is at 49 m, and Mangere is at 2 m. Theoretically, Albert Park should be colder by about 0.3ºC. The opposite is true. On average, Mangere was 0.65ºC colder than Albert Park over the 1962 to 1989 overlap period. That means the error is almost a full 1ºC!”

It looks like their graph in their explanation is based on simple Jan-Dec annual averages. Above it was mentioned that apparently more than one station was used to construct the airport data, and that appears to be true. I just used one and didn’t bother to search for the rest.

There is no 1 year gap – Thorndon 1927 mean is 14.0, while Kelburn 1928 is 12.8.

My resulting plot is at

I can’t put my data out there because of their usage agreements, but it is available from their data center at: http://cliflo.niwa.co.nz/

Your statement “All I can tell you is that virtually no two points between Kelburn and Airport show more than a .2C relative difference, in over 500 monthly averages.” appears not to apply, because we are looking at annual averages. For annual averages, the data set does contain inter-annual differences of >= .6C (for example, Kelburn 1981: 13.1, 1982: 12.4, delta: -0.7C).
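Computing inter-annual deltas like these is straightforward. A sketch, where the 1981/1982 Kelburn values are the ones quoted above and the other years are invented:

```python
# Year-to-year deltas from a series of annual mean temperatures. The
# 1981/1982 Kelburn values are the ones quoted in the thread; the 1980
# and 1983 values are invented for illustration.

annual = {1980: 12.9, 1981: 13.1, 1982: 12.4, 1983: 12.7}

years = sorted(annual)
deltas = {y: round(annual[y] - annual[y - 1], 2) for y in years[1:]}
print(deltas)  # 1982 shows the -0.7C swing mentioned above
```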

“If those are annual averages (Jan1-Dec31), then the graph is saying that the last year of Thorndon was .6C cooler than the first year of Kelburn, which is completely reasonable.”

The graph does plot yearly averages. But I believe you are looking at this wrong. If there is no gap in data coverage, then the plotted point where temps diverge from one year to the next would be at the same point where Thorndon ends and Kelburn begins, not with a year’s gap between them.
The beginning of the Kelburn line, the “point” representing averaged data of a year, represents the SAME year as the last year of Thorndon.

The .6C discrepancy is NOT from a continuous record of one averaged year being .6C less than the next year, but instead brings the accuracy of the altitude adjustment itself in question.
This is why I brought the .4C difference in 1970 of the Kelburn and Airport discrepancy to your attention, that it only happened once in 45 years.

For that reason, with the Thorndon and Kelburn adjustment, it is not reasonable to assume the adjustment is accurate or reliable.

Glenn: “The beginning of the Kelburn line, the “point” representing averaged data of a year, represents the SAME year as the last year of Thorndon.”

I disagree. The Thorndon line does not contain any data from (and hence cannot have any average over) 1928. Likewise, the Kelburn line does not contain any data from 1927. Thus the .6C delta is from 1927 to 1928. See the monthly plot I just put up (I didn’t see your post at first because I was trying to beat Excel into submission). The point that is the end of the Thorndon line is the annual average for 1927; the point that is the start of the Kelburn line is the annual average for 1928. That’s why I went and got the data myself and plotted it – to see this.

Nick – agreed – the more overlap, the better an adjustment you can make, up to a point. One day’s overlap would not tell you much – there could be, for example, wind conditions that day that totally change the difference between the two stations. Unfortunately, in this case, using the data I downloaded (I didn’t go for the daily data – I’m taking their word for it as to the start/end dates), there are zero days of overlap. Hence there is no information available from overlap if one just looks at those two stations in isolation.
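The zero-overlap point is easy to check mechanically once you have start and end dates. A sketch (the helper function is mine; the dates follow the close/open dates discussed in this thread, with an approximate 1913 start):

```python
# Checking two station records for overlap by date range. Thorndon closes
# 31 Dec 1927 and Kelburn opens 1 Jan 1928 per the thread; the 1913 start
# and 2004 end are approximate. The helper function is illustrative.

from datetime import date

def overlap_days(start_a, end_a, start_b, end_b):
    """Days shared by two inclusive date ranges (0 if they are disjoint)."""
    latest_start = max(start_a, start_b)
    earliest_end = min(end_a, end_b)
    return max(0, (earliest_end - latest_start).days + 1)

n = overlap_days(date(1913, 1, 1), date(1927, 12, 31),
                 date(1928, 1, 1), date(2004, 12, 31))
print(n)  # 0: adjoining records, but zero days of overlap to calibrate with
```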

“Your statement “All I can tell you is that virtually no two points between Kelburn and Airport show more than a .2C relative difference, in over 500 monthly averages.” appears not to apply, because we are looking at annual averages. For annual averages, the data set does contain inter-annual differences of >= .6C (for example, Kelburn 1981: 13.1, 1982: 12.4, delta: -0.7C).”

The NIWA graph is good enough to see what the max adjusted annual difference is in the record between Kelburn and Airport, as I already mentioned, about .4C around 1970, one time in 45 years. That is what I meant by “virtually” no two averaged years of the magnitude of .6C difference. I was aware that the graph was plotted with yearly averages, and apologize for trying to hammer my point down by stressing how many months are included in over 40 years. But my point still stands.

I’m surprised you managed to find, from a source on the Internet, the raw temp data for Thorndon and for Kelburn in 1928, 1929 and 1930. Perhaps I wasn’t registered with the adequate “subscription level” (argent) for the NIWA database, although I can access temp data for many stations, including Kelburn after 1930.

That’s a straw man argument. Anyone who claims altitude adjustments always work is as far out of touch as someone who denies the 1.2C/CO2-doubling value from the 1-D radiative balance model.

Of course altitude adjustments don’t always work. We don’t even need to go looking for it – we know that UHI is but one of many factors that can cause errors.

The argument is that altitude adjustments are the most reasonable thing to do with two stations of different altitudes, IN THE ABSENCE OF OTHER INFORMATION. Our discussion on Thorndon/Kelburn falls into that category. That argument also means that the use of an altitude adjustment in that circumstance is not an argument for malfeasance or even sloppiness.

“Notice in this graph, of monthly rather than yearly averages, things look a lot different and the splice still looks reasonable ( I plotted adjusted and non-adjusted)”

Nice graph. But why does this make the adjustment look any more “reasonable”? Your graph makes it impossible to see a real trend, but still reveals at least a .6C gap.

Wouldn’t the graph look much more reasonable if the end and start point of the two stations were the same, since they both represent the exact same time of measurement? Certainly this disparity does not support the accuracy of the adjustment for altitude, and isn’t that what this is all about?

Look at the slope of the line from the left series and how it matches, including the gap, the slope on the right. The .6C “gap” (which is a temperature difference between two different months) fits right in.

Going back to basics… what is it about that .6C gap that remains a problem?

…continuing… Without the .6C gap, the graph would look wrong – the temperature, which was rising at the end of the year (note the annual peak tends to be after the start of the year) would have to stop rising.

If we look at the annual averages, the only “problem” would appear to be that there was a .6C difference between the two years. That simply is not out of the ordinary.

I think we are still talking past each other somehow, but I don’t know what the missing link is. Sorry.

“Your plotted graph still shows a gap of a year between Thorndon and Kelburn.
If there is no gap in data, there would be no data missing for a year.”

You are looking at it wrong.

My plotted graph shows a disconnect because I plotted two different series, without joining them into one. One ended in 1927, so there is an end of a line there. The next started in 1928, so there is the start of a line there. The “gap” of a year is not missing data; it is a visual illusion caused by my not joining the two series with a line. If you look at the data points (marked with an X), you will see that they are all separated by the same amount.

I’ll respond to your other posts after about 90 mins… gotta go do something.

The raw data is by (free) subscription from their data server. If I didn’t put in the link above (sorry, no time even to look right now), you can find it with google. They don’t allow me to post the raw data (as far as I can tell from a quick look at their user agreement) so you have to go get it. Otherwise, I’d upload it (in a form a little easier for excel than what they provide – had to write a script to put each month on a separate line).

“That’s a straw man argument. Anyone who claims altitude adjustments always work is as far out of touch as someone who denies the 1.2C/CO2-doubling value from the 1-D radiative balance model.”

It wasn’t an argument, John. It was another example in the area.

“Of course altitude adjustments don’t always work. We don’t even need to go looking for it – we know that UHI is but one of many factors that can cause errors.”

Take, for instance, a rural location in the early 1900s compared to an airport in the late 1900s. The airport could be influenced more by UHI than Kelburn was, or Kelburn recently influenced more than Thorndon was.

“The argument is that altitude adjustments are the most reasonable thing to do with two stations of different altitudes, IN THE ABSENCE OF OTHER INFORMATION. Our discussion on Thorndon/Kelburn falls into that category. That argument also means that the use of an altitude adjustment in that circumstance is not an argument for malfeasance or even sloppiness.”

That would be in the absence of ADEQUATE information. Sticking your hand into one of many boxes that may or may not contain a rattler is not a “reasonable” decision if you have no other information giving you a reasonable assumption that there is no rattler in that particular box. I do not agree that this arbitrary use of the lapse rate is a poor candidate for accusations of scientific misconduct, given the specific methodologies and information that NIWA has decided, for one reason or another, not to make public and available. In the absence of that, it is entirely reasonable to suspect that these adjustments were nothing more than ad hoc, and that other reasonable considerations could have been applied to the various stations which would have produced a different overall trend.
Adjusting by this arbitrary amount without explanation invites and legitimizes this scorn, especially since nearby stations that do not obey the lapse rate ARE “other information”.

John: “I think we are still talking past each other somehow, but I don’t know what the missing link is.”

I think we can agree on that. Our main point of contention seems to be the validity of adjusting for altitude, and it perplexes me that you regard the data to be sloppy, but the decision to adjust for altitude based on no other information as reasonable.
I can’t set “sloppy” and “reasonable” together and make it look like a good meal.

But the only thing relevant to the gap issue is that the adjustment created it, not time or temperature, before and/or after. Were the adjusted graph just a record of one station, there would be no gap. The endpoint of Thorndon and the startpoint of Kelburn would overlap, right on top of one another.
They do not, not because of a real temperature difference, but because of the altitude adjustment itself.
This gap of a year on the NIWA graph is what interests me most about that portion of the graph. Why is it there? An artifact of a plotting program? Something someone thought the uninitiated public would be able to see, artificially separating the two station records?
I don’t have reason to mistrust your graphs and figures. But I would like to see for myself the actual data coming out of the database, and how you succeed where I have failed to do so with regard to Thorndon and the first three years of Kelburn.

Perhaps we can share how we set the database up to search? What gets me is that I get Kelburn temp data from 1930 to present. Perhaps you are using a database type I have not tried? I pull maxmintemp and rainfall.

Oh, separating data in Excel is easy, just highlight a few rows from the first column of downloaded data, go to “Data” menu and select “Text to Columns” (next, next, finish). Then you can pick and choose data in their own columns, instead of pulling it out of merged data.
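For anyone who would rather script it than click through Excel, the same column-splitting can be done in a few lines of Python. The sample rows here are invented and may not match the real CliFlo export format:

```python
# Scripted alternative to Excel's "Text to Columns" for comma-merged rows.
# The sample data is invented; a real CliFlo download may have different
# columns, so treat this as a sketch of the technique only.

import csv
import io

sample = "Kelburn,1928-01,14.2\nKelburn,1928-02,14.0\n"
rows = list(csv.reader(io.StringIO(sample)))
for station, month, mean_temp in rows:
    print(station, month, mean_temp)
```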

Glenn writes:
“The NIWA graph is good enough to see what the max adjusted annual difference is in the record between Kelburn and Airport, as I already mentioned, about .4C around 1970, one time in 45 years. That is what I meant by “virtually” no two averaged years of the magnitude of .6C difference. I was aware that the graph was plotted with yearly averages, and apologize for trying to hammer my point down by stressing how many months are included in over 40 years. But my point still stands.”

Hmmm… okay, we were talking about different things. I thought you were talking about one-year differences between data in the same series. Now I see you are comparing two series, Kelburn and Airport. I apologize for my denseness, but I still don’t know what you mean. I look at Well Aero vs Kelburn (I don’t have the other aero stations) and see, for example, a .7C delta for 1995 (13.5 vs 12.8), and it looks like there are others. I would have to do some more Excel fiddling (or better, change my script) to make it easy to compare stations on a month-by-month or year-by-year basis.

“I’m surprised you managed to find, from a source on the Internet, the raw temp data for Thorndon and for Kelburn in 1928, 1929 and 1930. Perhaps I wasn’t registered with the adequate “subscription level” (argent) for the NIWA database, although I can access temp data for many stations, including Kelburn after 1930.”

“How were you able to actually download this data?”

I don’t have the raw data. I think this is “corrected” data (i.e. for a single station, changes at that station). They also have what appears to be raw daily and hourly data, but without knowing the station history, I can’t correct it myself, so I didn’t use it.

In other words, the data I am using, which is the same data that resulted in contention about the graph (the .6C correction), is already fiddled by NIWA, and I have to simply trust those adjustments (but the scope of the discussion was the height adjustment, not the station history corrections).

I simply asked for data for Thorndon and it gave me a series ending in 1927, and for Kelburn, a single series from 1928 to 2004. Below is the metadata at the top of the data (CSV form). I assume Kelburn AWS is Kelburn with an automated station, whereas Kelburn (no suffix) is the manual station.

,Statistics codes in this query are:
,Code,Description,Units
,2,Mean Air Temperature,Celsius
,3,Mean Daily Maximum Air Temperature,Celsius
,4,Mean Daily Minimum Air Temperature,Celsius
,65,Mean Of 9am Temperature,Celsius
,Note: Statistics calculations are based on Local Time.
,Monthly extremes are recorded on the Local-Time day of the month.
,Annual extremes are recorded in the Local-Time month of the year.”

” Our main point of contention seems to be the validity of adjusting for altitude, and it perplexes me that you regard the data to be sloppy, but the decision to adjust for altitude based on no other information as reasonable.
I can’t set “sloppy” and “reasonable” together and make it look like a good meal.”

Okay, let’s use the word “noisy” rather than sloppy, because I think that’s more accurate.

If all you have is noisy data, you don’t have any choice but to use it if you want to do science about it. That’s a normal problem faced by scientists in many, many disciplines. They have noisy data, but they are still able to draw valuable insight from it. In fact, I’d say that noisy data is the norm, not the exception.

Now to using an altitude offset… as I have said before, if that’s all you’ve got, it may be better than nothing. But there are caveats associated with that statement. If you only have one case, it might be just plain wrong in that case. So while it may in general be correct, in specific it may be wrong. That means a couple of things:

1) you want to use averages – use a bunch of stations, and those errors will (hopefully) average out

2) you need to set your level of uncertainty appropriately. This is where I think the AGW alarmist paleoclimatologists are out to lunch. I’m sure they believe their conclusions, but it appears that they are way, way too assured of their results. That’s a pretty normal thing also – see the following aside.
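Point 1 can be illustrated with a toy calculation. The per-station errors below are hypothetical; the point is only that a single offset estimate can be off by several tenths of a degree while the average of many lands much closer to the true value:

```python
true_offset = 0.8  # the "real" altitude offset, for illustration

# Hypothetical station-specific estimation errors, in degrees C.
errors = [0.3, -0.2, 0.1, -0.4, 0.25, -0.15, 0.05, 0.1, -0.3, 0.2]

# Each station's individual offset estimate is the truth plus its error.
estimates = [true_offset + e for e in errors]

mean_est = sum(estimates) / len(estimates)
print(round(mean_est, 3))  # close to 0.8 despite individual errors up to 0.4
```

With symmetric errors the average converges on the truth; with a systematic bias (every station sited near tarmac, say) it would not, which is exactly the uncertainty point 2 is about.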

——-

As an aside, if you aren’t familiar with Feynman’s exposition on the errors in the Millikan oil-drop experiment, I strongly recommend it (of course, I recommend almost everything Feynman wrote, having been fortunate enough to attend some of his lectures at Hughes Malibu Research Center). See http://www.lhup.edu/~DSIMANEK/cargocul.htm and search for “Millikan” if you don’t want to read the rest of this wonderful exposition.

John: “I simply asked for data for Thorndon and it gave me a series ending in 1927, and for Kelburn, a single series from 1928 to 2004. Below is the metadata at the top of the data (CSV form).”

John: “If you only have one case, it might be just plain wrong in that case. So while it may in general be correct, in specific it may be wrong.”

I do not disagree; of course the reasonableness of averaging depends on the purpose of the application.
But doesn’t this contradict your claim that this Wellington case is a reasonable technique? How is a level of uncertainty set for a single station with data that may be way off to begin with, and can’t be compared with data from any other station?
You gotta have more than a simple claim that it is reasonable to adjust Thorndon by .8C. A 1.2C or .4C may be more accurate, but how in the dickens do we have any confidence in any?

“What format are you using for download? I don’t recognize the format you posted.”

Yes to the site. I believe I asked for comma-separated data (CSV) for Excel. Then I ran it through a script that removed the extraneous comma in three of the station names (so the columns would line up). The download got data with one year per line, with the months spread across the line and the annual average at the end. My script produced two files:

1) only annual average
2) monthly data, one month per line (this is why, on one of my graphs, the X axis looks weird – I used year*12+month-index as the value – so Excel could grok it easily).
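A minimal sketch of that reshaping step, assuming a simplified row layout (year, twelve monthly values, annual mean) rather than NIWA’s exact export format:

```python
def flatten_monthly(rows):
    """rows: [(year, [12 monthly temps], annual_mean), ...] ->
    [(year*12 + month_index, temp), ...], one entry per month.
    The year*12 + month_index value gives Excel a plottable X axis."""
    out = []
    for year, months, _annual in rows:
        for m, temp in enumerate(months):
            out.append((year * 12 + m, temp))
    return out

# Illustrative values only, not actual Kelburn readings.
rows = [(1928, [16.1, 16.4, 15.2, 13.0, 10.9, 9.0,
                8.4, 8.9, 10.3, 11.6, 13.1, 14.8], 12.3)]
flat = flatten_monthly(rows)
print(flat[0])  # (23136, 16.1)
```

Writing `flat` back out as CSV gives the one-month-per-line file; the annual-average-only file is just `(year, annual_mean)` pairs from the same rows.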

“But doesn’t this contradict your claim in this Wellington case as being a reasonable technique? How is a level of uncertainty set for a single station with data that may be way off to begin with, and can’t be compared with data from any other station?
You gotta have more than a simple claim that it is reasonable to adjust Thorndon by .8C. A 1.2C or .4C may be more accurate, but how in the dickens do we have any confidence in any?”

If your purpose is simply to construct a Wellington time series, you can’t have much confidence in the adjustment.

If your purpose is to construct a time series for Wellington, that will then be used along with a whole bunch of other stations to get an average temperature time series of an area (say, all of New Zealand), your confidence would be higher. How high depends on all sorts of stuff, and I don’t know the answer to that for New Zealand, as an example, or some averaging grid that Wellington happens to fall into.

One question that has to come up is: why try to construct individual station time series at all, rather than using the areal averages to start with? My guess is that by constructing time series they can at least visually look for outliers or bugs in the method. It also gives them something to show the public (I’ll bet they regret that now). Finally, it is how everyone tends to think about things, so they probably did it out of habit. And perhaps they did it to alarm the public, in which case shame on them, but we don’t have evidence, or even reasons for strong suspicion, of that in this particular case.

One of the criticisms of the whole approach of reconstructing global temperature from surface temperature stations is the geographic sparsity of stations used in computing area averages – especially in the southern hemisphere. Others have noted that a single station in some areas can have an extraordinarily high impact on the overall global series, using the methods of CRU, and that’s a real big problem.

Statistics codes in this query are:
Code,Description,Units
02,Mean Air Temperature,Celsius
03,Mean Daily Maximum Air Temperature,Celsius
06,Extreme Maximum Air Temperature,Celsius
43,Lowest Maximum Air Temperature,Celsius
61,Standard Deviation Of Daily Mean Temperature,Celsius
62,Lowest Daily Mean Temperature,Celsius
63,Highest Daily Mean Temperature,Celsius
65,Mean Of 9am Temperature,Celsius
76,Days With Maximum Air Temperature > 25c,Day
Note: Statistics calculations are based on Local Time.
Monthly extremes are recorded on the Local-Time day of the month.
Annual extremes are recorded in the Local-Time month of the year.

“That’s what the metadata says – see my post above. Furthermore, the data matches the graph they posted in their explanation.”

Thanks for the help. I finally got it; before, I had only searched for daily temps, and once I changed the datatype to monthly it worked.

I had no intention of questioning the data on the graph, although it may be worthwhile to try to determine what data may be pre-adjusted. But that’s another matter, my interest was to verify that data was actually available, and if there was a gap. No gap.

“If your purpose is to construct a time series for Wellington, that will then be used along with a whole bunch of other stations to get an average temperature time series of an area (say, all of New Zealand), your confidence would be higher.”

Not mine, and I don’t think that’s good science. If a value were known for the average altitude lapse rate within the coverage area of the associated station locations, I would. Then again, as I have said before, if that were known, only one station would be required for a large area – along with a whole lot of data on specific lapse rates at different locations in the area. I just don’t believe in the blind application of an “average” lapse rate without any further information specific to each location being adjusted. Now it may be that the person responsible did compare variables such as rainfall, sunshine, and humidity from the early 1900s to the late 1900s. That I could understand and have some confidence in.

Well, we will have to disagree on whether the adjustment is or is not good science. I contend that it is not necessarily bad science, and that even without additional information, on average, such adjustments improve the information.

“Well, we will have to disagree on whether the adjustment is or is not good science. I contend that it is not necessarily bad science, and that even without additional information, on average, such adjustments improve the information.”

I think you are right that we’ll have to let that “general rule” go. But then you seem to be less confident in applying such an adjustment to just one record, as is the case we are discussing here. So we may still have fuel for further discussion on such things as UHI effects on the complete Kelburn record, as well as at the airport station(s), and the confidence level of the altitude adjustment for the airport and Thorndon stations. (don’t know why NIWA wanted to bring the record up to the higher elevation, but I suppose that was to bring all the other stations in sync – another matter).

John, perhaps you could put a confidence level on the Thorndon adjustment? 8-)

Karori is 27 metres higher than Kelburn and 1.3 km away. The two stations’ records overlap from 1974 through 1979, and Karori averages 0.98C lower than Kelburn for that period (12.58 vs 11.6).

Karori is 8 km from the Airport, Kelburn is 6.3 km from the Airport.

Standard lapse rate for 27 metres is 0.18C. Karori is FAR cooler than a standard lapse rate from Kelburn would predict – by 0.80C (0.98 − 0.18)!
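For reference, the figures above follow from the standard environmental lapse rate of 6.5C per 1000 m:

```python
# Standard environmental lapse rate: 6.5C per 1000 m of altitude.
LAPSE_C_PER_M = 6.5 / 1000.0

def lapse_adjustment(height_diff_m):
    """Expected temperature drop for a given altitude gain, in C."""
    return height_diff_m * LAPSE_C_PER_M

print(round(lapse_adjustment(27), 2))        # Kelburn -> Karori: ~0.18C
print(round(lapse_adjustment(125 - 3), 2))   # Thorndon -> Kelburn: ~0.79C
```

The second number is why NIWA’s .8C adjustment looks like a straight textbook lapse-rate calculation rather than anything site-specific.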

Would it be reasonable to adjust Kelburn as well as the Airport in that time period to compensate for a non-standard lapse rate of something more than .79C, and then to sort out the differences in absolute temps?

Or does Airport just HAVE to be dropped by a certain amount so that the absolute temps match as well as possible?

Or do any effects on temp such as UHI, local conditions, etc need to be considered before altitude adjustments are made? (Karori would)

Too bad there are no overlapping stations in the nearby hills to compare with Thorndon in the early 1900s.

It is one thing to prove that the earth is warming and another to prove that CO2 is causing a runaway greenhouse effect.
Did anyone care to discount the effect (apart from the geological influences already discussed by other readers above) of the increase in the number of cars on the road, in the number of air conditioners, and in population since the 1920s? While the scientists adjusted upwards, shouldn’t they also adjust downwards to compensate for these increases, which have nothing to do with the greenhouse effect? This power consumption inevitably produces waste heat and releases CO2 into the environment, but the warmists are clouding the issue here and equating this waste heat with greenhouse warming due to CO2.
Most of the historical stations have been more and more affected by the heat-island effect as a result of urbanisation. Until full detail on the stations and the pruning method is made public, reviewed, discussed, and debated, there is very little in the way of credibility to the processed data used by the so-called mainstream climate scientists.

“Would it be reasonable to adjust Kelburn as well as the Airport in that time period to compensate for a non-standard lapse rate of something more than .79C, and then to sort out the differences in absolute temps?”

See below

“Or does Airport just HAVE to be dropped by a certain amount so that the absolute temps match as well as possible?”

No way. The drop should be based on best evidence.

“Or do any effects on temp such as UHI, local conditions, etc need to be considered before altitude adjustments are made? (Karori would)”

They should be part of the adjustment – see below.

“Too bad there are no overlapping stations in the nearby hills to compare with Thorndon in the early 1900s.”

Yes, it is.

…..as to how to do it better……

One thing to do would be to put stations in historical locations (like Thorndon) no longer present and record a few years of calibration data. That would help anywhere you didn’t have enough overlap.

Another would be to try to create a meteorological model of each station, and use it to “forecast” the past in some sense. Then one could use it to forecast an overlap period, and use that to help calibrate the differences. By “model” I mean something that accounts for things like elevation and exposure to various common weather conditions (katabatic and anabatic winds, trade winds, estimated cloud cover, etc.). For stations near the ocean, in mountainous terrain, this might help a lot.

Another would be to try to measure average lapse rate for similar station differences in the same general area, and crank that in.

One could go on…
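The “measure the local lapse rate” idea can be sketched as follows. The Kelburn–Karori figure comes from the comment above; the second pair is purely illustrative (treating the airport as roughly sea level), and the point is that locally derived rates can sit far from the standard 0.0065C/m:

```python
def local_lapse_rate(pairs):
    """pairs: [(mean_temp_diff_C, height_diff_m), ...] for overlapping
    station pairs in the same area -> average lapse rate in C per metre."""
    rates = [dt / dh for dt, dh in pairs]
    return sum(rates) / len(rates)

pairs = [
    (0.98, 27),   # Kelburn vs Karori overlap, from the comment above
    (0.60, 117),  # Kelburn vs Airport, illustrative height difference
]
rate = local_lapse_rate(pairs)
# Implied Thorndon -> Kelburn adjustment (122 m) under these sample pairs.
print(round(rate * 122, 2))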

NIWA obviously didn’t try very many of these. If I were them, I’d use the adverse publicity to get funding to do more of this sort of thing, while at the same time making everything transparent.

If we are talking about lots of stations, then I agree that using deltas is fine (and probably more reasonable), for aggregates, than trying to construct continuous station time series – ASSUMING that nothing changes in any series over which you take the aggregate – such as time of observation, etc.

Exactly. That’s why I mentioned the TOB.

What it means is that if something changes, you have to treat it as a new series.

Now you need to have a strategy that copes with

1. 1 or more series
2. What to do if there is an overlap
3. What to do if there is no overlap.

Nick – Agreed, although again you can tie the series together with an adjustment – if you can do a good job of that. That’s what they are attempting on all these station series – like Darwin, which has been much discussed.

I sort-of agree that you don’t need to stitch them together, but there is a problem there. Since the data is noisy, a trend over a short period doesn’t tell you much. So if you take all these pieces, take a trend on each, and average, I’m not sure how good that would be. After all, the alleged AGW trend is very small compared to the intra-annual variation, so only long term averages bring it out of the noise.

I’m probably not going to have time to do much more fiddling. It would be interesting if some of the statisticians who read this blog, or some of the posters, would comment on how one would calculate trends without doing this seemingly unnecessary local trend generation.

It occurred to me that two stations do run concurrently for a long period of time, and that there may be some information that could be gathered from a comparison of the two.

Kelburn data is available from 1962 through 2008 (complemented by Kelburn AWS for the last four years), and Wellington Aero (Airport) data exists from 1962 through 2008.

The average temperature difference between these locations from 1962 through 2008 is 0.6C. NIWA adjusted Thorndon by .8C, the same amount as the “Airport” location.

There is no trend in the station differences between 1962 and 2008, although recently, from 1995 through 2008, the difference has been .9C. As an aside, from 1995 through 2008, Kelburn has a .2C decreasing linear trend, while Aero has a .2C increasing trend. Both trend downwards from 2000 through 2008.
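For anyone wanting to reproduce these trend figures, here is a minimal ordinary-least-squares slope; the values below are illustrative, not the actual Kelburn or Aero series:

```python
def linear_trend(values):
    """Slope per step of an ordinary least-squares fit of value on index."""
    n = len(values)
    xbar = (n - 1) / 2.0
    ybar = sum(values) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(values))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

temps = [12.8, 12.6, 12.9, 12.7, 12.5, 12.6, 12.4]  # illustrative annual means
slope = linear_trend(temps)
# Total change over the span = slope * (number of steps).
print(round(slope * (len(temps) - 1), 2))
```

Running each station’s annual series through this gives the per-period trend figures people quote, and the difference series can be fed through it the same way to check for a trend in the station gap.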

Analyzing a comparison chart of the two station records, the long-term average can clearly be seen to be affected most by times of steep increases or decreases in temperature, which bring the stations’ temps closer together and affect the long-term average differential.

A graph of the Thorndon temps is a good example of steep increases and decreases. If differences between stations are an indication of lapse rate, using the best estimate of a real, long term local average lapse rate would necessitate adjusting Thorndon down by .6C, not .8C.

It is entirely possible, and IMO likely, that the Wellington area has not experienced an overall increasing trend of more than a couple of tenths of a degree in a century. Time-of-measurement errors likely prevail throughout the earlier records, and as more attention was paid to climate change in the mid 70s, more effort likely went into the records that were made and kept, which could easily cause an increasing divergence between the old and newer records. Local land-use changes could have an effect, as could station UHI – for example, with the two airport stations that combine in the NIWA record, the station that NIWA combines with “AERO” averages .2C higher than the other operating station, which is not used by NIWA in their graph.
If all this is taken into consideration, along with an assumption that Kelburn itself after 2000 read a couple of tenths of a degree higher due to siting or UHI problems, the Wellington trend is flat between 1913 and 2009.

“One thing to do would be to put stations in historical locations (like Thorndon) no longer present and record a few years of calibration data. That would help anywhere you didn’t have enough overlap.”

Time travel hasn’t been developed to the degree necessary. (Sorry, couldn’t resist)
That would not provide you with overlap. Lapse rates change with the weather. You’d be calibrating Thorndon with 2008 weather.

“Another would be to try to measure average lapse rate for similar station differences in the same general area, and crank that in.”

Yes, but we don’t have the average lapse rate for Thorndon, nor any stations that overlap.

“Another would be to try to create a meteorological model of each station, and use it to “forecast” the past in some sense.”

Perhaps, but the old records are scarce enough that there is not much reason to expect that in the late 1800s and early 1900s detailed enough daily records would have been kept of the variables you mention. And even now, models have a hard time forecasting precise temperatures to tenths of a degree. It’s one reason we use thermometers.

“NIWA obviously didn’t try very many of these. If I were them, I’d use the adverse publicity to get funding to do more of this sort of thing, while at the same time making everything transparent.”

That isn’t obvious. The guy who started this survey has published work. NIWA claims to have a pamphlet in their library on the methodology used as well. I’ve read a little – about studying old sea charts and wind patterns, pictures, tea leaves and such.
But when it comes right down to it, a massive amount of data and analysis could have been attempted; how much was actually considered is a different matter, and whether there is information anywhere that could support a serious scientific study may be elusive or nonexistent. It could have been not much more than an “ok, do the lapse rate, no stinkin UHI, I gotta hurry off, busy with a bunch of different things, that’s enough to get through review, here ya go” kind of thing. I have very little confidence that isn’t closer to the truth.

“RE: your 20:00:54 comment… Not sure what you mean. What data did u use? Annual averages or monthly? I tried it with annual and did see a bit of a trend.”

I used annual from monthly averages. What trend do you mean? For the ’62 to ’08 station difference? If so, I wouldn’t expect a trend, but a pattern indicating weather shifts. Oh, I made a mistake – it looks like the difference is greater, not less, when there are dramatic ups and downs in temperature. Don’t have a clue as to what could be the reason behind that.

“That would not provide you with overlap. Lapse rates change with the weather. You’d be calibrating Thorndon with 2008 weather.”

“Better than nothing, not as good as having had the overlap in the first place.”

Nothing is not better than nothing. Might as well read tea leaves. You can do that now, though, just as I did with the airport and Kelburn, and lower Thorndon by .9C – or use the Karori station and adjust Thorndon by 75.1C. At least a long-term record of two stations like I did is something; weather comes and goes, but a long average has a better chance of hitting on a more accurate average.

“On the met models, I didn’t mean the GCMs used in modern forecasting. I meant simple stuff to take into account local conditions, based on climatological data like wind patterns, cloud patterns, etc.”

It’s far from simple, and it changes. I wouldn’t want to be the one who spent years developing algorithms and programs just to find out, after firing it up, that it didn’t predict temperatures to within a degree in a week.

I suspect your surmise at the end of the previous comment is closest to the truth.

The adjustment of exactly the standard lapse rate makes me the most suspicious.

“RE: your 20:00:54 comment… Not sure what you mean. What data did u use? Annual averages or monthly? I tried it with annual and did see a bit of a trend.”

If you meant the part about a trend being no more than 0.2C or flat:

The first 14 years (Thorndon) adjusted down 0.6C averages 12.6C.
The last 14 years (Kelburn) adjusted down 0.2C averages 12.8C.
So there is an average increase from those periods of 0.2C.

The linear trends for Thorndon’s first 14 years and Kelburn’s last 14 years are both descending. This, if you trust “trends”, indicates that Thorndon was cooling and Kelburn is cooling, which puts any real increasing trend in doubt – which is why I said Wellington is flat.

Linear trends can be deceptive, and can be more so for a long period than a short. If all data is combined and plotted from 1913 to 2008, a linear trend = 0.6C.
But a little test shows the problem with linear trends such as this.

Take a 20-year sample of temps shown below: the linear trend increases by 0.2C, even though temps end up cooler than at the beginning:

In the temperature series above (Glenn (14:07:12)) we don’t know what the temp before the beginning year was. The temperature lowered, then rose and held to just under the value of the first years. A linear trend of this series shows an increase of about 0.2 degrees.

If the series is extended a hundred years to 2020, all with 11 degrees, the linear trend STILL increases by 0.2 degrees.
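A toy series makes the same point: a record that dips and then holds just below its starting value still fits a positive linear trend, and padding it with a long flat tail keeps the fitted trend positive. The values are made up for illustration, not Glenn’s actual series:

```python
def linear_trend(values):
    """Slope per step of an ordinary least-squares fit of value on index."""
    n = len(values)
    xbar = (n - 1) / 2.0
    ybar = sum(values) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(values))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

# Dip early, then hold 0.2C below the starting value for the rest.
series = [12.0, 11.5, 11.0, 11.0, 11.0] + [11.8] * 15
print(linear_trend(series) > 0)   # True: positive trend despite ending cooler

extended = series + [11.8] * 100  # long flat tail
print(linear_trend(extended) > 0) # still True
```

The early dip sits below the series mean at low x while the long plateau sits slightly above it at high x, so the least-squares line tilts upward regardless of how much flat data is appended.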