Data Abuse

Global warming is really happening. It’s because of us. And it’s dangerous.

But some people can’t accept that, so they deny the truth of it. Perhaps the most popular website to do so consistently is “Watts Up With That” (WUWT), a cornucopia of articles trying to show “evidence” that the climate scientists — the ones who are warning us — have got it all wrong.

To make their claims, they often use climate data. Most climate data make it extremely difficult to deny the reality of the problem, so a recurrent theme is to imply — or say outright — that the data we use are wrong because they’ve been adjusted. It’s implied — or said outright — that the adjustments are arbitrary, that their only purpose is to give the impression of a global warming trend, and they can’t be relied upon.

It may seem unreasonable to refer to the official data as “adjusted”. However, the basis for the official data is what is known as the Global Historical Climatology Network, or GHCN, and it has been arbitrarily adjusted. For example, it is possible to compare the raw data for Cape Town, 1880-2011, to the adjustments made to the data in developing GHCN series Ver. 3:

The Goddard Institute for Space Studies is responsible for the GHCN. The Institute was approached for the metadata underlying the adjustments. They provided a single line of data, giving the station’s geographical co-ordinates and height above mean sea-level, and a short string of meaningless data including the word “COOL”. The basis for the adjustments is therefore unknown, but the fact that about 40 successive years of data were “adjusted” by exactly 1.10 degrees C strongly suggests fingers rather than algorithms were involved.

There has been so much tampering with the “official” records of global warming that they have no credibility at all. That is not to say that the Earth has not warmed over the last couple of centuries. Glaciers have retreated, snow-lines risen. There has been warming, but we do not know by how much.

Setting aside who’s responsible for the GHCN, let’s look more closely at the “adjustment” of data for Cape Town, South Africa. Here are annual averages of unadjusted data:

Does anything look odd to you? It looks to me like a sudden shift — one which contradicts how long-term temperature behaves — occurred between 1960 and 1961.

We can see more by looking at monthly data. We’ll be swamped by the seasonal cycle, unless we remove it, so we’ll take each month’s value and subtract the average for that same month to generate anomaly values. We get this:
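The anomaly calculation just described is simple enough to sketch in a few lines of Python. The data below are fake (two years with a strong seasonal cycle); the real Cape Town series would come from GHCN:

```python
import numpy as np

def monthly_anomalies(temps):
    """temps: 1-D array of monthly mean temperatures, starting in January.
    Returns anomalies relative to each calendar month's own average,
    which removes the seasonal cycle."""
    temps = np.asarray(temps, dtype=float)
    anomalies = np.empty_like(temps)
    for month in range(12):
        idx = np.arange(month, len(temps), 12)  # all Januaries, all Februaries, ...
        anomalies[idx] = temps[idx] - np.nanmean(temps[idx])
    return anomalies

# Two fake years; the second year is uniformly 1 degree warmer than the first.
fake = np.array([20, 20, 18, 15, 12, 10, 9, 10, 12, 15, 18, 20,
                 21, 21, 19, 16, 13, 11, 10, 11, 13, 16, 19, 21], dtype=float)
anoms = monthly_anomalies(fake)
print(anoms[:12])  # each value is -0.5: year one sits 0.5 below each month's mean
```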

Now it’s even more clear that a sudden shift occurred, one which is not physical. It probably represents a non-climate influence, like moving the station to a different location.

That’s why the data were adjusted. Because if you want to study climate, it’s necessary.

After adjustment, the monthly anomalies look like this:

With the discontinuity corrected for, we have a valid picture of how the climate has changed at this location (at least as far as temperature is concerned). The adjustment isn’t “arbitrary,” the basis for the adjustment is known, and it didn’t involve using “fingers,” it used an algorithm.
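The kind of correction described here, aligning the segments on either side of a known break, can be sketched as follows. This is a deliberately crude version (real homogenization algorithms compare against neighboring stations); the break location and step size in the synthetic data are made up:

```python
import numpy as np

def correct_step(anoms, break_idx, window=120):
    """Shift the pre-break segment so its mean near the break matches the
    post-break mean: a crude single-station step correction."""
    anoms = np.asarray(anoms, dtype=float)
    before = anoms[max(0, break_idx - window):break_idx]
    after = anoms[break_idx:break_idx + window]
    offset = np.nanmean(after) - np.nanmean(before)
    adjusted = anoms.copy()
    adjusted[:break_idx] += offset  # move the older data onto the new baseline
    return adjusted, offset

# Synthetic anomaly series with a -1.1 degree step at index 50:
rng = np.random.default_rng(0)
series = rng.normal(0, 0.2, 100)
series[50:] -= 1.1
adjusted, offset = correct_step(series, 50, window=50)
print(round(offset, 2))  # recovers a value close to -1.1
```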

Even when the folks at WUWT don’t raise a fuss about data being adjusted, they still manage to abuse it. Willis Eschenbach has a post at WUWT which looks at sea ice, using the data from HadISST (the Hadley Centre/Met Office Ice and Sea Surface Temperature data set). He isolates the data after 1974, because that gives him this:

That enables him to claim that global sea ice area doesn’t show any overall trend, rising or falling, since 1974.

It looks like the data Eschenbach is using show a sudden shift around 2009, one which needs to be corrected for but hasn’t been in the HadISST data set. If you go looking, you’ll find that the Met Office itself reports thus:

08/MARCH/2011. The switch of satellite source data at the start of 2009 introduced a discontinuity in the fields of sea ice in both the Arctic and Antarctic.

When one post at WUWT doesn’t like the use of data that have been corrected for a non-physical shift, that’s a problem — so it’s time for them to imply that the adjustment is “arbitrary,” that it was done with “fingers,” and that the data “have no credibility at all.” But when another post likes what the data show precisely because a needed adjustment is missing, even though the result contradicts common sense: no problem at all.

You state, on the adjusted data example, that it “probably represents a non-climate influence, like moving the station to a different location.” THAT is not an argument or an explanation. To effectively respond to the claims that data have been arbitrarily adjusted, you need to provide an actual, verifiable, quantifiable reason for that adjustment.

[Response: I did. I showed the actual, verifiable, quantifiable non-physical discontinuity in the data. It’s obvious. Do I really have to do the changepoint analysis to convince you?]
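For what it’s worth, the changepoint analysis mentioned in the response can be sketched minimally: pick the split point that best fits a two-segment piecewise-constant model. The data below are synthetic, not the actual Cape Town series:

```python
import numpy as np

def find_break(x, min_seg=12):
    """Return the split index that minimizes total squared error of a
    two-segment, piecewise-constant fit (the simplest changepoint search)."""
    x = np.asarray(x, dtype=float)
    best_idx, best_sse = None, np.inf
    for k in range(min_seg, len(x) - min_seg):
        sse = (((x[:k] - x[:k].mean()) ** 2).sum()
               + ((x[k:] - x[k:].mean()) ** 2).sum())
        if sse < best_sse:
            best_idx, best_sse = k, sse
    return best_idx

# 80 months at one level, then 60 months shifted down by 1.1 degrees:
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 0.3, 80), rng.normal(-1.1, 0.3, 60)])
print(find_break(x))  # lands at (or very near) index 80
```

With a step this large relative to the noise, the detected break is unambiguous, which is the response’s point: the discontinuity is obvious even without formal statistics.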

Actually it’s the other way round – in order to insinuate it could have been anything other than a non-climate-related reason, you need to show us the actual, verifiable sources – which would inevitably need to exist – that report some sort of catastrophic event that abruptly changed the climate in Cape Town around 1960…

Tamino – you don’t have to convince me. You DO, however, have to convince those who’ll argue that you’ve ‘doctored’ the data for your own ‘nefarious’ purposes. If you can’t identify why the data set ‘dropped’ at the point of your adjustment you’ve lost the argument. You’re forgetting that the ‘audience’ for climate change is NOT a group of data analysts but the general public who’ll rightly expect a genuine and instance-specific explanation for the flaw in the unadjusted data. Generalizations such as it’s PROBABLY caused by “a non-climate influence, like moving the station to a different location” will only convince those who worry about data manipulation (in general) that something ‘fishy’ is going on.
That’s, of course, only relevant if you’re in the business of developing the public’s confidence in the process. If you’re only in it for a circle jerk, well, that’s already been achieved.

Random, thanks for the reply. I disagree with that comment. You have to put this in the context of gaining public support and acceptance of the facts. Data is data. If you are altering that data, you need to be able to demonstrate a valid reason why that alteration is necessary; “because it doesn’t fit the pattern” is not a justification. Remember, we live in a time when there is enormous distrust of ‘the establishment’ and our public (and private) institutions. If you were to say, however, that records indicate that the collection site was moved in such and such a year, or equipment was replaced, etc., and this coincides with the discontinuity, the argument is made. You aren’t going to convince a largely innumerate population that everything is above board without that.

Andy, if you need to convince people you are acting in good faith, you won’t succeed. The people who think you aren’t acting in good faith on the Cape Town data could find out that the adjustment was absolutely 100% justified, but they would still carry on about every other adjustment as though it hadn’t been justified.
If you are like me, you just notice that whenever an adjustment has been examined by a person with the competence to do so, then it has been justified, and you draw the logical conclusion that you can accept the rest as likely to be correct.
The people you are trying to convince (deniers) are too lazy to look at anything. They behave in a strangely non-Bayesian way. When faced with the latest attempted denier deception, what they should do is say, “What is the chance of this being correct, given that the last 40 blog posts like this turned out to be wrong?”. But somehow they exist in a world where every new piece of evidence is examined in splendid isolation.
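The Bayesian point above can be made concrete in a few lines. The 50/50 prior and the run of 40 failed claims are the commenter’s hypothetical numbers, not survey data:

```python
# Start from a uniform Beta(1,1) prior on the "hit rate" of WUWT-style
# claims, then update on an observed track record of 0 correct, 40 wrong.
alpha, beta = 1.0, 1.0   # uniform prior: claims are as likely right as wrong
correct, wrong = 0, 40   # the track record the commenter describes

# Posterior mean of a Beta-Binomial model:
posterior_mean = (alpha + correct) / (alpha + beta + correct + wrong)
print(posterior_mean)  # 1/42, about 0.024: the rational prior for claim 41
```

In other words, someone updating on the evidence would give the 41st such claim roughly a 2% chance of being right, rather than examining it “in splendid isolation.”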

Andy, we’re in disagreement here. You can’t convince somebody who *wants* to believe there’s something wrong with the data.

And I disagree with you that – apart from those – there are actually that many people who really wouldn’t instinctively recognize that, no, Cape Town actually didn’t fall into a singularity of nosediving temperatures around 1960.

And I do criticize your approach that makes it too easy for the ‘hoaxers’ to weasel away from their insinuations. I’m all for calling them out for their purported reasons how and why this could and would be a climate-related thing (so that their ‘doctoring with the data’ claim could stand).

Andy,
Why is there always this very strong double standard? A denier with a degree, who should know better, produced a graph at WUWT without any support, claiming that the data had been unfairly manipulated. His graph is deliberately deceptive. There is so much crap on his graph that it is not possible to see the obvious change in 1960. He then claims that the data have been unfairly manipulated. Meanwhile you say that scientists have to provide proof that there is not a problem.

Tamino starts off with a clear graph that is obviously affected in 1960. He does not even need to do any statistics to see where the break point is. After a few posts here, someone on the internet has provided the metadata showing that the station was moved in 1960. This explains the whole problem. If the data had been clearly displayed in the first place, everyone at WUWT would have seen the problem without the slander.

This metadata was readily available to the original poster at WUWT; he simply contacted the wrong person to get it. The issue is that he is deliberately misleading the public. The call should be to take action against those who deliberately mislead the public, not for scientists to spend their valuable time chasing down idiots on the web. How many times does Tamino have to show that WUWT is unreliable before they are dismissed from the conversation?
Why do scientists have to prove that WUWT is wrong time after time? It is much more time consuming to do it correctly than to make unfounded accusations.

Station relocations are very common (Victor Venema has written quite a bit about this), and not always documented. Anyone wanting to make a data quality argument based on unadjusted series really ought to pick a station and period documented to be consistent.

Anyway, Cape Town is pretty simple: the current data come from the Cape Town International Airport, built in 1954, with the temperature series starting there in 1960. It’s a particularly poor choice for Lloyd’s argument since it has a documented relocation. Lloyd really ought to know that, as he was a college student in Cape Town when the airport was built, and every WUWT regular surely knows that relocations to airports are common.

Tamino – you don’t have to convince me. You DO, however, have to convince those who’ll argue that you’ve ‘doctored’ the data for your own ‘nefarious’ purposes. If you can’t identify why the data set ‘dropped’ at the point of your adjustment you’ve lost the argument. You’re forgetting that the ‘audience’ for climate change is NOT a group of data analysts but the general public who’ll rightly expect a genuine and instance-specific explanation for the flaw in the unadjusted data.

Actually, I know for a fact that scientists have been manipulating temperature to *hide* the warming trend, not exaggerate it. And I can prove it by performing some inverse-WUWT cherry-picking.

The data for all of the stations listed below have been adjusted to reduce or even eliminate long-term warming trends by warming the past and/or cooling the present. (With the assistance of an interactive app, it took me just a few minutes to find them all).

St. Helena Is: Lat -16, Lon -5.7, WMO ID 61901000
LUANDA: Lat -8.85, Lon 13.23, WMO ID 66160000
LIHUE, KAUAI: Lat 21.98, Lon -159.35, WMO ID 91165000
HONOLULU: Lat 21.35, Lon -157.93, WMO ID 91182000
SAO PAULO: Lat -23.5, Lon -46.62, WMO ID 83781000
RIO DE JANEIR: Lat -22.92, Lon -43.17, WMO ID 83743000
COIMBRA: Lat 40.2, Lon -8.42, WMO ID 8549000
KAGOSHIMA: Lat 31.55, Lon 130.55, WMO ID 47827000
TOKYO: Lat 35.68, Lon 139.77, WMO ID 47662000
PETROPAVLOVSK: Lat 52.98, Lon 158.65, WMO ID 32583000
CWC VISHAKHAP: Lat 17.7, Lon 83.3, WMO ID 43149000
QUETTA CANTONMENT PAKISTAN: Lat 30.2, Lon 67, WMO ID 41661001

Now, I (as a member of the general public) demand that tamino and all of his fancy-pants scientist friends prove to my satisfaction that scientists didn’t manipulate that data to hide the global-warming trend. And I want to be convinced without having to make any effort on my own to understand anything about temperature data homogenization and averaging techniques. I don’t want to do any of my own homework; I want tamino and his friends to serve everything up to me in a sippy-cup.

If tamino can’t identify exactly why the data collected from the above stations was artificially cooled, at the point of the adjustments, then as far as I’m concerned, he’s lost the argument.

Now, given all the temperature stations I listed that show data-tampering to reduce or eliminate warming, how could tamino possibly convince someone like me (i.e. a member of the “general public” who isn’t about to mess with any of that fancy book-learnin’ stuff) that scientists aren’t actually trying to hide the global-warming trend?

Philip Lloyd: The Goddard Institute for Space Studies is responsible for the GHCN.

So much for competence: the GHCN was developed at the Oak Ridge National Laboratory and the National Climatic Data Center, and it is maintained by the NCDC. He should have asked NCDC for the metadata.

I don’t think you will ever persuade the readers of WUWT and the contributors to it that they have it wrong; they just can’t see it as anything other than conspiracy or incompetence. It is a shame, really, that climate change requires a vote at all at the election, and that the facts from experts and scientists are not enough (seemingly) to get something done about it. However, I would say that the Paris agreement demonstrates that science has won out to some degree, and that plans are coming together to tackle the issue — maybe not 1.5 or 2 °C, but far better than the 5-6 °C we are heading for.

I think what Andy was saying is that, in addition to the (obvious) statistical rationale for the adjustment, it would be helpful to also state the physical, real-world reason for the discontinuity. Was the station moved? Did they change instruments or protocols, etc.?

It would be nice, but AFAIK it’s not generally readily available information–there are thousands of GHCN stations, and reporting has to go through several layers of bureaucracy before it even gets to GHCN, as stations belong to their national weather services. And consider the Cape Town case–anyone old enough to remember the station history is probably retired, which means that the information would have to be prised from an archive somewhere–and that’s the best case, in which the records are still intact and accessibly archived.

That said, I’m very skeptical of the claim that the metadata file is actually “meaningless”–cryptic, sure, but does the WUWT writer really think that someone typed in some random keystrokes just for the hell of it? (I know, I probably should have terminated that sentence with the verb “think.”) I wouldn’t be at all surprised to find out that the metadata key could be had for the asking, nor indeed that it’s already out there in a readme file.

Cryptic, yes, but its meaning is fully explained in the GHCN readme. Anyone who’s ever played around with GHCN data should have come across stuff like this before, even if they never needed to decipher it in full.

Thanks. Very interesting to poke through. For the curious but time-challenged, I’ll just say that the codes reveal that the station is coastal and urban, located at an airport at an elevation of 12 meters, about 1 kilometer from the ocean, in a grid with an estimated elevation of 65 meters, vegetation of the “cool forest” type, hilly terrain, and that the population of the surrounding urban area was 834,000.

There seems to be a bit of an anomaly: if Henk, below, is right, then this would be the Royal Observatory station, described here:

But there’s no airport there, which suggests that the station was moved (perhaps to DF Malan airport?) without the elevation being updated in the metadata file. That, hypothetically, would be a shift from 12 meters to 44 meters elevation.

Andy – can you identify any stations that have a break of similar amplitude and duration to Cape Town, where site parameters (location, instrumentation, TOB, etc.) are known to be stable? A few examples of this would buttress your skepticism.

Barry, thanks for the reply. No- not a clue. That would require someone with a lot more knowledge of the system of recording and collection than me. I just subscribe to this site and comment when an article interests me. As a former skeptic, my comments are coming from the position of one who needed to be convinced and, as such, I tend to have a look at what’s needed to convince others. On this issue of data adjustment vs manipulation, it’s going to be a very hard sell when those justifying the adjustments can’t point to a ‘real-world’ reason for its necessity. I acknowledge the difficulties of that as other respondents have pointed out. That doesn’t change the reality of what’s needed to have people buy into and accept the process.
Cheers

We all know how untrustworthy the verbiage pasted up on Wattsupia can be, or perhaps more accurately how untrustworthy it invariably proves to be. And that Cape Town GHCN station has not gone without mention in the past (e.g. within this pile of gobshite). Mind, the dumb-ass who tapped out the drivel we examine in this post is from that neck of the woods, so he may feel it is particularly important and so can be (mis)used to dismiss the entirety of the instrument-based global temperature records.
But we may not be so familiar with that particular dumb-ass drivel-tapper, Professor (Hon) Philip Lloyd of Cape Peninsula University of Technology.
I was quite impressed by the level of incompetence exhibited within a paper written by this Phillip J. Lloyd last year. In it he takes the ice core data from the poles over the Holocene (GISP, GISP-2, Vostok, Dome C) to obtain four temperature series comprising data with ~100-year period. He de-trends them by various means to leave residuals, and proclaims the standard deviation of the residuals to be a measure of the actual global warming (or cooling) over a 100-year period that can be expected from natural variation. “During the 20th century, thermometers recorded an increase of about 0.7ºC. It seems reasonably certain that there was some warming due to the increasing buildup of greenhouse gases in the atmosphere, but it seems difficult to estimate the magnitude of this warming in the face of a likely natural variation of the order of 1ºC. The signal of anthropogenic global warming may not yet have emerged from the natural background.”
(His latest drivel, of course, denies the “increase of about 0.7ºC” is real.)

Station moves are a *big* reason adjustments are needed — and it’s pretty easy for someone with programming experience to show that at a minimum, hundreds of stations in the GHCN temperature data-set must have been moved.

If you drill down into the GHCN data/metadata, you will find on the order of 400 “airport” stations that have temperature records going back before airports even existed. It is quite obvious that those stations had to have been moved at some point in their history.

If you compute global-average temperature estimates from the raw GHCN data (using all stations) and plot your own results against the NASA “meteorological stations” results (NASA uses adjusted GHCN data), you will see a visible bias between your results and the NASA results. It’s not that big, but it is visible.

Then if you repeat your own raw data computations, this time *excluding* the “airport” stations, you will see a visible reduction in the bias between your results and the NASA met-stations results.

Of course, this exercise does not account for all of the “moved” stations, and you almost certainly throw out a lot of stations that *haven’t* been moved. But it’s an easy “first cut” analysis exercise to try, and it does produce useful results.
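The “first cut” described above can be sketched as a simple metadata filter: flag stations whose names mark them as airports but whose records begin before aviation existed. The station tuples below are illustrative placeholders, and the name-matching heuristic is an assumption; the real GHCN inventory is a fixed-width file with its own airport flag, documented in its README:

```python
def suspect_airport_stations(stations):
    """stations: iterable of (station_id, name, first_year) tuples.
    Returns IDs of 'airport' stations whose records start before 1920,
    i.e. stations that must have been relocated at some point."""
    return [sid for sid, name, first_year in stations
            if "AIRP" in name.upper() and first_year < 1920]

# Hypothetical demo records (IDs and years invented for illustration):
demo = [
    ("10160355000", "SKIKDA", 1881),
    ("42572509000", "BOSTON/LOGAN INTL AIRPORT", 1893),  # predates aviation
    ("50194610000", "PERTH AIRPORT", 1944),
]
print(suspect_airport_stations(demo))  # only the 1893 'airport' is flagged
```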

It turns out that running GHCN temperature data through even a very simplistic gridding/averaging program will produce results that match the NASA “meteorological stations” results amazingly closely. This really is a case where less than 1 percent of the effort will get you 98 percent of the answer. Here’s how the results I got when I ran adjusted GHCN data through a very simple baselining/gridding/averaging program stack up against the official NASA results (computed from the same data): https://drive.google.com/drive/my-drive (my results in blue, NASA’s in red).
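To give a flavor of what “very simplistic gridding/averaging” means: average station anomalies within grid cells first (so dense regions don’t dominate), then combine cells weighted by the cosine of latitude (so cells cover comparable areas). The sketch below grids by latitude band only and uses three made-up stations; it is not the commenter’s actual program:

```python
import numpy as np

def grid_average(lats, anoms, cell=5.0):
    """Average station anomalies within latitude bands, then combine the
    bands weighted by cos(latitude). Longitude gridding is omitted for
    brevity; a real program would use full lat/lon cells."""
    lats, anoms = np.asarray(lats, float), np.asarray(anoms, float)
    edges = np.arange(-90, 90 + cell, cell)
    means, weights = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (lats >= lo) & (lats < hi)
        if mask.any():
            means.append(anoms[mask].mean())               # cell average
            weights.append(np.cos(np.radians((lo + hi) / 2)))  # area weight
    return np.average(means, weights=weights)

# Three fake stations; the first two share a band, so they are averaged
# into one cell value before the area-weighted combination:
print(round(grid_average([2.0, 3.0, 52.0], [0.5, 0.7, 1.0]), 3))
```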

Just a quick note on that — I didn’t get a single positive response from any of the UT San Diego forum “skeptics” to temperature results I posted there. Not one. In fact, the resident forum “skeptics” responded by claiming that I got the results I did because NOAA and NASA had manipulated the temperature data — even the raw data.

At this point in time, I think it’s pretty safe to conclude people who are still climate-science “skeptics” will never be convinced, no matter what information tamino and other scientists provide them.

Here’s a suggestion for you: since you’ve always done little other than appeal to authority, and scream out words such as “RENOWNED,” “ACCREDITED,” and “PEER REVIEWED” regarding your eco-scientist menagerie, along with pejoratives directed toward skeptics on a daily basis, why not publish your C++ source code here via text or link (preferably with comments and perhaps YOUR algorithm), along with your open office spreadsheet template, so that it too can be reviewed? There are many people here who are software savvy, and in some cases have access to professionals who are system architects, analysts, or IT auditors that can give it a once or twice over “smell test” for validity and lack of self-serving massaging…

The AGW skeptics commenting in my local and regional newspapers are every bit as dimwitted as yours, but I haven’t seen an exchange as unintentionally comic as this one for a while:

BitterPill: I seem to remember the industrial revolution began around 1880 with the widespread use of steam engines, maybe a bit earlier.
Yahtahei: You may be right about the timing of the industrial revolution, but steam does not emit much CO2.

The industrial revolution started long before the invention of the steam engine. New England businessmen, emulating factories they saw in England, built large factories that used dams and water wheels to power machines that made thread, rope, cloth, etc. Not a chunk of coal or a wisp of steam needed. The use of assembly lines began as early as the Romans. The company that became ExxonMobil shipped product all over the world in sail-powered tankers, and delivered the product to customers in horse-drawn tankers.

Because these stations are located at different heights, they report different annual mean temperatures (Tm). But all reported temperatures were a bit higher in the couple of years after 1960 than before.

the dedicated will fight, the wise will smile and let nature teach them the lesson. one could say: but all the poor people who will suffer from all this. i say, and even jesus said, that only very few will be salvaged, and i don’t mean this religiously; i just mention it so that some of the politically correct don’t think it takes a moron to make such a bold statement. people (we all) will get what we deserve, at least from the energetic point of view, which means that some will get away with it in this life and pay later. uhhhh… i can see how some heads start fuming now, so take it as an opinion and an attempt to put the theory into a few dozen words (not possible) cheers and enjoy