Hansen Step 2 – First Thoughts

The source code for Hansen’s Step 2 (the “urban adjustment” step) is online. If anyone has been able to run the program through to Step 2, I’d be interested in some stage results for the stations discussed here. The verbal description is not clear and the code is a blizzard of old-fashioned Fortran subscripts, so it will take a little while to translate the procedure into modern languages and see what he’s doing.

Hansen et al 1999 says:

An adjusted urban record is defined only if there are at least three rural neighbors for at least two thirds of the period being adjusted. All rural stations within 1000 km are used to calculate the adjustment, with a weight that decreases linearly to zero at distance 1000 km.

In the stations that I’ve looked at, I’ve seen adjusted stations being calculated when the above condition doesn’t seem to hold. So any light that can be shed on the procedures would be appreciated.

Stations within 1000 km
The first step in Hansen’s Step 2 is the calculation of rural stations within 1000 km of each urban station to be adjusted. The archived script shows that this calculation is done from scratch in each run. No particular harm in that, although the information presumably remains the same in every run. I’ve compiled the list of “Hansen-rural” stations for each of the 7364 stations, including in each list the id, name, lat, long, start_raw, end_raw, start_adj, end_adj, distance (from target), GISS-population, GISS-urban and GISS-lights. I archived the result at http://data.climateaudit.org/data/giss/stat_dist.tab . The information is somewhat redundant to the station information lists, but it was handy having some of it directly accessible to aid analysis. The object is 33 MB. The script to do this is at http://data.climateaudit.org/scripts/station/hansen/step2A.txt
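For anyone trying to replicate this neighbor-list calculation in another language before the Fortran is untangled, here is a minimal Python sketch. The station fields (`flag`, `lat`, `lon`, `id`) are a hypothetical layout of my own, not the GISS file format, and the haversine formula stands in for whatever great-circle distance the GISS code uses:

```python
from math import radians, sin, cos, asin, sqrt

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in km."""
    R = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def rural_neighbors(urban, stations, radius_km=1000.0):
    """Return (distance, id) pairs for rural stations within radius_km
    of the urban station, sorted nearest first."""
    out = []
    for s in stations:
        if s["flag"] != "R":  # keep only stations flagged rural
            continue
        d = great_circle_km(urban["lat"], urban["lon"], s["lat"], s["lon"])
        if d <= radius_km:
            out.append((d, s["id"]))
    return sorted(out)
```

Running the same filter once per urban station would produce something like the stat_dist list structure described above.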

In order to analyze Hansen’s actual urban adjustment mechanism, it’s nice to identify the stations with only a few contributing rural stations. There’s a very pretty R function that can do this in one line. The object containing the information, stat_dist.tab, is structured as a list in R with 7364 items, one for each GISS station. Each item is an R data-frame – which is structured like a matrix, but whose columns can be of different types – a very handy thing to be able to use. To obtain the number of rows for the 1000th station in the list, you can use the command

nrow(stat_dist[[1000]]) #86

To obtain the number of rows for all 7364 stations, all you need to do to get a vector is:

stat.length=sapply(stat_dist,nrow)

Just like magic. No blizzard of Fortran subscripts and 5 pages of programming.

In these three cases, I checked and there was no “adjusted” series for any of the 3 sites, which, in this case, complies with the Hansen et al 1999 3-rural station criterion. I then checked “U” sites with 3 neighbors, all in India, Brazil and New Zealand and again didn’t obtain any adjusted series.

For the first two sites, there was no adjusted series, but for the 3-5 series there were adjusted series. Batticaloa and Hambantota are very close and it turns out that their 3 R neighbors are identical. So based on the apparent Hansen adjustment process in which the urban station trends are supposedly coerced to the rural reference stations, one would expect similar adjusted trends. This proves not to be the case. Why – I’m not sure right now and would welcome any thoughts.

The three rural stations for Batticaloa and Hambantota are shown below – note that the distances from the two urban sites to the three rural comparanda are similar and in the same order.

The figure below shows the annual temperature values of the three rural stations. Two of them end in 1980 and only one (Minicoy) continues to the present. I tested the proportion of years with at least 3 stations against the number of years of adjusted record and found that 2 of the 3 series failed the test. So it would be interesting to locate exactly where Hansen implements the 2/3 criterion in his code. I haven’t been able to do so yet. One also sees that, in this case, much depends on the Minicoy station as the only one continuing to the present.
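For what it’s worth, here is the 2/3 test as I applied it, sketched in Python; the function name and the year-to-count mapping are my own framing of the Hansen et al 1999 wording, not anything taken from the GISS code:

```python
def passes_two_thirds(rural_counts, adj_years, min_stations=3, frac=2/3):
    """Hansen et al 1999 criterion as I read it: at least min_stations
    rural neighbors must be available for at least frac of the period
    being adjusted. rural_counts maps year -> number of rural stations
    reporting; adj_years is the span of the adjusted record."""
    ok = sum(1 for yr in adj_years if rural_counts.get(yr, 0) >= min_stations)
    return ok >= frac * len(adj_years)
```

On this reading, a record whose rural-station count drops to 1 after 1980 can still pass if enough of the earlier adjustment period has 3 stations, which is why locating the actual test in the code matters.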

Now for today’s puzzle – showing the dset=1 and adjusted versions of Hambantota and Batticaloa. How does Hansen get such different looking adjustments from identical rural comparanda? If anyone can do runs of the actual code for these stations and save any intermediate work, it would help. (Also Wellington NZ.)

113 Comments

I’d suggest that, in general, stations in mountainous areas be adjusted by “nearby” stations on a much more careful basis than those on the plains (say by similar monthly trend conformity). Those on the plains should be adjusted by some sort of correlation of location based on prevailing wind direction (high weather correlation) and that station location compared to mean jet stream location be used to decide just which “nearby” stations be used for adjustments (being north of the JS in the NH is quite different than being south of the JS wrt “climate”- especially on a month to month basis).

Take a place like California (or, for that matter, New Zealand, or Central Europe). The topography is like a washboard. It would be nearly impossible, without hair-pulling, painful, station-by-station review, to have any hope that, for any given highland or lowland station, “nearby” stations would ever be all similarly highland or lowland, let alone have the same general climate characteristics as other highland or lowland stations. As an added complication, in a place like California, we actually have Marine West Coast, Mediterranean, Midlatitude semi-arid and arid, and Subtropical Arid all in one state. One part of the state could be getting blasted by cP while the other part is kicking back in mT. Personally, I think the whole thing is completely discredited unless you happen to be sitting equidistant from the Rockies and Appalachians, or in the middle of the Eurasian steppes. But in most places, you’re into a radically different climate within 1000 km.

6, that’s supposed to be in there. The bigger question is, what do these stations have to do with each other? For example, let’s take Los Angeles, San Francisco, and Death Valley. Death Valley has more of an influence on LA than SF does under Hansen’s scheme. Which two of the three do you think are actually more related?

More bafflegab from Hansen. It looks like Hansen is playing the community of skeptics like a violin, metaphorically a replica of the one Nero used.

Using stations as much as 1000 km apart is absurd.

How about generating a trend from each of the stations used in the global analysis individually? First, just the raw data. Then redo the trend-lines using the various adjustments: TOB, site and instrumental changes, FILNET, etc. Then do a statistical analysis of the trend-line slopes, and then a climatological (not a statistical) analysis of meteorological factors affecting each site.

My guess is that a major percent of the useful information will be derived from the raw data directly. At the very least, interested observers should get a chance to view the raw data before less than objective alterations are made.

Especially when the weighting factor is linear. It seems to me that it would make more sense to use some sort of inverse square of the distance as a weighting factor, which would give much more weight to rural stations well within the 1000 km and much less to stations near the 1000 km limit.
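The two weighting schemes being contrasted are easy to write down. The linear form is the one stated in Hansen et al 1999; the inverse-square form, including its floor distance `d0`, is a hypothetical version of the commenter’s suggestion (some floor is needed so very nearby stations don’t get unbounded weight):

```python
def linear_weight(d, r=1000.0):
    """Hansen-style weight: decreases linearly to zero at distance r (km)."""
    return max(0.0, 1.0 - d / r)

def inverse_square_weight(d, d0=100.0):
    """Inverse-square alternative: weight 1 at or inside d0 km,
    falling off as (d0/d)^2 beyond it."""
    return (d0 / max(d, d0)) ** 2
```

At 900 km the linear scheme still assigns a weight of 0.1, while the inverse-square scheme assigns about 0.012, so the far edge of the 1000 km circle would matter far less.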

The only adjustments that are made for altitude are made when stations change altitude.

I think you misunderstand what Hansen is doing. I will illustrate with a simple case.

Assume a station at sea level. Assume its temperature never changes; call it X. Now measure the temperature 1 km above this same spot. The ABSOLUTE temp 1 km higher will be X - 6.5C, i.e. 6.5C lower due to the lapse rate.

The question Hansen asks is not what the absolute temp is, but rather what the trend is. So, in his model you do not need to correct for altitude EXCEPT for those cases where the altitude changes during the recording. So, if my sea-level site is moved up to 500 m then I need to adjust. And if it moves up to 1000 m then I need to adjust again.

He isn’t after absolute temp. He is after the change. For a while I played with removing altitude biases; trends don’t change, obviously, but errors get smaller.

Trend, trend, trend. That is all Hansen cares about. It’s not about causation. His ’87 study claims a correlation of R² = 0.5 at 1200 km. That’s the analysis that needs to be investigated. Clearly 1200 km or 1000 km is not the best in all circumstances; in fact Hansen noted a difference between NH correlation at 1200 km and SH correlation.

Weather morphs into climate. I’m not sure of the dimensions of the average weather system (it may vary strongly by latitude), but it would seem to be on the order of 300 to 600 mi. (500 to 1000 km) in radius. That’s an off-the-top-of-my-head estimate; I’ll have to do slightly more study to even come up with an educated cocktail-napkin estimate.

“The ABSOLUTE temp at 1KM higher will be X-6.5C lower due to the lapse rate.”

That’s assuming you move straight up in open space. But is that the same as moving to a location on land which is at the same elevation? Surely there are differences in temperature measured in free space at a particular altitude above sea-level, and a temperature measured on land that is at the same elevation. Does altitude adjustment account for this difference?

From what I understand, lapse rates do vary from place to place. But Hansen uses an estimate of the average lapse rate: 6C per km. Again, suppose you have a station on land at sea level and for 50 years temp is recorded there. Let’s say the average is 14C. Then they move the station up a hill, say 100 m high. The average lapse rate would suggest that the site would then report temps of 13.4C. Hansen uses altitude to adjust for station changes only. He would cool the first 50 years of the record by 0.6C.

Since he is looking for trend and not some absolute temp, the approach makes sense. Otherwise you would see a false, non-climatic cooling at that station. The opposite would happen if you moved the site down the hill later.
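The 100 m example works out as described. Here is a sketch of the adjustment for a documented station move (my reading of the described procedure, not Hansen’s code; the 6 C/km figure is the average lapse rate quoted above):

```python
LAPSE_RATE_C_PER_KM = 6.0  # average lapse rate quoted above

def adjust_for_move(temps_before_move, height_change_m):
    """Shift the pre-move segment of a record so that a documented
    station relocation does not appear as a spurious temperature step.
    A move UP by height_change_m makes later readings cooler, so the
    earlier segment is shifted down by the same lapse-rate offset."""
    offset = LAPSE_RATE_C_PER_KM * height_change_m / 1000.0
    return [t - offset for t in temps_before_move]
```

For the 100 m uphill move, the offset is 0.6 C, turning the old 14 C average into an adjusted 13.4 C, consistent with the comment.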

The weather is different, to be sure, but Hansen is interested in the long-term climate and how the averages change over time.

So what matters to him is how things average out. A simple example: let’s say that Duluth had an average Jan temp of 8C in 1900 and in 2006 it averaged 9C, while St Louis was 12C in Jan of 1900 and 13C in Jan of 2006. Hansen “averages” these figures to see what the average trend in temperature is.
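The Duluth/St Louis arithmetic can be made explicit; this is just the trivial averaging of per-station changes the comment describes, not GISS code:

```python
def average_trend(records):
    """records: (old_temp, new_temp) pairs, one per station.
    Returns the mean change; the absolute levels cancel out, which is
    the point of working with trends rather than temperatures."""
    return sum(new - old for old, new in records) / len(records)
```

With Duluth going 8 C to 9 C and St Louis 12 C to 13 C, the average trend is +1 C even though the two absolute levels differ by 4 C.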

Thanks for your explanation, but my question was really whether a lapse rate applied to “elevated land” is legitimately applied. Lapse rate is defined for open space atmosphere, is it not? But doesn’t the atmosphere somewhat follow the contour of the land? In that regard, is lapse rate applied correctly when it is applied to a land surface that is elevated?

I need someone to explain this to me: if the urban heat island effect is so significant as to require adjusting the temperature data, why use it at all? Why not run only the rural data without the urban areas and use that as a baseline for a graph? Then run the urban areas and compare the two to determine the heat island effect? Has anyone done this? I would like to see what the real GAT graph looks like without all this adjusting and tweaking. While someone might object to not having all the areas (60 x 60 mile squares) to determine the GAT, having corrupted data is worse than having none at all. At least when you average the data you have, you can account for the areas you don’t have; think of it the way Microsoft Excel averages a group of cells: if a cell is blank, it disregards the cell when determining the average or mean, depending on the function.
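The Excel analogy in the comment is just a missing-value-aware mean; a minimal sketch:

```python
def mean_ignoring_missing(values):
    """Average a cell's station values the way Excel's AVERAGE skips
    blank cells: None entries are left out of both sum and count."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present) if present else None
```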

Then I’m confused by the graphs for Hambantota, which appear to be in Celsius. Is Hansen using the trends to adjust the actual temps and then creating the trend for the urban station from that, or is he using the trends to adjust the trend of the urban station directly?

No matter how he’s doing it, it doesn’t seem reasonable to assume that the trends at different latitudes and altitudes would be the same trend, except over very long time scales (much more than Hansen has data for).

It appears that you can output XML but would have to install an add-on package, then define a DTD for your output and use that to tell R how to generate the XML. From there you have a file that you can manipulate with XSLT, not to mention having your output in the most flexible cross-platform/application format around.

We analyze surface air temperature data from available meteorological stations with principal focus on the period 1880-1985. The temperature changes at mid- and high latitude stations separated by less than 1000 km are shown to be highly correlated; at low latitudes the correlation falls off more rapidly with distance for nearby stations. We combine the station data in a way which is designed to provide accurate long-term variations. Error estimates are based in part on studies of how accurately the actual station distributions are able to reproduce temperature change in a global data set produced by a three-dimensional general circulation model with realistic variability. We find that meaningful global temperature change can be obtained for the past century, despite the fact that the meteorological stations are confined mainly to continental and island locations. The results indicate a global warming of about 0.5-0.7°C in the past century, with warming of similar magnitude in both hemispheres; the northern hemisphere result is similar to that found by several other investigators. A strong warming trend between 1965 and 1980 raised the global mean temperature in 1980 and 1981 to the highest level in the period of instrumental records. The warm period in recent years differs qualitatively from the earlier warm period centered around 1940; the earlier warming was focused at high northern latitudes, while the recent warming is more global. We present selected graphs and maps of the temperature change in each of the eight latitude zones. A computer tape of the derived regional and global temperature changes is available from the authors.

It looks like all of the logic for processing the rural stations within 1000 km of an urban station is in PApars.f, which is in the STEP2 subdirectory. Here are the comment lines at the top of the FORTRAN listing:

C*********************************************************************
C ***
C *** Input files: 31-36 ANN.dTs.GHCN.CL.1 … ANN.dTs.GHCN.CL.6
C ***
C *** Output file: 78 list of ID’s of Urban stations with
C *** homogenization info (text file)
C *** Header line is added in subsequent step
C*********************************************************************
C****
C**** This program combines for each urban station the rural stations
C**** within R=1000km and writes out parameters for broken line
C**** approximations to the difference of urb. and comb.rur series
C****
C**** Input files: units 31,32,…,36
C**** Record 1: I1,INFO(2),…,INFO(8),I1L,TITLE header record
C**** Record 2: IDATA(I1–>I1L),LT,LN,ID,HT,NAME,I2,I2L station 1
C**** Record 3: IDATA(I2–>I2L),LT,LN,ID,HT,NAME,I3,I3L station 2
C**** etc. NAME(31:31)=brightnessIndex 1=dark->3=bright
C**** etc. NAME(32:32)=pop.flag R/S/U rur/sm.town/urban
C**** etc. NAME(34:36)=country code
C****
C**** IDATA(1) refers to year IYRBEG=INFO(6)
C**** IYRM=INFO(4) is the max. length of an input time series,
C****
C**** INFO(1),…,INFO(8) are 4-byte integers,
C**** TITLE is an 80-byte character string,
C**** INFO 2 = KQ (quantity flag, see below)
C**** 3 = MAVG (time avg flag: 1 – 4 DJF – SON, 5 ANN,
C**** 6 MONTHLY, 7 SEAS, 8 – 19 JAN – DEC )
C**** 4 = IYRM (length of each time record)
C**** 5 = IYRM+4 (size of data record length)
C**** 6 = IYRBEG (first year of full time record)
C**** 7 = flag for missing data
C**** 8 = flag for precipitation trace
C****
C**** The combining of rural stations is done as follows:
C**** Stations within Rngbr km of the urban center U contribute
C**** to the mean at U with weight 1.- d/Rngbr (d = distance
C**** between rural and urban station in km). To remove the station
C**** bias, station data are shifted before combining them with the
C**** current mean. The shift is such that the means over the time
C**** period they have in common remains unchanged. If that common
C**** period is less than 20(NCRIT) years, the station is disregarded.
C**** To decrease that chance, stations are combined successively in
C**** order of the length of their time record.
C****
C
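The comment block describes the combining rule: stations are added longest-record-first, each shifted so its mean over the years it shares with the running combination is unchanged, and dropped if the overlap is under NCRIT = 20 years. Here is a rough Python sketch of that rule as I read it; note it uses a simple 50/50 update in the overlap rather than the 1 - d/Rngbr distance weights the Fortran actually applies, and it assumes the caller has already sorted by record length:

```python
def combine_rural(series_list, ncrit=20):
    """series_list: station records as dicts of year -> anomaly,
    pre-sorted longest record first. Returns year -> combined value."""
    combined = dict(series_list[0])
    for s in series_list[1:]:
        overlap = sorted(set(combined) & set(s))
        if len(overlap) < ncrit:
            continue  # too little common period: station disregarded
        # shift the new station so its mean over the common period
        # matches the running combination's mean over that period
        shift = (sum(combined[y] for y in overlap) / len(overlap)
                 - sum(s[y] for y in overlap) / len(overlap))
        for y, v in s.items():
            if y in combined:
                combined[y] = (combined[y] + v + shift) / 2.0
            else:
                combined[y] = v + shift
    return combined
```

Starting with the longest record reduces the chance that a later station fails the 20-year overlap test, which is what the comments mean by combining stations “successively in order of the length of their time record.”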

“An adjusted urban record is defined only if there are at least three rural neighbors for at least two thirds of the period being adjusted. All rural stations within 1000 km are used to calculate the adjustment, with a weight that decreases linearly to zero at distance 1000 km.”

Reads to me that rural data is being used to adjust urban data.

It seems I’m not the only one suggesting that the whole notion is confusing.

It seems that the authors of #1, 2, 7, 9, 10, 12, and 24 are laboring under the same misunderstanding, if it is indeed one.

#22 brings up a good point: altitude. How can temp readings from Denver, CO, the mile-high city, be incorporated into the GAT when air temp drops 3 to 5F per 1000 ft? Is Denver’s air temp adjusted up by over 15F to fit the rest of the US at or near sea level?

#33. OK, got it. In the Christchurch NZ example, the count of rural stations hits 3 in 1951 and the adjusted version begins then. The number of rural stations decreases to 2 in 1989 and 1 in 1991.

It seems to be asymmetrical: the adjusted version starts only when there are 3 rural stations, but it will continue with only 1 rural station. The critical value, as I read it, is defined only over the period from the first year with 3 stations to the last year with 3 stations; in effect it measures the proportion of intermediate years when the number dips below 3, and has no QC check against the unavailability of late rural stations.

I’m sure this is intentional as there is a widespread change in character of the dataset to more urban stations after 1990.

#35. Orland is classified as both “R” and “bright”. In the US, he would adjust it as a “bright” site so that would explain your conundrum. The unlit criterion doesn’t seem to work too badly in the U.S. In effect, the only stations that appear to “matter” for the U.S. trend are the “unlit” ones – which is why Hansen’s US trend is different from NOAA and CRU. I don’t get the sense that his R-criterion in the ROW is working as well.

“Thanks for your explanation, but my question was really whether a lapse rate applied to elevated land is legitimately applied. Lapse rate is defined for open space atmosphere, is it not? But doesn’t the atmosphere somewhat follow the contour of the land? In that regard, is lapse rate applied correctly when it is applied to a land surface that is elevated?”

Well, color me yellow and hit me cross-court. I can only tell you what Hansen does and the general theory behind it. I’m not defending his approach (he does that just fine); I’m just trying to burn down strawmen. You’ll gain no ground if you mischaracterize or misconstrue his positions. (Psst, sometimes I do that for fun just to piss ’em off.)

Now, I have some issues with Hansen’s lapse rate adjustments. He documents two of them. It’s way down the list of things to focus on, BUT knock yourself the hell out. There is tons of stuff to look at. I believe he makes mention of it in his ’99 paper. Bug me more if you can’t find it and I’ll post the quotes. Or just search on CA: my name and St. Helena.

RE: #7 – SF is in Sunset Western Garden Guide’s Climate Zone 17, and LAX is in Climate Zone 24. Death Valley is in, I believe, Zone 12 (would need to double check it). 17 and 24 are actually the two most similar Climate Zones in that they both are heavily marine influenced, Mediterranean and have the warmest winter temps for their latitude bands. In 17 the coastal stratus “feels” colder as do onshore winds. Both are classic California “beach climates.” Both are nearly frost free. 17 generally gets more rain (areal range 15 – 60 in) than 24 (areal range 9 – 40 in). There is a higher incidence of Santa Ana (compressive, dry, warm, offshore wind) outbreaks in 24. Death Valley is like another planet in comparison to either. Hottest warm season temps in CONUS for many days out of the year, zero marine influence and almost no rain or RH.

Does anyone know if R has been formally adopted for use as Open Source Software in a DOD or a DOE project somewhere?

Further research indicates R is used quite extensively in DOD. Therefore, it likely has official standing as O.S.S. software that has been developed and maintained using an acceptable software lifecycle development methodology.

ORLAND.. BRIGHT? Well yes, if you Look at Hwy 5 and the crap right around
the exit there, yes, you have lights. Lights but no fricking people.
Oh, I forgot about the Olive stand. 2 people selling Olives. and a dog.
with three legs. and half a dozen fleas.

I wonder if it’s not time for someone here to contemplate auditing the data being produced by NSIDC and Cryosphere Today, especially in light of the changes made earlier this year and just recently to Antarctica (without any valid explanation, except “software glitch”).

Hansen was trained in physics and astronomy in the space science program of Dr. James Van Allen at the University of Iowa. He obtained a B.A. in Physics and Mathematics with highest distinction in 1963, an M.S. in Astronomy in 1965 and a Ph.D. in Physics, in 1967, all three degrees from the University of Iowa. He participated in the NASA graduate traineeship from 1962 to 1966 and, at the same time, between 1965 and 1966, he was a visiting student at the Institute of Astrophysics at the University of Kyoto and in the Department of Astronomy at the University of Tokyo. . .

Field of research and interests

As a college student in the University of Iowa, Hansen was attracted to science and research by James Van Allen’s space science program in the physics and astronomy department. A decade later, he started focusing on planetary research that involved trying to understand the climate change on earth that will result from anthropogenic changes of the atmospheric composition.

One of Hansens research interests is radiative transfer in planetary atmospheres, especially interpreting remote sounding of the earth’s atmosphere and surface from satellites. Such data, appropriately analyzed, may be one of the most effective ways to monitor and study global change on the earth.

Dr. Hansen is also interested in the development and application of global numerical models for the purpose of understanding current climate trends and projecting humans’ potential impacts on climate.

—————————————————–

It sounds like Hansen began to suspect (invent?) AGW in the early ’70s. That’s his story and he’s sticking to it.

I must be reading the wrong input file.
Can anyone point me to, or provide, the correct input file?
Also, there must be other input files to Step 2, which I do not find in Hansen’s release. Can any one provide those?

As a part-timer more able in tactics than in stats, I’m getting confused by the terminology of the various data sources and their adjusted acronyms world-wide.

I’m puzzled by Wellington. Approximately, both the North and South Islands of NZ are 1000 km long. Wellington sits in the middle, so all stations are within 1200 km of Wellington. So is a lot of sea; so is a landform change from mediterranean sea-level plains to cold alpine glacial. East coast climate is rather different from west coast. The detail required to apply altitude lags etc. to even this small country is formidable. The raw climate records go back 100 years in many places. Meta data should be fairly good. There are few large cities, so UHI corrections should be able to be applied in considerable detail. In short, it’s a neat little laboratory country. Nearly every one of perhaps 100 stations should be available as rural.

How about using NZ as a showcase example of how adjustments affect raw data? I see more and more writers seeking this type of comparison.

Some say the world will end in fire,
Some say in ice.
From what I’ve tasted of desire
I hold with those who favor fire.
But if it had to perish twice,
I think I know enough of hate
To say that for destruction ice
Is also great
And would suffice.

It kind of makes sense that someone who is successful at modeling the greenhouse effect on Venus would want to take what he learned there and move on to Earth. The only problem is that there are no phase changes on Venus, which makes it a much, much easier problem. That also means there’s no feedback effect.

Moving on to the Earth is non-trivial, and I think that we can all draw our own conclusions regarding what happened when he tried.

I think he joined William Shockley and Linus Pauling and a long list of others who refused to recognize when they were out of their depth.

Climate Change: Did NASA scientist James Hansen, the global warming alarmist in chief, once believe we were headed for . . . an ice age? An old Washington Post story indicates he did.

On July 9, 1971, the Post published a story headlined “U.S. Scientist Sees New Ice Age Coming.” It told of a prediction by NASA and Columbia University scientist S.I. Rasool. The culprit: man’s use of fossil fuels.

The Post reported that Rasool, writing in Science, argued that in “the next 50 years” fine dust that humans discharge into the atmosphere by burning fossil fuel will screen out so much of the sun’s rays that the Earth’s average temperature could fall by six degrees.

Sustained emissions over five to 10 years, Rasool claimed, “could be sufficient to trigger an ice age.”

Aiding Rasool’s research, the Post reported, was a “computer program developed by Dr. James Hansen,” who was, according to his resume, a Columbia University research associate at the time.

So what about those greenhouse gases that man pumps into the skies? Weren’t they worried about them causing a greenhouse effect that would heat the planet, as Hansen, Al Gore and a host of others so fervently believe today?

“They found no need to worry about the carbon dioxide fuel-burning puts in the atmosphere,” the Post said in the story, which was spotted last week by Washington resident John Lockwood, who was doing research at the Library of Congress and alerted the Washington Times to his finding.

Hansen has some explaining to do. The public deserves to know how he was converted from an apparent believer in a coming ice age who had no worries about greenhouse gas emissions to a global warming fear monger.

This is a man, as Lockwood noted in his message to the Times’ John McCaslin, who has called those skeptical of his global warming theory “court jesters.” We wonder: What choice words did he have for those who were skeptical of the ice age theory in 1971?

People can change their positions based on new information or by taking a closer or more open-minded look at what is already known. There’s nothing wrong with a reversal or modification of views as long as it is arrived at honestly.

But what about political hypocrisy? It’s clear that Hansen is as much a political animal as he is a scientist. Did he switch from one approaching cataclysm to another because he thought it would be easier to sell to the public? Was it a career advancement move or an honest change of heart on science, based on empirical evidence?

If Hansen wants to change positions again, the time is now. With NASA having recently revised historical temperature data that Hansen himself compiled, the door has been opened for him to embrace the ice age projections of the early 1970s.

Looks like the good doctor was in on the Great Coming Ice Age fear mongering of the ’70’s.


My guess is that he is going to push whichever climate agenda would bring climate scientists greater notoriety and influence (not to mention the increases in research grants). He was trying to make a name for himself then and he is trying to make a name for himself now. The one thing I am sure he would never go for is the notion that everything is going to be ok and there really is nothing that people can do about it. Notice that in both cases it was human caused. Humans increasing the particulates in the former, and humans with CO2 in the latter. I don’t believe he would ever go along with the idea that climate varies according to factors outside the control of humans (evidence of past climate variation notwithstanding). He wants influence, plain and simple. If there is little that people can do about it, he can have little influence in their actions (or be of little support to those who might want to have such influence).

You guys are getting a little freaky. A little too much Hansen on your brain. I would recommend getting back to analyzing the data. That is honest work.

In any case, I’m guessing that Hansen doesn’t drive the debate anymore. The Arctic ice extent data is far more influential in the public eye than the surface temperatures these days. A program of auditing the NSIDC seems just as important.

65, OTOH, the Arctic ice is a much easier argument. It has little to do with air temperatures; in other words, it’s not related to global warming, it’s related to oceanic anomalies.

And at the same time that this is happening, Antarctic ice is at record levels, or at least it was until they Hansened the data (did I just coin “Hansen” as a verb?).

Are you the same deech56 that claimed over on Rabbett that SteveMc wrote a post on Tamino?

Why yes, that would be me. Thanks for asking. Yep, my bad. But the points I was trying to make were: 1. Are reports that do not come to the same conclusion as the consensus given the same level of scrutiny around these parts as the ones by Hansen, Mann and that crowd? And 2. What’s the context for all of this?

I still wonder what the purpose of this “auditing” is. The main question should be “Is the temperature of the earth changing?” The second question should be “How do we use the imperfect information we have to find out if the earth’s temperature is changing?”

You can speculate about the motives of Dr. Hansen all you want (talk about a faith-based effort) but these questions loom larger than any of this.

First, generally speaking there are folks who like the details and folks who like the
big picture, or “context” as you described it on Tamino’s site. I see value in both.
You seemed intent on criticizing SteveMc for not discussing Schwartz, and then tried
to say “he had time enough to post on Tamino.” Spend some time here. If you steer
clear of the ad homs you will have some fun.

So, you got the facts wrong. It happens. The other thing you missed (not your fault, because you don’t hang out here)
is the importance Tamino’s discussion of AR(1) has to this community and its history.

Generally speaking, if you look around (like on Unthreaded) you’ll see that discussions of
CO2 and climate sensitivity are put on the back burner. Why? Because SteveMc has been looking for the
right article. He asked Gavin for something. No response.

Another thing. There is a certain proclivity here to test what is accepted. I’ll give you
an example: Parker’s paper on UHI. Everybody cites it, but there are major issues with it.
Same with Peterson’s paper on UHI. Like Gavin says, peer review is necessary but not sufficient.
We like to do the sufficient part.

So, when Schwartz’s paper comes out, and he’s got all these caveats, and it’s quickly rejected
as wrong (Annan was on it pretty fast and pointed out bonehead things), then where’s the fun
in that?

So. On to your reply for some loose ends:

“1. Are reports that do not come to the same conclusion as the consensus
given the same level of scrutiny around these parts as the ones by Hansen, Mann and that crowd?”

Personally, I don’t read much “denialist” stuff and figure RC and company can handle it just fine.
Once in a while folks here have posted stuff of that nature. If the claims are bogus, the
stat guys who dominate here will throw the BS flag.

“2. Whats the context for all of this?”

I think SteveMc put it best when he talked about crossword puzzles. That’s his slant. If
you start with his hockey stick studies you’ll see how it’s all connected: a network
of ideas.

1. Misuse of statistics by people who won’t consult experts (Mann).
2. The refusal to share data and methods.
3. The temperature record that Mann used: its data and methods.
4. Other proxies.

So it grows from that seed. Not so hard to understand. The CO2 stuff becomes
interesting if there is a neat statistical issue.

“I still wonder what the purpose of this ‘auditing’ is.”

Why do you balance your checkbook?

“The main question should be ‘Is the temperature of the earth changing?’”

Of course it’s changing. Changing as we speak. You mean to ask (detail thing
again): is there a temperature trend that is increasing, increasing
in a way that is abnormal, and how confident can we be in this?

“The second question should be ‘How do we use the imperfect information we have to find out
if the earth’s temperature is changing?’”

Well, first we ask people who are experts in imperfect information. Dr. Hansen
is not a professional statistician. His programmers are not software development
experts. He is not an economics expert. So, before we decide that we MUST
use this imperfect information, we characterize it. We audit it. We see how
much we know, and how well we know it.

“You can speculate about the motives of Dr. Hansen all you want (talk about a faith-based effort) but these questions loom larger than any of this.”

Re: 65 and 66
Erik
In general terms, I agree with your first observation and suggestion.
I have also made the point that Larry highlights in #66 more than once on different blogs. Based on the best available data and understanding of the cyclical nature of what’s going on in the Arctic, the ratio [in terms of actual effect] of ocean water temps vs. air temps/albedo, etc., is as high as 80:20. With the onset of a La Niña in the Pacific basin and the Atlantic due to shift from warmer to colder, it will be most interesting indeed to see how long the ice anomalies in the Arctic persist. Meanwhile, the mainstream media are studiously avoiding the fact that Antarctica is refusing to follow suit and is not warming.

OK, I am embarrassed to ask this question, but where is the raw data? I have some algorithms developed for examining time-series data for anomalies that could be applied to this data, I think. Another question: does the data (raw or adjusted, I don’t care which for now) contain 365 days, twice daily? If the data is in a binary format, is it documented anywhere?

“The huge warming of the Arctic, which started in the early 1920s and lasted for almost two decades, is one of the most spectacular climate events of the 20th century. During the peak period of 1930-1940, the annually averaged temperature anomaly from the area 60°N-90°N amounted to around 1.7°C.”

“Rapeseed and maize biodiesels were calculated to produce up to 70 per cent and 50 per cent more greenhouse gases respectively than fossil fuels. The concerns were raised over the levels of emissions of nitrous oxide, which is 296 times more powerful as a greenhouse gas than carbon dioxide. Scientists found that the use of biofuels released twice as much nitrous oxide as previously realised. The research team found that 3 to 5 per cent of the nitrogen in fertiliser was converted and emitted.”

Thanks to Gore and Hansen, food is going to be a lot more expensive because they want us to burn corn … and it will produce more greenhouse gases.

If the graph in #66 is to be believed, the anomalies are now 5 degrees C. This seems to be off the chart in terms of excursions from normal temperature. Beats looking for .5 degree anomalies with .2 degree errors.

Are you saying that the two anomaly values, now and in the 1930s, are similar? I don’t think so, and I don’t think the Northwest Passage was freely open in the 1930s as it is now. At least no successful voyages were reported.

I’ve been trying to promulgate the viewpoint in this blog that the signal for warming for both ocean and land temperatures in the Arctic is indisputable and unprecedented in more than 150 years. I challenge the audit team to prove me wrong. Going after the temperate zone data just seems painfully difficult in comparison – all those damn stations and their UHI problems.

Another bonus is that you don’t have to worry about Hansen. This isn’t his area of expertise.

#77. On an earlier occasion, http://www.climateaudit.org/?p=1266 , I looked at Hansen’s 64N-90N compilation. Levels from the 1930s were very high. He had made 1937-38 cooler in the past 10 years, and 2005 was the first year that had exceeded levels in the 1930s. Can someone check to see whether NOAA had data in the 1930s?

Yes. Nitrous oxide is actually quite stable, unlike NO and NO2. As a ghg, the strong absorption at 7.8 micrometers is most significant. It is a good oxidizer at high temperature. For elements with stable oxides, a nitrous oxide/acetylene flame is the preferred choice for flame atomic absorption/emission spectrometry. Of course, almost everybody uses plasma emission now.

The area of ice in the Greenland Sea in April-August of 1921-1939 was 15-20% less than in 1898-1920 (data of Karelin).

In the Barents Sea the area of ice was 12% less in 1920-1933 than in 1898-1920 (data of Zubov).

Vise pointed out that since 1929 the south part of the Kara Sea in September was free of ice, while in 1869-1928 the possibility of meeting ice there in September was about 30%.

The polar ice very often came close to the coast of Iceland in the last century and in the beginning of this century. During 1915-1940 the situation changed: no ice was observed in that region; negligible amounts of polar ice were noticed there only in 1929.

The thickness of ice determined during the Fram cruise was 655 cm; during the Sedov cruise it decreased to 220 cm (the reason for this was more intensive summer melting of ice).

Before the Arctic warming, the strait of Jugorsky Shar froze near the 24th of November, but in 1920-1937 it became frozen two months later, in January.”

“The sailing conditions in the Arctic region became much more favorable in 1920-1940. This can be proved by the following cruises:

The severe conditions of navigation in previous years can be proved by the following cruises:

* In 1912, the ship Foka, a member of the Sedov expedition, could not reach Franz-Joseph Land.
* In 1912, the ship St. Anna, a member of the Brusilov expedition, was trapped in ice near Yamal and carried out with the ice to the central Arctic.
* In 1901, the icebreaker Ermak failed to double Novaya Zemlya.”

#74, Erik Ramberg said: “I don’t think so, and I don’t think the Northwest Passage was freely open in the 1930s as it is now. At least no successful voyages were reported.”

Well, who knows. The absence of passages is not evidence that it was unpassable. It was possibly more an indicator that the passage
had been tried and reported to be somewhat challenging. Maybe the world was experiencing something that made the challenge of the
unknowns in the Arctic less tantalising than before? Some kind of… recession… perhaps? Who knows.

In 1940, the Passage was navigated successfully by the St. Roch, a Canadian vessel. No easy task for sure, but still passable.

Still, it can’t be taken as evidence that the icecover was larger or smaller than today. Just that someone actually made the passage.

As I gather from scarce news reports, even today the Passage is not a very pleasant or easy journey to make. Although there may be
less ice, the waters hold other challenges as well.

#66 Basically the cooling waters of the oceans, even
in the NE Atlantic and outside the Iberian peninsula etc.,
are pushing the warm water north!? If the Arctic had
been a land mass just like Antarctica, things
would have been somewhat different. Remember though that
the Vostok cold record is from 1983 and the following years
were among the coldest in NW Europe for decades. This winter
of course “only” -81°C as compared to -89°C in 1983 at Vostok!
Nothing repeats exactly, of course… Here in Sweden,
though, the Jan and Feb 1990 “heat” has not been exceeded since.
Even the “Wasalopp” was cancelled…

Global mean surface temperature has increased since the late 19th century. The warming occurred largely during two periods: 1920-1940, and since the mid-1970s. Although most recent studies have focused on the latter period, it is of interest to analyse the earlier period and compare its major features to the recent warming episode. The warming during 1920-1940 occurred most rapidly during the 1920s. It was strongest at high northern latitudes in winter, a pattern now believed to be characteristic of greenhouse warming. This warming of the Arctic was much discussed during the 1930s and 1940s, but the data available at that time were mostly derived from land areas.

Hello all: The codes and the maths leave me a little dizzy. I’m confident with the politics though.
It seems clear to me the advocates for GW do tend to fall into the left camp, more so than skeptics falling into the right-wing category.
The stuff about Hansen being an ice-ager does not surprise me. Why? Well, there seems to be a type of person from the developed world, a spoilt, very well brought up sort of person with an affluent background, who just seems to hate what their societies seemingly stand for. They then join the left. The left like to associate with any other group that proclaims to hate the developed West. This philosophy leads to a self-loathing, and as a result all history is distorted into a conspiracy of capitalism. Of course, to maintain this conspiracy the left only use the last 200-300 years. It is totally lost on them that humans have only recently in historical terms, in the last 100 years, seriously developed into the world the left now take for granted. Hence Hansen is of that ilk: he hates the West and looks for any angle to show that the West is wicked and is going to destroy the planet that we live on. You only have to look at his grandiose statements like “stewardship of the planet” to see his feelings of self-importance. Not science per se, but many do have an axe to grind one way or the other; it’s a driving-force factor.

1) Steve, do feel free to pull this post if it seems detrimental to the sterling work you, Anthony and others have done so far.
2) Is the aim eventually to see if the Hansen codes were biased?
3) The fact that many of the skeptic sources are not seemingly giving you due credit by at least having the courtesy to highlight your hard work does seem rather mean.

RE: “The polar ice very often came close to the coast of Iceland in the last century and in the beginning of this century. During 1915-1940 the situation changed: no ice was observed in that region; negligible amounts of polar ice were noticed there only in 1929.”

Notably, the renewed occurrence of ice bridges between Iceland and the main sea ice mass during winter and spring has been observed since the turn of the century. Sales of arms and ammo meant for dangerous game have gone through the roof. Icelanders are not very worried lately about the demise of the polar bears.

Last I heard, the current shipping season is about 100 days. Yet more evidence that it was quite warm in the Arctic from 1920 to 1940. A cyclic shift in ocean currents still seems the simplest explanation for the loss of ice then and now.

In 1942 the St. Roch made a return voyage (Halifax to Vancouver) through the Northwest Passage. This trip took a more northerly route than the first one. Very little ice was observed and the trip took only 86 days.

This anecdote does show that at least in the summer of 1942 the Canadian Arctic was free of any significant amount of ice.

On the basis of half a dozen years, the jump before and after the winter of 1918/19 is about 8°C. Comparing only January/February of 1917 and 1918 with January/February of 1919 and 1920, the temperature jump is almost plus 10°C.

I bought the article that S.I. Rasool and Stephen Schneider published in Science in 1971. That article and the Washington Post story were published on the same date, July 9, 1971, which indicates that the Post reporter had received the article several days earlier and had already written his story, but under an embargo agreement could not publish it until it appeared in Science. It’s a routine practice with press releases. That gave the reporter time to interview Rasool, who gave him some dramatic quotes.

Rasool and Schneider argued that an 8-fold increase in CO2 would only increase global temperature by 2°C, because the temperature response to increasing CO2 is logarithmic, that is, each successive increment of CO2 has less effect than the one that preceded it, so the temperature response curve becomes progressively flatter. In contrast, they argued, the cooling effect of aerosols produced by industrial activity increases exponentially as aerosol concentration increases; that is, the response curve gets progressively steeper. Rasool and Schneider wrote,

An increase by a factor of 4 in the equilibrium dust concentration in the global atmosphere, which cannot be ruled out as a possibility within the next century, could decrease the mean surface temperature by as much as 3.5°K. If sustained over a period of several years, such a temperature decrease could be sufficient to trigger an ice age!

The exclamation mark is in the original. This is the only time I have ever seen it used in a peer-reviewed scientific journal article. I’m surprised the reviewers let them get away with it, but then this is Science, a publication fond of climate-scare papers and prone to hype them when they come along. No wonder the Post reporter was interested.
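The logarithmic shape Rasool and Schneider invoked is easy to illustrate. The sketch below uses the modern simplified forcing formula ΔF = 5.35 ln(C/C0) W/m² and an assumed round-number sensitivity of 0.3 K per W/m², neither of which comes from their 1971 paper; the point is only that each successive doubling of CO2 adds the same fixed increment, so an 8-fold increase buys just three doublings’ worth of warming, not eight:

```python
import math

def co2_forcing(c_ppm, c0_ppm=300.0):
    # Simplified logarithmic forcing: delta-F = 5.35 * ln(C/C0) W/m^2.
    # This is the modern Myhre et al. (1998) approximation, assumed here
    # purely to show the shape of the curve; Rasool and Schneider's own
    # 1971 radiative model differed in detail.
    return 5.35 * math.log(c_ppm / c0_ppm)

# Assumed round-number climate sensitivity parameter (K per W/m^2),
# chosen for illustration only.
LAM = 0.3

for factor in (2, 4, 8):
    f = co2_forcing(factor * 300.0)
    print(f"{factor}x CO2: forcing {f:.2f} W/m^2, dT ~ {LAM * f:.2f} K")
```

Because ln(8) = 3 ln(2), the warming at 8x CO2 is exactly three times the warming at 2x, which is why the response curve flattens as concentration rises.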

The Science article shows that James E. Hansen not only developed the climate model Rasool and Schneider used to show that the cooling effect of aerosols could trigger a new ice age, he participated directly in their study. His contribution is acknowledged in footnote 16:

16. J. E. Hansen, personal communication. We are indebted to Dr. Hansen for making these Mie scattering calculations for us, for suggesting the use of the two-stream approximation, and for checking the fluxes obtained by the two-stream approximation against some exact solutions (which agree to within about 5 percent) to the multiple scattering problem….

It’s commonly said by global warmists that the Ice Age scare of the 1970s was a media concoction, limited to magazines like Newsweek and Time and not taken seriously by scientists. The Science article shows that it was taken seriously by some of today’s most prominent global warming alarmists.

Temperatures for grid cells that have no weather stations are “calculated” by reference to cells that do, and temperatures in urban heat islands are “adjusted” by reference to cells up to 1000 km away. Don’t these maneuvers have the same effect statistically as reducing the sample size?
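The commenter’s intuition can be made concrete with the textbook formula for the effective sample size of N equally cross-correlated estimates, N_eff = N / (1 + (N-1)ρ). This is a generic statistical sketch, not a description of how GISS actually weights cells:

```python
def effective_n(n, rho):
    # Effective number of independent samples when all n series share a
    # common pairwise correlation rho (0 = fully independent, 1 = every
    # series is a duplicate of the others).
    return n / (1 + (n - 1) * rho)

# 100 grid cells, made increasingly correlated by interpolation/adjustment:
for rho in (0.0, 0.2, 0.5, 0.9):
    print(f"rho = {rho}: N_eff = {effective_n(100, rho):.1f}")
```

Even modest correlation collapses the effective sample size: at ρ = 0.2 the 100 cells carry the information of roughly 5 independent ones, which is the sense in which filling empty cells from distant neighbors “reduces the sample size.”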

RE: #96 – I strongly disagree with the “chunk breaking off” idea. The main pack in fact reached Iceland this past season. Do note that Cryosphere Today underreports the extent of any ice that is less than 60 or 70% concentration and also has significant underreporting problems at the ice edge. These issues are artifacts of reliance on the flawed passive microwave remote sensing techniques used by the raw data (satellite) collection network used at NSIDC (which is where CT gets its feed, although CT’s interpretation algorithms are claimed to be different from NSIDC’s).

Rasool and Schneider said in their Science article that their parameters “are described further by Hansen and Pollack (17).” Footnote 17 references a paper by J.E. Hansen and J.B. Pollack titled “Near-Infrared Light Scattering by Terrestrial Clouds” that was published in the Journal of the Atmospheric Sciences in March 1970. That is probably the Hansen model Rasool was referring to in his statements to the Washington Post reporter. The abstract is here.

This has probably been said before, but it’s worth repeating. The Rasool and Schneider paper looked at the effect of quadrupling aerosols and found that it would cause severe cooling. For their analysis they used a program created by Hansen for analysis of clouds on Venus.

What part of that is controversial?
Do aerosols cause cooling? Yes.
Can software be written for one purpose and used for another? Yes.

Dear Sir,
I need a script to find out/measure the Urban Heat Island effect over Karachi, Pakistan. My data is on a monthly and annual basis. I am using the mean monthly, mean maximum and mean minimum temperatures of Karachi for two observatories: one located in the city centre (Airport) and a second outside the city (about 30 km from the city centre).
I shall be thankful if someone can guide me on how to make the script in Linux for Fortran 90…
best regards
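For what it’s worth, the two-station comparison described above reduces to subtracting the rural monthly series from the urban one. A minimal Python sketch (the temperature values are made-up placeholders, not Karachi data, and this is Python rather than the requested Fortran 90, but the arithmetic is identical):

```python
# Hypothetical monthly mean temperatures (deg C) for the two observatories;
# substitute the actual airport (urban) and out-of-town (rural) records.
urban = [18.2, 20.1, 24.5, 28.9, 31.2, 31.8, 30.4, 29.5, 29.8, 28.7, 24.3, 19.9]
rural = [17.1, 19.0, 23.6, 27.8, 30.0, 30.9, 29.6, 28.8, 28.9, 27.4, 22.9, 18.6]

def uhi_intensity(urban, rural):
    # Monthly urban-minus-rural differences and their annual mean:
    # the simplest two-station estimate of urban heat island intensity.
    diffs = [u - r for u, r in zip(urban, rural)]
    return diffs, sum(diffs) / len(diffs)

monthly, annual = uhi_intensity(urban, rural)
print(f"annual mean UHI intensity: {annual:.2f} deg C")
```

The same loop translates line for line into Fortran 90; the interesting work is in quality control of the two records (missing months, instrument changes), not the subtraction itself.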