Infrastructure encroachment at a NOAA Climate Reference Network (USCRN) site makes the data warmer

Poorly sited stations in the U.S. Historical Climatology Network (USHCN) have been documented extensively at WUWT in many posts and by the Surface Stations project Anthony started over a decade ago.

The USHCN is a 1,219 station subset of the much larger NOAA Cooperative Observer Program (COOP) Network that was selected for baseline monitoring of long-term temperature trends in the contiguous United States. See https://www.ncdc.noaa.gov/ushcn/introduction for more information.

Between 2007 and 2012 dozens of Surface Stations project http://www.surfacestations.org/ volunteers surveyed 82.5% of the network to document just how poorly it conformed to specifications. The effort resulted in publication (Fall, et al. 2011) of a strong critique of siting deficiencies that bias reported temperatures and call into question the suitability of USHCN data for climate research.

Even with so many poorly sited stations, is it possible to use the well-sited ones to evaluate the temperature record? A comprehensive analysis of the environment immediately around the sensors using on-site station photography, satellite and aerial imaging, street level Google Earth imagery, and detailed evaluation of station metadata found only 410 USHCN stations with minimal artificial influences during the 1979-2008 period. The study by Anthony Watts and Evan Jones of surfacestations.org, John Nielsen-Gammon of Texas A&M, and John R. Christy of the University of Alabama – Huntsville was presented at the 2015 AGU meeting in San Francisco, CA. Among other conclusions, the data suggest

“that the divergence between well and poorly sited stations is gradual, not a result of spurious step change due to poor metadata.”

Upon the realization that the quality of essential data was suspect even before Surface Stations made it glaringly obvious, NOAA undertook an effort to build a network that would be contaminated minimally by microsite influences. Thorne et al. (2018) gives a brief introduction to the history and challenges of meteorological observations at ground stations. It also proposes a conceptual framework for a truly fiducial (i.e., trustworthy) network, saying it “would provide measurements that are meteorologically traceable, with full metadata. It would not only provide unambiguous high-quality time series but serve to validate and enhance trust in the quality of the other networks…”

The U.S. Climate Reference Network (USCRN), deployed beginning in 2002, is given as an example of a national system of well-sited stations with modern, redundant, and carefully-maintained data collection equipment. https://www.ncdc.noaa.gov/crn/. The network consists of 114 stations distributed throughout the contiguous United States. Seven sites have paired stations for providing local redundancy; the remaining 100 are single station sites.

The goal is to provide the best data possible for answering a question in the future: how has the climate changed over the decades? For reliability, sites were carefully selected in open rural areas likely to remain free from human development and vegetation encroachment for many years. Photographs of the stations can be found at www.ncdc.noaa.gov/crn/photos.html.

Diamond, et al. (2013) describes the USCRN in a report after ten years of operation.

“USCRN has historically maintained a data record availability exceeding 99.8% annually since 2007; this equates to missing about 18 hourly observations of data per station per year: many of which occur during the AMVs” (Annual Maintenance Visit).

Some instruments have triple redundancy so that problems with individual units can be discovered by comparison and resolved quickly. Hourly data are transmitted from each station via GOES satellites to the National Climatic Data Center (NCDC) for processing and eventual public release.

Paired stations

Early data from five of the seven paired sites was examined by Gallo (2005) for station representativeness. He found significant differences

“between the paired stations in the annual minimum, maximum, and mean temperatures for all five pairs of stations” and concluded that “results suggest that microclimate influences on temperatures observed at nearby … stations are potentially much greater than influences that might be due to latitude or elevation differences between the stations.”

One paired set Gallo did not look at is in Kingston, Rhode Island. The two stations stand 1.4 km apart with only about a 3-meter difference in elevation. Plains Road is situated closer to developed land on the campus of the University of Rhode Island — mostly athletic facilities and sports fields — near small agricultural test plots and turf fields. The installation sits about 50 meters north of an existing USHCN station established at this location in 1971, no doubt to run the two in parallel for cross-checking. Peckham Farm lies to the south in open fields surrounded by woodlands. Both stations would have rated a 1 or 2 on the CRN siting criteria scale at installation.

As the Surface Stations project volunteer who documented the old USHCN station, I was curious about what the new instrumentation was recording; but more than that, concerned that development might be encroaching on a “pristine” site. Over the years, the University has expanded development in the direction of the Plains Road site as the following sequence of dated Google Earth aerial images illustrates.

Plains road on 04/04/2001 before installation of the USCRN station. The old USHCN station is located near the buildings slightly left of center. Note the otherwise open fields around the site.

The recently installed Plains Road station is indicated by the yellow arrow in this image from 04/29/2003. From the air temperature sensor array it is 55 meters west to a dirt road, 48 meters southeast to a greenhouse, 95 meters southwest to nearest pavement (small parking area), and 145 meters south to nearest paved road. Note the recent construction of large parking lots 212 meters to the east in what was open fields.

By 07/28/2007 the parking lot had been extended to within 182 meters of the station.

By 04/02/2012 three new greenhouses had been constructed only 23 meters west of the instrument arrays.

Even more construction is evident in this image from 09/11/2014. A new road comes within 68 meters and another parking lot extension lies only 93 meters away. The lone greenhouse was removed to make way for the road, but the other three remain.

In contrast, Peckham Farm has remained unchanged since installation, as represented in this image from 09/11/2014. There is a narrow dirt access road about 20 meters away on the west side. Distance to the trees is 126 meters to the east.

Looking at the data

The primary criteria for siting USCRN stations are 1) spatial representativeness of the regional climate, and 2) temporal stability, with the likelihood the site will not experience major change, especially from human encroachment. Peckham Farm fits these criteria quite well, but Plains Road fails on the second point. Documented encroachment of a type known to affect temperature measurement (paved surfaces) has already occurred.

Does this mean Plains Road should be moved? Not necessarily; there may be a blessing in disguise. The situation now provides an excellent opportunity to evaluate the effects of encroachment. After sixteen years of data collection we probably have enough information for a reasonable evaluation.

To its credit, NCDC makes the data for all USCRN stations publicly available at https://www.ncdc.noaa.gov/crn/qcdatasets.html, so I downloaded files for both stations from January 1, 2002 through December 31, 2017 (see also the spreadsheet below).

Data is collected every hour as temperatures rise and fall throughout the day. For this analysis I’m looking at the monthly averages of daily mean, maximum, and minimum temperatures. The README.TXT file says mean air temperature in degrees C is calculated using the traditional formula of (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2. According to standards,

“Monthly maximum/minimum/average temperatures are the average of all available daily max/min/averages. To be considered valid, there must be fewer than 4 consecutive daily values missing, and no more than 5 total [daily] values missing.”
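The quoted rules are simple enough to sketch in code. The following is a minimal illustration, not the actual NCDC processing: the helper names and the list-of-daily-values layout are my own, with missing days represented as `None`.

```python
# Sketch of the monthly-mean rules quoted above. Helper names and the
# list-of-daily-values representation are illustrative, not the actual
# USCRN file layout; a missing daily value is None.

def monthly_mean(daily_values):
    """Average the available daily values; return None (invalid month)
    if 4 or more consecutive, or more than 5 total, daily values are missing."""
    missing_total = sum(1 for v in daily_values if v is None)
    run = longest = 0
    for v in daily_values:                 # longest run of missing days
        run = run + 1 if v is None else 0
        longest = max(longest, run)
    if missing_total > 5 or longest >= 4:
        return None                        # month fails the validity rule
    present = [v for v in daily_values if v is not None]
    return sum(present) / len(present)

def monthly_tavg(t_monthly_max, t_monthly_min):
    """Traditional mean: (T_MONTHLY_MAX + T_MONTHLY_MIN) / 2."""
    return (t_monthly_max + t_monthly_min) / 2
```

Applied to each month of daily maxima and minima, these two rules reproduce the monthly values used in the rest of the analysis.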

Because readings are taken hourly there is no concern about time-of-observation adjustments.

I calculated the simple difference between stations (Plains minus Peckham) and plotted it against time, omitting months with either station having a missing value. A plot of the mean temperature difference between sites is quite revealing. First, the vertical axis shows that overall Plains Road was a bit warmer than Peckham Farm as indicated by most values being on the positive side of zero. This might be expected because of proximity to artificial structures at Plains Road.
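The bookkeeping behind that difference series is straightforward; here is a sketch with made-up values, where the dictionaries keyed by (year, month) are my own illustrative representation rather than the NCDC file format:

```python
# Plains minus Peckham monthly differences, dropping months where either
# station has a missing value (None). Keys and values are illustrative.

def station_difference(plains, peckham):
    diffs = {}
    for key in sorted(set(plains) & set(peckham)):
        a, b = plains[key], peckham[key]
        if a is not None and b is not None:
            diffs[key] = round(a - b, 3)   # rounded to keep floats tidy
    return diffs

plains  = {(2006, 1): 0.5, (2006, 2): -1.2, (2006, 3): None}
peckham = {(2006, 1): 0.3, (2006, 2): -1.4, (2006, 3): 2.0}
diffs = station_difference(plains, peckham)  # March dropped: Plains missing
```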

The next thing to observe is that data from the initial three years (2002-2005) is problematic. It’s quite spotty and erratic compared to what comes later. 2005 is especially strange. Closer examination of the daily and hourly readings reveals some clearly erroneous numbers where the sensors became unstable. Why they remain in the NCDC data sets is unclear, but fortunately, we can exclude anything before 2006 without affecting the rest of the analysis.

Monthly mean temperature differences from 2006 through part of 2013 (blue) behave better. Most lie in a range of about 0.5 degrees C. There is some suggestion of an annual cycle but not too much should be made of it. The dotted blue line marks the average for the period and indicates that Plains Road was about 0.13 degrees warmer than Peckham Farm. Note that this is not a trend line which would be sensitive to endpoints; it’s merely the average over the eight years.

Around May 2013 the data (red) suddenly step up and consistently stay about 0.2 degrees more positive than in the previous period. Plains Road experienced a warming effect that Peckham Farm did not. The red dotted line marks the average value (0.34 degrees) of the latest period.
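A minimal way to quantify such a step is to average the difference series on each side of a candidate break date. The sketch below uses synthetic values, not the actual Kingston numbers, and a hypothetical (2013, 5) break point:

```python
# Mean of a monthly difference series before and after a candidate step
# date; the data values here are synthetic, chosen only for illustration.

def period_means(diffs, step_key):
    """diffs maps (year, month) -> difference; split at step_key."""
    before = [v for k, v in diffs.items() if k < step_key]
    after  = [v for k, v in diffs.items() if k >= step_key]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(before), mean(after)

diffs = {(2012, m): 0.13 for m in range(1, 13)}
diffs.update({(2014, m): 0.34 for m in range(1, 13)})
pre, post = period_means(diffs, (2013, 5))
step = round(post - pre, 2)   # size of the apparent step
```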

What happened to cause the jump? The prime candidate is a road and parking lot extension shown in the 9/11/2014 aerial image. It appears that this occurred in summer 2013, dramatically altering microsite conditions. Earlier parking lot construction doesn’t seem to have produced discernible changes in the data, probably because of distance beyond the 100 m siting standard. The newest lot and road now lie within that boundary and are causing an effect.

Just how is the encroachment influencing temperatures? There is information to answer that question. A plot of differences in the monthly average maximum temperatures shows no change across the May 2013 step that appears in the mean temperature plot. After excluding the suspect data before 2006, neither site is warmer than the other.

Not so with minimum monthly temperatures where Plains Road measures about 0.35 degrees warmer in the latest period. Since minimums usually occur at night, it’s reasonable to suspect that the encroaching blacktop at Plains Road is retaining the day’s heat into the evenings and artificially raising the numbers. This finding is consistent with observations at so many of the USHCN sites.

Conclusion

In spite of painstaking efforts to site USCRN stations to be representative of regional climatic conditions with small chance of contamination by local factors either at construction time or in the future, encroachment is always a potential threat. The paired stations at Kingston, RI, provide a clear example of even best intentions being confounded.

Yet it would be unfair to dismiss the Kingston USCRN paired stations as a failure. Actually, the site provides a unique opportunity to investigate in detail the effects of encroachment. Peckham Farm still is unaffected by development and can serve as the source of regional data as intended. Plains Road can be monitored carefully and analyzed using daily and hourly data to understand better the subtle effects of local contamination.

NCDC would do well to conduct a thorough evaluation and document what has taken place. The possibility of encroachment at the 106 other sites remains very real. Vigilance is required to keep the USCRN from suffering the kinds of degradation that have occurred at many USHCN sites. A compromised data set will not serve the purpose for which the USCRN was built – the recording of accurate and reliable information necessary for climate research.

63 thoughts on “Infrastructure encroachment at a NOAA Climate Reference (USCRN) site makes the data warmer”

Those Rhode Island sites are a reasonable way to quantify siting and UHI effects. The interesting thing is that the magnitude of the siting effect is about the same as the purported climate change signal one is trying to measure.

If one assumes that atmospheric CO2 affects ground level temperature, wouldn’t a first test be atmospheric temperature at many levels as seen by satellite or balloon measurements? Seems those would have fewer confounding influences like those described. Then one would think that the influence of CO2 at ground level would be limited by atmospheric temperature changes. If temperature changes at ground level significantly exceed those in the atmosphere, that would indicate another source of the difference. Or, is this just naïve?

According to GH theory as applied to the climate models, the resulting outputs predict stronger warming in the mid-troposphere. A warming that reaches down to ground level through a slowed loss of heat, thus raising surface temps as well.

A distortion of the atmospheric temperature profile as predicted by the infrared cooling models used in climate models should be measurable by analysis of balloon data. Has this been done? Yes. What was the result? The result was that no distortion of the atmospheric temperature profile has been found. https://globalwarmingsolved.com/2013/11/summary-the-physics-of-the-earths-atmosphere-papers-1-3/
“… we should be finding a very complex temperature profile, which is strongly dependent on the infrared-active gases. Instead, we found the temperature profile was completely independent of the infrared-active gases.”
There is no greenhouse effect. It’s a fiction that has never been tested and an elaborate fantasy has been spun out of a series of baseless assumptions. How? Because no one bothered to do physical experiments to test any of the assumptions, and to this day no one is bothering to actually test what we “know” to be true. So what do we really know?
Fallback position: radiative physics and MODTRAN are telling us something. But we take this data and then make assumptions from it, and still we refuse to test those assumptions except through even more modelling. Is what we assume to be true really true? How would we know? What is the scientific method? Are we interested in virtual reality or in physical reality?

”A compromised data set will not serve the purpose for which the USCRN was built – the recording of accurate and reliable information necessary for climate research.”

Au contraire, Monsieur Boden. It serves the real purpose of alarmism quite well.
The informed alarmists understand human nature… that is that every generation wants to believe they are exceptional by living in exceptional times.
CRN – a network upon which otherwise dubious homogenizations cannot be applied…So…
an insidious UHI-contaminated CRN serves that purpose

It’s hard to believe that this is not some kind of instrument aberration – it’s so sudden and constant.
If the data is hourly, and the hypothesis is that the road/blacktop is causing the difference, then one could plot sunset to sunset+6 hours, and then sunset+6 hours to sunrise, and compare their deltas. If the blacktop is the culprit, the prediction is that the effect is higher after sunset (the blacktop should radiate most of its heat in the first half of the night) and less pronounced before sunrise.
This analysis either supports your hypothesis or it does not. If the delta is more or less constant, then something else is to blame.
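The proposed test is easy to sketch. Everything below is hypothetical: fixed sunset/sunrise hours stand in for real ephemeris times, and the hour-keyed differences stand in for the actual USCRN hourly files.

```python
# Split the night into early (sunset to sunset+6h) and late (sunset+6h to
# sunrise) windows and compare mean station differences in each window.
# A larger early-night mean would be consistent with blacktop releasing
# stored heat; a roughly constant delta would point elsewhere.

def night_window_means(hourly_diff, sunset_hr=18, sunrise_hr=6):
    night_len = (sunrise_hr - sunset_hr) % 24   # hours of darkness
    early, late = [], []
    for h, d in hourly_diff.items():
        rel = (h - sunset_hr) % 24              # hours since sunset
        if rel < min(6, night_len):
            early.append(d)
        elif rel < night_len:
            late.append(d)
    mean = lambda xs: sum(xs) / len(xs) if xs else None
    return mean(early), mean(late)

# Synthetic example: bigger difference just after sunset than before sunrise.
hourly_diff = {h: 0.5 for h in range(18, 24)}       # early night
hourly_diff.update({h: 0.2 for h in range(0, 6)})   # late night
hourly_diff.update({h: 0.0 for h in range(6, 18)})  # daytime, ignored
early, late = night_window_means(hourly_diff)
```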

One could simply consult construction records at the county clerk’s office (permits pulled by the contractor) to ascertain actual parking lot and greenhouse construction dates to see the direct effect on diurnal, seasonal, and annual averages.

Robert, hourly data are available for download. Take a shot at it. There’s more that can be gleaned. I think instrumentation was a spotty problem in the years before 2006 that now has been cleaned up. Temperatures are measured by three instruments and compared for anomalies. Because of this redundancy and frequent scheduled maintenance, the numbers should be reliable.

Gary – think you need to nail down the exact date of the parking lot expansion, or other works at the time. Should be easy from the records or, if not, from ex URI alumni. If they coincide, the T-min graph is impressive. If not, further research needed. But a nice presentation – thanks

Mothcatcher, since this is monthly average data, it takes a couple of years to be sure of a step change, and the parking lot is permanent, I didn’t think it was necessary to pin down the exact construction date to make the point.

Has Gary somehow missed the fact that the data were analyzed by NOAA in light of the Surface Stations work, or neglected to mention it because of its effect on his argument? Somehow I don’t think “essential data was suspect” is the reason NOAA has given for new stations – that is the author’s assumption. https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/2009JD013094
Abstract:
“Recent photographic documentation of poor siting conditions at stations in the U.S.
Historical Climatology Network (USHCN) has led to questions regarding the reliability
of surface temperature trends over the conterminous United States (CONUS). To evaluate
the potential impact of poor siting/instrument exposure on CONUS temperatures, trends
derived from poor and well sited USHCN stations were compared. Results indicate that there
is a mean bias associated with poor exposure sites relative to good exposure sites; however,
this bias is consistent with previously documented changes associated with the widespread
conversion to electronic sensors in the USHCN during the last 25 years. Moreover, the
sign of the bias is counterintuitive to photographic documentation of poor exposure because
associated instrument changes have led to an artificial negative (“cool”) bias in maximum
temperatures and only a slight positive (“warm”) bias in minimum temperatures. These
results underscore the need to consider all changes in observation practice when determining
the impacts of siting irregularities. Further, the influence of nonstandard siting on
temperature trends can only be quantified through an analysis of the data. Adjustments
applied to USHCN Version 2 data largely account for the impact of instrument and siting
changes, although a small overall residual negative (“cool”) bias appears to remain in the
adjusted maximum temperature series. Nevertheless, the adjusted USHCN temperatures
are extremely well aligned with recent measurements from instruments whose exposure
characteristics meet the highest standards for climate monitoring. In summary, we find no
evidence that the CONUS average temperature trends are inflated due to poor station siting.”

My point is that he completely failed to mention it, when it is important to the big picture of data quality. It showed that NOAA does respond to “alternative” data sources, is concerned about the quality of data, and that their analysis contradicted some of the conclusions made in the Fall paper. Their emphasis on the importance of data analysis rather than simple arithmetic and descriptive visual aids is also noteworthy. If NOAA hasn’t published anything about the more recent data, that does not mean they aren’t aware of differences like those in the graphs.
I’m not saying it’s wrong to bring such differences to others’ attention, but when doing so it behooves a scientist (or a layman writing about science) to discuss the relevant literature adequately and to do a proper analysis of multiple sites if one wants to make any generalizations. Anomalies are more often used for analyses of trends, and it would be interesting to see whether using them changed the picture any.

No. Plotting data, drawing lines and making assumptions about cause and effect is not enough to make any generalizations about nonstandard siting.
“If the government says the data is good, then it is good. ” I disagree.

“The best-laid plans of mice and men often go awry.” Robert Burns
Time and entropy humble human hubris, it seems!
USCRN was created to resolve the historic problems identified with USHCN.
Yet it appears that the same failures are creeping into the ‘new’ reference sites.
Which brings us to “Those who cannot remember the past are condemned to repeat it”. George Santayana

It will be great to document more such pairwise comparisons where the metadata are good.
For me, a very interesting point is, do we have observations of similar quality where a change of siting/equipment/surroundings produced a DECREASE in temperature? Would love to study some, but have not found any yet. Geoff

Great work by Gary.
This analysis shows the importance of metadata (in this case aerial photographs) in assessing UHI encroachment. Too often people grab low hanging fruit like population data to try and quantify UHI, but nothing beats metadata for the real site conditions.

Thanks, Wolf. I’ll take NOAA at its word that it’s trying to run a network we can have some confidence in. Maintenance records and metadata are part of that responsibility. This analysis shows that even the best intentions can be affected by external influences, so it’s important to remain vigilant.

“the best intentions”
————–
NOAA has been run by certified climate alarmist hacks, from the top Jane Lubchenco to the bottom “oceans acidification” scammers en passant the middle like Thomas Karl. Assuming the best intentions from them is quite a big gaping stretch.
The USCRN exists only by accident and NOAA only inherited it, in the sense inheriting a thorn in the foot. The NOAA under “seas will stop rising under my presidency” Obama would have never created the USCRN and its inconvenient data, it largely prefers doing “science” with the USHCN’s rotten and tortured data.

1,200 stations sounds like a lot, but the land mass of the USA is 9.834 million km², so each of these stations has to cover a great deal of ground. How well can that really be done?
Let us be honest: this is not a scientifically valid data collection method, for it simply cannot provide the coverage required to measure what it claims to measure. It is, as it always has been, a ‘better than nothing’ data collection method.

One element of the scientific method is experimental design. In turn, this design needs parameters that justify the claims being made from the result of the experiment, for features such as accuracy, range, etc.
If it is not possible to offer this, you are not measuring; you are ‘guessing’.
In which case you should not make great claims of ‘settled science’ and certainty in your results, but instead admit you are looking at theories, which are certainly subject to challenge, and that your data are ‘better than nothing’ but not the quality you should have.
The problem here is an old one: the data collection does not match the claims being made upon it.
Indeed, in the days before ‘settled science’ this was effectively acknowledged in the unreliability of weather forecasts.
Now we have claims of two-decimal-place accuracy for 100 years ahead and more behind, but the quality of the data has not improved; rather, ‘models’ are supposed to cover this issue. However, these same models are often classic cases of GIGO, with no better record of forecasting than other approaches which tell us it is warmer in summer than in winter.

knr – your comment is 100% correct and can be broken down even more. For example, the only site in North Carolina is in the northern Piedmont. NC has distinct climate zones, with eastern NC being heavily influenced by the Atlantic and Gulf Stream, especially in the winter. No site there. The southern Piedmont and mountains also are different. No sites there.
And this applies to a lot of the US, especially around large bodies of water or mountains. Take a boat ride from the San Francisco ferry port to Sausalito at noon in the summer. You go from 70F to 55F to 80F in a very short trip.
So to get even close you want a sample site giving you representative data for each microclimate zone. Many a winter day has it cloudy and 45F in the northern Piedmont and 68F and sunny in Wilmington.

But remember, temperature trends (the important issue here) are not obtained by averaging all reporting stations. Rather, the temperature change in each station is noted, then an average taken. So where the station is located is of lesser importance.
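That point can be illustrated with a toy example (synthetic numbers, stored as integer tenths of a degree to keep the arithmetic exact): a constant siting offset drops out when each station is converted to anomalies before averaging.

```python
# Two synthetic stations with the same warming trend but a constant 1.5 C
# siting offset. Their anomalies (deviations from each station's own mean)
# are identical, so the offset does not affect an anomaly-based trend average.

def anomalies(series):
    base = sum(series) / len(series)
    return [v - base for v in series]

# temperatures in tenths of a degree C, both warming 0.1 C per year
warm_site = [150 + yr for yr in range(10)]   # near pavement, say
cool_site = [135 + yr for yr in range(10)]   # open field, 1.5 C cooler

same = anomalies(warm_site) == anomalies(cool_site)   # offset cancels
```

The offset only matters if it *changes* over time, which is exactly the kind of step the Kingston comparison exhibits.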

it would be interesting to see the differences plotted against maximum temperature to see if the variability in the differences might be explained by that, and also against things like snow cover – were there days when the tarmac was white and did they correlate with days when the difference was small?
The average is interesting, but there is a great deal of variability in the record of differences and I would want to see that explained rather than the average.

With the advent of solar cells, rechargeable batteries, and satellite internet connectivity, there is no reason to have stations in developed areas. There should be tens of thousands of new stations situated in remote areas. The problem is that climate scientists would rather have “adjustable” data to offer as proof of their agenda.

Interesting. Warming of the climate due to an urban effect is clearly seen in the data; due to CO2 – not so much. So how do 97 % of scientists handle this evidence and these facts? Let’s start hearing more about a pavement effect and less about a greenhouse effect. It’s not an unknown phenomenon that dry sidewalk exposed to direct sunlight will be substantially hotter than adjacent moist lawn.

Outstanding idea, Hank!!
“Let’s start hearing more about a pavement effect”
Ask anyone here older than 60 or 70 if they remember those 40’s and 50’s. Less concrete/asphalt and more open spaces for the breeze to flow and grass to grow. It was cooler, especially at night. This environment also had the benefit of absorbing water when it rained, so less flooding, and where I grew up in southern Louisiana the subsidence was less than during the last 30 or 40 years, regardless of new levees and channels to carry off the water (and the soil that used to accumulate in the marshes!!).
Believe it or not, most homes on my unpaved street in New Orleans did not have whole-house air conditioning. Some had window units for the bedroom, and we did not get one until I got my scholarship and moved out, thereby saving my folks a lotta $$$. By the mid sixties most folks had decent HVAC.
Ditto for Colorado where I went to school in the 60’s. Air conditioning? What’s that? And then they paved over everything and built houses too close together. So by mid-to-late 90’s air conditioning became commonplace.
So the pavement effect is real and should be taken into account, ya think?
Gums sends…

MarkW, I have not “handled” anything.
The existence of cool biases is not “declared”; they are discovered using statistical techniques beyond the scope of any I’ve seen here when it comes to temperature data analysis.
“That and they have tuned their models to account for it.” I see. How so? Perhaps you’d like to describe the exact process “they” have used to tune their surface temperature modeling?
[?? .mod]

When I started following AGW back in the late 1970s and early 1980s I leaned toward being skeptical but believed that it was possible. I did have an above-average knowledge of climate history and knew that the earth had been generally warming since the last glaciation. I knew the question early on was whether humans were having any influence. Then, after discussion with several folks from the NWS on the accuracy of their forecasts, something in which a marine scientist going to sea has a personal interest, one mentioned as an aside and with a wink and a nod that I should look at surface weather stations just in Florida. When I did, I noticed many of the conditions Anthony has documented and this essay recounts. I knew of two stations that had been moved with no note in the data saying they had. I knew of several that had been in fields that were now surrounded by buildings and pavement. I also questioned whether, when they changed thermometers at an individual station, any verifications were made relative to precision and accuracy. The answers I got were not reassuring. I now live in an area dominated by microclimate due to terrain, vegetation, wetlands, and construction, where the low temperature at my house in a 24-hour period is three to five degrees different from the official surface station. The local surface station was moved from a low area to a hill right next to the airport. Yet again I could find no actual notation of the move in the data. So unless you were paying attention to the local news and caught the day it made the news, looking at the data you would never have known. Somewhere here in the comments someone infers that NOAA should get credit for at least trying to fix the problem. Sorry, but repeating the same original mistakes, or failing to prevent those and allowing new mistakes, just won’t get it, most especially some forty years after NWS staff recognized there were problems. I was raised not to trust government.
I then spent 30 years in and around government where I learned first hand not to trust government.

Stations are good for “weather” (short-term events) and misused for “climate” (long-term). Error bars and heat effects of ±0.2 deg make the data almost useless in trying to detect a “change”. Adding that to unproven “models” with their own presumption errors makes the enterprise of gauging “climate change” on a 0.2 deg/decade scale an erroneous endeavor. Garbage in to equivocal models yields garbage out. Toys of the progressive pseudo-science movement to drive a political agenda.

In reviewing the GHCN daily max and daily min data for individual site history I saw many instances of a mean shift exactly as described. In those cases where I had accurate site coordinates and was able to pull up a satellite view of the site you could easily see encroaching development or potential changes in vegetation in all but a few. One of the things that stood out on several sites was tall buildings, added walls, and so forth. Anything that would block wind.
However, one site I reviewed had a pronounced shift towards cooler temperatures. This particular site is situated close to a river next to a large amount of suburban development. I have no idea if this was the original location or not. However, the current location is close to a river; there are dense areas of trees by the river nearby, but few trees remain between the site and the river. On the opposite side of the river there is no development and a lot of tree-covered areas.
As I saw in the GHCN data sites can be compromised to show both warmer and cooler temperatures. The conditions to induce cooling are relatively rare, but they do exist.

It is sad that, with so many quality weather monitoring networks within the Federal System, NCDC can’t do a better job of instrument selection and siting. Consider the climate monitoring networks run by NOAA>ESRL>Global Monitoring Division, NOAA>ARL>Surface Energy Budget Measurements, or the US Dept. of Transportation>Road Weather Management Program for that matter; all seem to have a better grasp of the requirements for representative measurements than NCDC. And, in addition, by not measuring wind direction the NCDC data set is of significantly less value than those programs that do. Is maintaining a wind vane and computing a vector mean wind direction in real time that hard?

It would be interesting to compare official sites vs stations installed and maintained by a company like DTN which offers weather stations and networks to farmers. Usually these are sited in or near fields as the farmers want to know conditions in their fields. There are also over 6000 stations in the midwest US and central area of Canada so there is a good amount of data being gathered.

Nice work Gary Boden. The USCRN is definitely a step in the right direction, but not perfect. Ideally all CRN sites should be paired with nearby sites, for the very reason you have demonstrated. A CRN site was established in a rural area northwest of Austin, Texas not far from where I live. It’s on a flat high hill top. Ideally there should be a second CRN in the valley about 200-300 feet below. There are some weather stations in the valley reporting to the Weather Underground that show quite a bit of difference largely because of terrain effects.
Sorry I’m late to the game, but I have been looking at the Mauna Loa site in Hawaii, where a USCRN station is located close to a USHCN station and also an NCAR site. There are puzzling changes over time. The NCAR measurements show an overall rising temperature in recent years compared to the USHCN measurements, and the USCRN falls in between.
Here is a graph of the annual data: https://oz4caster.files.wordpress.com/2018/02/ml1-tmp-ann-all-sites.gif
Here is a graph of the differences in the monthly data between the three sites: https://oz4caster.files.wordpress.com/2018/02/ml10-tmpdif-mo-1977-2017.gif
More graphs here, including Mt Washington as well: https://oz4caster.wordpress.com/mlmw/
