A call for an improved global climate measurement system

From the UNIVERSITY OF COLORADO AT BOULDER and the “weather stations near heat sources aren’t measuring climate” department, something that we already have in the USA in the form of the state-of-the-art Climate Reference Network, CRN, but we don’t have globally. This call for a better system is something I agree with, because much of the existing monitoring network has problems like you see in the photo below, among other problems.

Photo by Warren Meyer. The official USHCN weather station at the University of Arizona, Tucson, measuring temperature in the middle of a parking lot. The station’s data had been used as part of the global climate database, but the station was closed by NOAA after being exposed by this website as ludicrously located.

Improving climate observations offers major return on investment

A well-designed climate observing system could deliver trillions of dollars in economic benefits.

A well-designed climate observing system could help scientists answer knotty questions about climate while delivering trillions of dollars in benefits by providing decision makers information they need to protect public health and the economy in the coming decades, according to a new paper published today.

The flip side is also true, said lead author Elizabeth Weatherhead, a scientist with CIRES at the University of Colorado Boulder. The cost of failing to invest in improving our ability to predict and plan for droughts, floods, extreme heat events, famine, sea level rise and changes in freshwater availability could reach hundreds of billions of dollars each year, she and her colleagues wrote. Their paper is published in the current edition of Earth’s Future, an online journal of the American Geophysical Union.

“Improving our understanding of climate not only offers large societal benefits but also significant economic returns,” Weatherhead said. “We’re not specifying which measurement (or observing) systems to target, we’re simply saying it’s a smart investment to address the most pressing societal needs.”

Data generated by the current assemblage of observing systems, including NOAA’s satellite and ground-based observing systems, have yielded significant insights into important climate questions. However, coordinated development and expansion of climate observing systems are required to advance weather and climate prediction to address the scale of risks likely in the future.

For instance, the current observing system cannot monitor precipitation extremes throughout much of the world, and cannot forecast the likelihood of extreme flooding well enough to sufficiently guide rebuilding efforts. “The current decline of our Earth observing systems is likely to continue into the foreseeable future,” said Liz Moyer, a climate researcher at the University of Chicago who was not involved in the new assessment. “Unless action is taken–such as suggested in this paper–our ability to plan for and respond to some of the most important aspects of climate, including extreme events and water availability, will be significantly limited.”

Weatherhead and a team that included four NOAA laboratory directors and many other prominent climate scientists urge that investments focus on tackling seven “grand challenges,” such as predicting extreme weather and climate shifts, understanding the role of clouds and circulation in regulating climate, regional sea level change and coastal impacts, the consequences of melting ice, and feedback loops involving carbon cycling. In each category, observations are needed to inform process studies, to build long-term datasets against which to evaluate changing conditions, and ultimately to improve modeling and forecasting capabilities.

“We are on the threshold of a new era in prediction, drawing on our knowledge of the entire Earth system to strengthen societal resilience to potential climate and weather disasters,” said co-author Antonio Busalacchi, president of the University Corporation for Atmospheric Research. “Strategic investments in observing technologies will pay for themselves many times over by protecting life and property, promoting economic growth, and providing needed intelligence to decision makers.”

“Well planned observations are important to more than just understanding climate,” agreed Deon Terblanche, director of research at the World Meteorological Organization. “Predicting the weather and extreme events, and managing water availability and energy demand will all benefit.”

“Developing observation systems focused on the major scientific questions with a rigorous evaluation process to ensure the measurement quality is fit-for-purpose–as the authors propose–will more than pay off in the long run,” said Tom Gardiner, a principal research scientist at the UK’s National Physical Laboratory.

Objective evaluations of proposed observing systems, including satellites, ground-based or in-situ observations as well as new, currently unidentified observational approaches, will be needed to prioritize investments and maximize societal benefits, the authors propose.

“We need to take a critical look at what’s needed to address the most important climate questions,” said NASA scientist and co-author Bruce Wielicki.

Not all new observing strategies would necessarily require expensive new systems like satellites, the authors pointed out. For example, after a devastating flood hit Fort Collins, Colo. in 1997, the state climatologist developed a network of trained volunteers to supplement official National Weather Service precipitation measurements using low-cost measuring tools and a dedicated web portal. The Community Collaborative Rain, Hail and Snow Network (CoCoRaHS) now counts thousands of volunteers nationwide who provide the data directly to the National Weather Service.

Using a rigorous evaluation process to develop a robust network of observation systems focused on the major scientific questions will more than pay off in the long run, the authors concluded.

“The economic risks from climate change are measured in trillions of dollars,” agreed Rich Sorkin, CEO of Jupiter, a Silicon Valley-based company that provides intelligence on weather and climate risks around the globe. “So an improved, properly designed observing system, with commensurate investments in science and understanding, has the potential to be of tremendous value to society.”

Climate observations are needed to address a large range of important societal issues including sea level rise, droughts, floods, extreme heat events, food security, and fresh water availability in the coming decades. Past, targeted investments in specific climate questions have resulted in tremendous improvements in issues important to human health, security, and infrastructure. However, the current climate observing system was not planned in a comprehensive, focused manner required to adequately address the full range of climate needs. A potential approach to planning the observing system of the future is presented in this paper. First, this paper proposes that priority be given to the most critical needs as identified within the World Climate Research Program as Grand Challenges. These currently include seven important topics: Melting Ice and Global Consequences; Clouds, Circulation and Climate Sensitivity; Carbon Feedbacks in the Climate System; Understanding and Predicting Weather and Climate Extremes; Water for the Food Baskets of the World; Regional Sea-Level Change and Coastal Impacts; and Near-term Climate Prediction. For each Grand Challenge, observations are needed for long-term monitoring, process studies and forecasting capabilities. Second, objective evaluations of proposed observing systems, including satellites, ground-based and in situ observations as well as potentially new, unidentified observational approaches, can quantify the ability to address these climate priorities. And third, investments in effective climate observations will be economically important as they will offer a magnified return on investment that justifies a far greater development of observations to serve society’s needs.

We have ample evidence that climate measurements in California and Arizona cannot be trusted, and climate research from Manhattan and Paris is also not to be trusted. I suggest autonomous independent sources like satellites and automated ocean buoys with full coverage. Humans cannot be trusted with this work. It’s too important.

[Actually, BEST is not independent at all; they use the same GHCN data as NOAA and NASA. Same base ingredients, just a different sauce recipe. They are also funded by the Koch Brothers: BEST received $150,000 from the Charles G. Koch Charitable Foundation. -mod]

Anyone who has ever lived in the downtown of a big city (I grew up in Center City Philadelphia) knows very well about the UHI, even if they never knew it had a name.
It is simply very clear that it is hotter in the city than outside of town.
The difference is especially notable during two sorts of weather conditions: When it is Summer, especially during very hot spells with light winds; and during cold nights.
Heat wave Summer days are far hotter in the middle of an urban area than even the adjoining suburbs, and people flock out of large cities in Summer as a result, if they possibly can. And at night during these periods, the lack of nighttime cooling is incredible…incredibly bad, especially for anyone who does not have air conditioning.
The hotter it is the bigger the difference seems to be, but it is ever-present, although breezy to windy conditions lessen the effect.
The less cold nights are welcome during colder parts of the year.
So strong is the effect on nighttime cooling that many sorts of plants are well known to grow several zones north of their rural cold hardiness limit if they are within the concrete jungle.
There are some famous examples.

UHI is very real. I live on a farm 7 miles from the nearest radio station, which is on the outskirts of a small city (population ~500,000). I have a weather station out in my field, and every morning there is a 3 to 10 F difference between here and there, with here always being cooler. My weather station lines up with my car thermometer at the house, and as I pass the radio station, my car thermometer lines up with the station. So the difference is very real. The morning lows are even warmer in the downtown area. The radio station is about 10 miles from the downtown. It isn’t an elevation thing, as I am 100 ft above the radio station and 250 ft above the downtown. I know the lapse rate doesn’t add that much.
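As a rough sanity check on the elevation point above, the standard-atmosphere lapse rate (about 6.5 °C per km) can be converted into an expected temperature offset for the elevation differences quoted in the comment. This is only a back-of-the-envelope sketch: the elevations come from the comment itself, and the lapse-rate constant is the textbook standard value.

```python
# Rough check: how much of an urban-rural temperature gap can the
# standard atmospheric lapse rate (~6.5 degC per km) explain?
# Elevation differences are taken from the comment above.

LAPSE_RATE_F_PER_FT = 6.5 * 9 / 5 / 3280.84  # ~0.00357 degF per foot

def lapse_adjustment_f(elevation_diff_ft):
    """Expected cooling (degF) from being elevation_diff_ft higher."""
    return elevation_diff_ft * LAPSE_RATE_F_PER_FT

# Farm is ~100 ft above the radio station, ~250 ft above downtown.
print(round(lapse_adjustment_f(100), 2))  # ~0.36 degF
print(round(lapse_adjustment_f(250), 2))  # ~0.89 degF
```

So under a standard lapse rate, elevation accounts for well under 1 °F of the 3 to 10 °F gap, which is consistent with the commenter's point that the difference is not an elevation effect.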

Anything involving adjusted data (and data is no longer truly data once it has been adjusted) is potentially unreliable.

BEST was a great missed opportunity, to take a new approach to the assessment of temperature rather than using the same approach, the same data but simply with a different massaging algorithm.

They ought to have audited every station and selected only the very best sited stations, with the best practices and procedures and the most intact records, at least a complete record covering the early 1930s to mid 1940s. These stations ought to have been retrofitted with the same type of LIG thermometers (and the same type of enclosures, painted with the same type of paint) as used in the past, calibrated in the same way as was done in the past. They ought then to have observed temperatures using the same practice and procedure as used at each station (including the same station TOB), and obtained modern day RAW data that could be directly compared, on a station by station basis, with historic RAW data at the station in question, with no adjustments whatsoever.

If they had done this we would now have around 5 years’ worth of RAW data that could be directly compared to the historic highs of the 1930s/early 1940s, and we would have a much better indication as to whether there has been any warming since those highs, covering the period when some 95% of all manmade CO2 emissions have taken place.

Instead we have no idea whether today the globe is any warmer than the 1930s/1940s, or for that matter around 1990, although it is almost certainly the case that the US is no warmer than it was around the mid 1930s. We can be fairly certain of that if one reviews the USHCN data, of which there are around 750 stations with extant records covering the period 1930 to date.

I am for properly sited, modern weather and climate systems and data systems. However, we are throwing good money after bad unless the resulting data are fully available to the public, all operations such as homogenization and “hiding the bump” are prevented by adequate external monitoring, and databases for raw data, corrected data, etc. are permanently archived, 100% transparent, and fully available to the public. Data tampering, destruction, falsification, etc. ought to be felonies, fully monitored and reported.
I know this will be cumbersome and expensive, but nefariously altered data are worse than no data at all.
Moreover the design specifications, error properties, data collection, storage, public access, monitoring, legal restrictions, penalties for illegal modification of the data for political purposes, and procedures for monitoring of those collecting and database, etc. should be developed and presented for public review and correction well before a single dollar is spent for equipment or new scientific salaries.
The world’s history of weather and climate data is invaluable, but alas these databases have been corrupted, raw data destroyed, and data processing and analysis techniques applied in secrecy. We must never make the same mistake of allowing incompetents, crooks, and politicians to repeat their corruption of future data. There should be stringent QA & QC procedures specified, and criminal activities and penalties specified, before any work is done.
We know how to do all these things and we shouldn’t allow any future work to be done without them. Never again should we let highly politicized and leftist politics edit/delete/corrupt anything as important as our weather and climate databases.

I was speaking of US data, sorry I did not say that specifically. If we need data around the world for our national security (military, economic, etc.) then that should be included using remotely sensed data collection systems developed and validated in the US.

There should also be ONE government agency whose only job is obtain and record actual data. They should be responsible for quality assurance procedures to identify problem stations whose bad data is then removed entirely, i.e. no adjustments. Adjustments should be left up to others who can use any techniques they wish to “infill” missing data. If the government wishes to have an agency homogenize the data for use by scientists who do not wish to do the work themselves, then any adjustments must follow published algorithms with all assumptions available.

Crosspatch,
The problem is the argument of the US only being 2% of the global surface. We need a CRN on every continent and ARGO floats in every sea. Then we need to maintain them for 60 years to see if anything can be said about the whole CO2 controversy. The current network is not fit for climate monitoring but is fine for weather forecasting over the next week or so.

Correct, we do not have a global climate reference network, but a decent approximation can be made by selecting very specific existing stations, upgrading the measurement equipment if needed, and using those data. For example, rural stations with little local land use change, and not likely to see any, equipped with modern gear would be a close approximation. At least it would be closer than what we have now, where we have a mixture of good and bad stations and things are adjusted every which way in order to try to correct for local biases.

And those stations should also be retrofitted with the same type of LIG thermometers as used at the station in the 1930s/1940s, and we should observe using the same historic practice and procedure as used at the station (including TOB). We could then obtain modern day RAW data that could be compared directly with the station’s own historic RAW data, with no adjustments whatsoever being made to the data, i.e., modern day RAW against historic RAW, on a station by station basis, without seeking to compile a global or hemisphere-wide set.

_Jim
Yes.
And that is b e f o r e it gets ‘adjusted’, ‘homogenized’, or otherwise altered to meet the current set of – politely – (mere) lies, from our Adjustocene fraudsters.

(‘mere’ – I have many years in shipping and could reasonably [and accurately] use other pejorative and impolite terms, many of which would be – probably rightly – rejected by Mods.)

What else is it, other than fraud?

There does need to be a reckoning – including papers revoked, status and tenure denied, even, in some [possibly rather few, given the standards of evidence needed . . . . .] cases, criminal prosecution for fraud.
If, on conviction, serious jail terms were handed down – and upheld on appeal – a lesson would be available for the politicians and the scientists [sensu lato!].

I argue that it is fit for purpose. It was designed for weather monitoring of key areas, and it provides sufficient input to predict the weather for the next 5 days with sufficient detail to plan activities. Now if you mean not fit for the purpose of climate monitoring that it was in no way designed for, I agree wholeheartedly.

You are right – if the world is to improve its weather monitoring, then the world should pay for it. However, it would be valuable if, before doing so, the world would agree on a firm set of standards of what to measure, how to measure it (including placing measuring stations away from built-up areas), and how to process and report the data.

However, as the world is becoming more urban, finding “pristine” locations for weather reporting stations will be hard. Maybe we should get some independent researchers, without an agenda, to really study and define the UHI?

“this paper tacitly admits that weather data from around the world isn’t fit for purpose.”

You beat me to it.

“it would be valuable if, before doing so, the world would agree on a firm set of standards of what to measure, how to measure it (including placing measuring stations away from built-up areas), and how to process and report the data.”

Good luck with that.

“Maybe we should get some independent researchers, without an agenda, to really study and define the UHI?”

No such thing. If privately funded, by whom? Who can’t possibly derive any benefit from all the money sloshing around in climate research? If government funded?…….well we know the answer to that.

The whole thing would end up in an enormous global political punch up with everyone demanding even more money that they’re demanding now. Can you imagine the size of the conferences just to decide what instruments to use? American, Russian or Chinese.

Leave the whole thing as it is just now and let the chips fall where they may. The only way out of this whole mess is to let the planet decide what temperature it’s going to be by the next century.

She calls for doing whatever is needed to be able to forecast droughts and floods and such, but such events may be unforecastable.
One thing we can do, and that is easy, is to use existing records of such events to inform us about what we are likely to see in the future.
And stop pretending that CO2 is the only reason climates change, and stop pretending that from now on, any and all changes will be not just bad, but absolutely catastrophic.
And that people can fine tune weather and climate by handing over power and money to lying politicians and make believe scientists.

Whilst I accept the need for standards, what is the difficulty in having all stations simply meet the CRN1 reference standard? Every station should use the same make and model of platinum resistance thermometer, and I would suggest that it is cased in material that has the same sort of thermal response and lag as the old LIG thermometers. We certainly should not try to make a global assessment with area weighting, infilling, Kriging etc, but simply compare each station with itself, so that we know what has happened to temperature at point A, what has happened at point B, what has happened at point C, etc.
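The "compare each station with itself" approach can be sketched very simply: fit a trend to each station's own record and report it per station, with no area weighting, infilling, or Kriging. The station names and readings below are invented purely for illustration.

```python
# Sketch of the per-station idea above: compute a trend from each
# station's own record only, with no infilling, gridding, or
# homogenization. Station names and readings are made up.

def station_trend(years, temps):
    """Ordinary least-squares slope (degC per year) for one station."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    num = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den

stations = {
    "Point A": ([2010, 2011, 2012, 2013], [11.2, 11.3, 11.1, 11.4]),
    "Point B": ([2010, 2011, 2012, 2013], [15.0, 15.2, 15.3, 15.5]),
}

for name, (years, temps) in stations.items():
    print(f"{name}: {station_trend(years, temps):+.3f} degC/yr")
```

Each station is reduced to its own trend, so the output answers "what has happened at point A, at point B" without ever combining stations into a regional or global figure.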

It is important to bear in mind that we would be creating a new system with a completely new data set, and whilst there will be pressure to compare the new data with the past, that ought to be resisted.

This is what the UN – IPCC Green fund should be initiating and supporting.
The first bundle of cash should be used for a standardized single point of supply monitoring system.
Send it out and bolt it to the ground – auto data retrieval to a public accessible web location in real time.

Because climate is only a statistical construct of weather, and the climate projections so far leave plenty of room for improvement, in my opinion confidence building is necessary before further investment. The first target is to ensure local weather forecasts remain accurate within 24 hours over a year.

That may be the tacit point the paper makes, but it certainly isn’t the view of the-powers-that-be in the climate establishment. As far as they are concerned, current data is “close enough for government work.”

The people responsible for collecting the data and keeping the historical records alter it, ignore anything that does not support their narrow view, and use the information they do have to promote alarmism and deindustrialization.
There is no reason to think they would change their ways with more information.
Case in point: the programs to measure global temperatures via satellite. When the satellites showed more warming than the ground-based observations did, they claimed that only the satellite data was reliable. Within a few years, as the pause began and then lengthened, they took the opposite approach, and now pretend it does not exist.
When the historical records did not back up the alarmist meme, and instead flatly contradicted it, they altered those historical records, and attempted a massive rewriting of history for good measure.
Oh, hey…did the author of this study just admit that climate scientists are stumbling around in the dark?
Sure sounds like it to me.
Maybe if they did real science they would have better information and be able to issue more reliable predictions.
There is a reason why the scientific method rejects beginning with a conclusion and throwing out any information that contradicts that conclusion.

“Maybe if they did real science they would have better information and be able to issue more reliable predictions.”

And therein lies the problem, too much science.

Science in the past was undertaken by studious, highly intelligent, imaginative, driven individuals with the objective of proving the orthodox perceptions wrong.

Modern science is swamped with scientific studies by monotonous, unimaginative career scientists, motivated by demands from universities to produce research to generate funds and profit for the institution.

The world has moved from worshipping gods to worshipping science.

I don’t know which is worse, but I think the worshipping bit is a clue.

AW, a question from one who is “ignorant but curious”: This is just one station, where I suspect that the shortcomings of this particular Tucson, AZ station will be obvious to just about everyone; however, regarding ALL of the others, is there some sort of comprehensive database [e.g., a spreadsheet, maybe even with a recent photo] which lists and “characterizes” each and every such station? IOW, is there some sort of searchable database which demonstrates, to one and all, at least the general conditions at each weather station?

Follow-up notion: my apologies as I understand that this notion is hardly a new one … and am hoping that there is some online link which could be accessed to investigate this matter.

I don’t follow the whole paper. Sure, you can measure temperature better, but the data goes nowhere because you can’t model it. The physics of the radiative transfer is nasty and can’t even properly be described by classical physics, and no one has done any large-scale quantum thermodynamic calcs because the data doesn’t exist. All you’re ever going to get under current physics is some really crass numbers, and probably site specific. It seems to escape people that as CO2 absorption is a quantum domain problem, it’s really sensitive to quantum effects.

It highlights the problem: you need to know the exact quantum interactions for CO2, and those processes are sensitive to many factors. What they should be doing now is some serious quantum physics, because they have no hope of improving the models until the models reflect the actual physics.

That Univ of AZ Atmospheric Sciences WX station has been defunct for a number of years, at least 4 when I moved back to Tucson in 2013. I remember it was operational in 2004 and recording daily solar irradiance as well, but it wasn’t surrounded by buildings then. It still may be used for some temperature and humidity recordings for undergrad projects but nothing that goes into a climate record DB.

It looks more like an EE-sensor hobby project with wires hanging out of that wooden box. In the state that it is in, I can’t think anyone uses the data it generates for anything other than testing sensor setups and/or learning data processing from raw outputs.

Doesn’t this just allow the sc@m to continue? They will need another 50 years of data collection to discern any trend. The wording of the article still assumes climate change is happening for the worse, extreme weather, etc. They are not saying this new equipment will help prove/disprove the AGW theory. Send us money, we want new toys.

If enough interested citizens could be employed in the collection of the necessary data, they would serve as cross checks on each other, so that any who are tempted to fudge the data for “the cause,” so to speak, could be identified and eliminated – at least as far as collecting the necessary data on land is concerned. Also, if the collected data is made available to the public, anyone within a particular area who consistently enters data which lies outside what is reasonable would be questioned by their fellow citizens. It should be relatively inexpensive to come up with a package for distribution to anyone interested in participating.

D J Hawkins
Who, at this point in time, really cares what the powers that be think? It seems that research which is blowing holes in their ideas is being published, and there does not seem to be anything they can do about it. I know that there are still individuals who can lose their jobs if they are not careful about what they do or say, yet the time is fast approaching when freedom of speech will once again be practiced in the scientific community.

People living in a community would be familiar with how temperatures varied with terrain and time of day therefore they would know whether a 5 or 10 degrees difference would be reasonable due to the specific conditions which they themselves observed or knew obtained.

In the 1960s part of my job was recording weather data: rainfall, max and min temperatures, soil temperature, sunshine hours etc. We had the weather station in a small fenced area in the middle of a grass field. The data was sent each month to the local water board, who collated the information and circulated it to all the people recording it.
The weather stations were checked each year for accuracy.
Some of the local families had been recording weather data since the 1930s.

Stephen, I agree that people can collect quality data with accuracy and resolution. My point was that another weather station in a small fenced area in the middle of an adjacent grass field could read a temperature with the same accuracy and resolution, but most likely a different absolute number. So whose station is “correct”, and how could one detect any observational bias from those values, as Crowcane suggested?

See the Citizen Weather Observer Program (CWOP) http://wxqa.com/ – it is QA’d, for what it is worth. Downside: some stations are improperly sited, and most stations are not checked/verified on a regular basis.

If you use stations located in urban areas or at airports, all you are measuring is heat island effects, not global temperatures. Australia looks like it is measuring random heat spikes because it is keeping changes that last only 1 second.

Just curious. Does anyone have a ballpark figure for how much such a package would cost? There are ways of reducing the cost to those wishing to participate by getting interested organizations and individuals to contribute and purchasing packages in bulk.

As far as access to the data goes, all that would be required is for each individual collecting the data to record it on their own computer and then transfer it to both the scientists and a public archive. Once you allow private citizens to have control of data, just try to take it away from them and you will wish you had kept your mouth shut, or hands to yourself, or whatever analogy you wish to use. It would be a really bad idea to try to stop them from sharing.

A comprehensive global climate measurement system may sound impressive, but it smells like just another omnibus wish list in search of a signed blank check. The report abstract lacks specifics. Comprehensive is synonymous with bloated. The questions of Why, What, Where and How Much are not addressed. Right now, there is little agreement on the physics of climate science. What should be the direction of research, what are the critical problems to solve, and what measurements need to be improved to solve those problems? Answer these questions and then talk about funding and the expected benefits to society. The scattergun approach does not work for a complex, massive project.

accuweather.com had a December 30th, 2015 temperature for Kelowna BC that was way off, stating a high of 26 Celsius, which is like a hot summer day. December 29th had a high of -1 Celsius, and December 31st a high of -6 Celsius. All the other cities nearby were at about -6 Celsius on December 30th.

I e-mailed accuweather countless times and had no response.
Not sure how to post a picture here but I snipped it out. I was able to get a 2015 date by going to Jan 2016 and it still shows the last days of 2015.

My take on this is that the climatists at UC-Boulder and NCAR-Boulder have no effective rebuttal of what Tony Heller constantly shows on his web blog: how so much of NOAA’s global surface temperature record is now just infilled from climate model output (i.e., it’s made up and treated as if it were a real measurement by mixing it with real observations – sort of a “Mann’s Nature Trick” on the global surface temp data sets).

Tony is constantly showing how in large parts of the world (like sub-Sahara Africa where warlords and despots reign) where they in-fill data, that synthesized data does not agree with the observational AMSU-satellite lower troposphere temp anomaly data.

Since they have no effective counter argument to Tony’s postings that their surface temps are infilled junk, they want a real data collection to manipulate.

This is a very worthy undertaking, and I would have at least three suggestions right out of the box. First, place several stations at points on the earth where none now exist, but where interpolated “data” are centered. This would test the validity of the dubious practice of considering interpolated data as the equivalent of measured data.
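The first suggestion could be tested along these lines: estimate the interpolated value at the target point from the surrounding stations, then compare it against a real measurement placed there. A simple inverse-distance weighting is used here purely as a stand-in for whatever interpolation a given dataset actually uses, and all coordinates and readings below are invented.

```python
# Toy version of the validation proposed above: interpolate a "virtual"
# reading at a target point from surrounding stations, then compare it
# against an actual measurement placed there. All values are invented.

def idw_interpolate(target, stations, power=2):
    """Inverse-distance-weighted estimate at `target` from (x, y, temp) tuples."""
    num = den = 0.0
    for x, y, temp in stations:
        dist2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if dist2 == 0:
            return temp  # target coincides with a station
        weight = 1.0 / dist2 ** (power / 2)
        num += weight * temp
        den += weight
    return num / den

surrounding = [(0.0, 0.0, 12.0), (2.0, 0.0, 14.0), (0.0, 2.0, 13.0)]
estimate = idw_interpolate((1.0, 1.0), surrounding)
measured = 11.5  # hypothetical new station at the interpolation point
print(round(estimate, 2), "vs measured", measured)
```

The gap between the estimate and the measured value at the held-out point is exactly the quantity the proposed validation stations would let us track over time.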

Second, make sure that the standard station measures both temperature and relative humidity. The theory of greenhouse effect global warming revolves around trapping energy in the atmosphere. The enthalpy of air isn’t just – or even primarily – a function of temperature. In very humid air, the latent heat of vaporization in the water vapor is by far the bigger component of enthalpy.
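The enthalpy point can be illustrated with the standard psychrometric approximation for moist air, h ≈ 1.006·T + w·(2501 + 1.86·T) kJ per kg of dry air, where T is in °C and w is the humidity ratio. The humidity ratios below are illustrative values, not measurements.

```python
# Illustration of the point above, using the standard psychrometric
# approximation for moist-air enthalpy (kJ per kg of dry air):
#   h = 1.006*T + w*(2501 + 1.86*T)
# where T is in degC and w is the humidity ratio (kg vapor / kg dry air).
# The humidity ratios below are illustrative, not measured.

def moist_air_enthalpy(temp_c, humidity_ratio):
    sensible = 1.006 * temp_c                            # dry-air term
    latent = humidity_ratio * (2501.0 + 1.86 * temp_c)   # water-vapor term
    return sensible + latent

# Same 30 degC air, dry vs humid:
dry = moist_air_enthalpy(30.0, 0.005)    # ~43 kJ/kg
humid = moist_air_enthalpy(30.0, 0.025)  # ~94 kJ/kg
print(dry, humid)
```

At the same temperature the humid parcel carries more than twice the energy, and in the humid case the latent term alone exceeds the sensible term, which is why temperature-only records miss much of the atmosphere's energy content.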

Third, place at least three stations within a mile or so of one another. I’ll never forget one winter, standing on the train platform during my commute, and observing that the temperature displayed on the information board, the temperature reported by the weather app on my iPhone, and the temperature measured by my truck’s outside thermometer (which I’ve found to be good to within about 1 degree F) spanned a total of 6 degrees F. The data for each of the displays came from different weather stations in the same town. It would be great to get a time history of that on a large scale.

The reality is that the climate change we are experiencing is caused by the sun and the oceans, over which Mankind has no control. There is no real evidence that CO2 has any effect on climate, and plenty of scientific rationale to support the idea that the climate sensitivity of CO2 is zero. Better measurements are not going to change anything and are of no real value. If we think that all the old measurements are bad, then they should be discarded, and such bad measurements should stop along with the funding that supports them. All previous climate research based upon past data must be discarded. A new measurement system will have to be in place at least 30 years before we obtain a measurement of climate change. So if we do establish a new measurement system, then all funding for climate-related activities should be stopped for at least 30 years while the initial climate change measurements are being made. The Paris Climate Agreement must be discarded because it is based upon bad data.

The big question is where the money is going to come from to pay for it. Surely not the USA. The USA has a huge federal debt, huge annual federal deficits, and huge annual trade deficits. The USA cannot even begin to pay for anything like this until things are turned around economically and it has paid off its debts, which will at best take many decades to accomplish. What the USA should do is stop all funding of climate research, because it is all based upon bad data.

Can anyone give me some idea of how much the necessary instrumentation for a weather station would cost? I assume that the data collected from the weather watchers of a local TV station would not be up to your standards.

I did not check the specs, but an accuracy of +/- 0.1 C is routine for digital thermometers. (Converting that to Fahrenheit is left as an exercise for the alert reader.) This is far more precision than is needed for weather/climate work.
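For the alert reader's exercise: a tolerance is an interval, so only the 9/5 scale factor applies; the 32-degree offset belongs to readings, not to differences. A one-liner makes the point (the helper name is my own, purely illustrative):

```python
# A temperature tolerance is an interval: converting it to Fahrenheit uses
# only the 9/5 scale factor; the 32-degree offset applies to readings only.
def interval_c_to_f(delta_c):
    return delta_c * 9.0 / 5.0

print(interval_c_to_f(0.1))  # +/-0.1 C works out to +/-0.18 F
```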

Turns out sensitive instruments are kind of picky…they do not perform well under extremely harsh conditions in the middle of nowhere unless they are monitored and maintained…generally speaking. Besides the measurement instrumentation, there needs to be power and a way to transmit the data.
And, as noted, it would take about thirty years of daily readings from the same spot to establish a baseline, from which anomalies can be derived.
Of course, the data adjusters can always just work their hocus-pocus tomfoolery on any new station data…they have tons of practice.
In fact, making sure that all data conforms to their preconceived notions seems to be what they are best at.

More like 60 years. You have to get through a full cycle of the AMO and PDO to see the full effect; 30 years just takes you from a peak to a valley or vice versa. Ideally 120 years would be needed to check for Fourier components beyond the 60-year pseudo-cycle. We will all be dead by then, and we wouldn’t be the “Saviors of the Universe” if we did it right, so that can’t be the way. It isn’t like people in the past planted trees that wouldn’t bear fruit in their lifetimes or something.
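A minimal sketch of the point, assuming an idealized pure 60-year oscillation with no underlying trend (the amplitude and period are illustrative, not measured AMO/PDO values): a straight-line fit over the 30-year valley-to-peak window manufactures a large spurious "trend," while a fit over 120 years (two full cycles) comes out close to flat.

```python
import math

def linear_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# A pure 60-year oscillation, amplitude 0.3 C, with NO underlying trend.
years = list(range(120))
temps = [0.3 * math.sin(2 * math.pi * y / 60.0) for y in years]

# 30-year window from a valley (year 45) to a peak (year 75):
slope_30 = linear_slope(years[45:76], temps[45:76])
# Two full cycles:
slope_120 = linear_slope(years, temps)

print(slope_30, slope_120)  # the half-cycle "trend" dwarfs the full-record slope
```

The 30-year fit comes out around 0.02 C/year (2 C per century) of pure artifact, several times larger in magnitude than the full-record slope.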

Besides, there is no need for a whole new network of stations to do what you describe…they could do it whenever they wanted…they have had carte blanche for years and years, and the $$$ has flowed freely.
Setting up stations in remote and uncomfortable locales requires travelling to those places…not on the list of favored activities for the climate elite.

If these badly sited temperature measuring Stations had been measuring excessively cold, the Meteorological Organisations would have soon been out there correcting them. However, they are measuring too hot and so are exactly what they want.

Let’s not forget (and give thanks to) Steve McIntyre for his (and volunteers’) dogged program for locating USHCN surface stations and documenting their deplorable state and dodgy history, done in the 2006-2012 period. Some of the photos and comments were hilarious, while others made one wonder how any trustworthy policies could be derived from the data produced by the USHCN system.

Who needs accurate weather measurements when you have ‘models’ which can give you all you ‘need’ without having to worry about data which may challenge the answers you need?
But it is odd that, with all the money flowing around AGW, the idea of building a network that overcomes the well-known issues with the current one gets so little interest, despite this being the ‘most important event ever’, or so we are told.

“Weatherhead and a team that included four NOAA laboratory directors and many other prominent climate scientists urge that investments focus on tackling seven “grand challenges,”
such as predicting extreme weather and climate shifts,
the role of clouds and circulation in regulating climate,
regional sea level change and coastal impacts,
understanding the consequences of melting ice,
and feedback loops involving carbon cycling.”

For a belief that has yet to be proven conclusively; four NOAA Laboratory Directors want the United States to waste money chasing fantasies.

Those four directors who feel the need to avoid doing their jobs responsibly should be sacked. If they truly are directors, they should be SES executive-salaried employees and subject to immediate dismissal without explanation.

Directors such as these are contributors to the world’s waste of funds towards pretending CO2 is deadly.

Any of those other “prominent climate scientists” who enjoy wasting the public’s money should have their funding removed.

I am not sure that the UHI effect is as noticeable in winter as some people assume, since large concrete and metal structures get cold too; they just take longer to do so. I remember as a child standing on concrete terraces watching my local football team with my feet freezing. As regards heating, no one heats their home to the extent that they heat the area around it; it is a question of cost. My car, which is made of metal, cools faster than the surrounding area and gets frost even at air temperatures slightly above freezing, but the number of cars etc. passing through urban areas may warm the local climate a lot.

Place a thermometer where you park your car, then observe every night and morning…you will find that under conditions of clear sky and light (<5mph) winds, frost will form on car roofs at 38F. If there is wind or clouds, frost will not form no matter what the temp is.
Other surfaces will also get frost formation at 38F, if they are away from any overhead objects, not near any concrete or other hard surfaces, and not near any heated structures. Another requirement is low heat conductance and/or low specific heat of the material of the surface in question.
Blades of grass will get frost. Metal will get frost. Large stones, bricks, and concrete will not, nor will bare moist ground, unless it is plowed or otherwise fluffed up.
Areas under or near leafed out trees will never get frost…ever.
I spent years running a commercial plant nursery in a rural part of Florida, and over a period of many years (1984 to 1998) engaged in meticulous observations of how and when frost forms, how and when it will not form, and the effect it has on a wide variety of live plants. And by meticulous, I mean spending thousands of nights sitting outside all night long, placing dozens of thermometers of various types and styles, and walking around and checking them frequently.
When every penny one has is riding on something, one pays attention, and remembers.
Reading what some very well credentialed people write about such things is eye opening…I am certain many of them have never spent a night sitting outside, or placed multiple thermometers at various locations in an area and observed the wide disparities that can often occur.

BTW…if you place a thermometer face up on the car roof, you will see it reads 32 when the frost begins to form, while the nearby air temp is 38 degrees, 5 or 6 feet above the ground.
Also, the car must not be under any trees or other objects, or too close to a building. The mass of the car insulates the roof from heat from the paved driveway it is sitting on, as does the air inside the passenger compartment, the headliner and insulation above the headliner, etc.
Frost forms less readily on the trunk lid and hood of most vehicles, as these are usually just a sheet of metal, with no insulation underneath.

Another BTW…I used a lot of thermometers, and did not have tons of money to waste, and so I bought a lot of them of varying quality.
What I found was, once they are calibrated correctly, even very cheap alcohol in glass units give a fairly accurate reading…within 1 degree of a very expensive mercury in glass high/low unit. And 1 degree F is the limit of resolution of most units, so, basically…a $1.98 unit from Walmart reads the same as a $100 unit from a scientific supply house. Mostly.

As someone who worked in instrumentation for a number of years, I would say that as well as properly sited instrumentation you also need to use the same instrumentation for all future comparisons. If you are allowed to cherry-pick which time series you use for analysis you can pretty much show whatever you want to show.

You can easily try this in, say, Excel. Create several columns of random numbers. Find the slope of each series then just use the ones with a positive slope in your analysis. Bingo, you have just shown that the data is trending upwards.
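The same experiment is easy to run outside Excel. The sketch below generates twenty series of pure noise, fits a slope to each, and then averages only the ones that happen to rise; the selection step alone manufactures an upward "trend":

```python
import random

random.seed(42)  # fixed seed so the demonstration is repeatable

def slope(ys):
    """OLS slope of ys against the index 0..n-1."""
    n = len(ys)
    mx = (n - 1) / 2.0
    my = sum(ys) / n
    sxy = sum((i - mx) * (y - my) for i, y in enumerate(ys))
    sxx = sum((i - mx) ** 2 for i in range(n))
    return sxy / sxx

# Twenty "series" of pure noise -- zero trend by construction.
series = [[random.gauss(0.0, 1.0) for _ in range(50)] for _ in range(20)]
slopes = [slope(s) for s in series]

kept = [s for s in slopes if s > 0]   # cherry-pick only the risers
honest = sum(slopes) / len(slopes)    # near zero, as it should be
rigged = sum(kept) / len(kept)        # positive "trend" from selection alone
print(honest, rigged)
```

About half the noise series will show a positive slope by chance, so the cherry-picked average is always positive even though every series was trendless by construction.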

Climate science seems to have far too many ways to put a thumb on the scale. Just use a particular tree, for instance, or find an excuse for not using the ARGO data, or forget about a few thousand stations, or …

Bottom line.
What we have in place globally could not possibly have provided data reliable enough to give a “global temperature” now.
We just don’t know.
Pass what “we just don’t know” through a computer program or model and suddenly we DO know what will happen “then” unless we spend a few trillion now?

From the paper:
“The first fuzzy lens is that of natural variability of the climate system such as El Niño, Pacific Decadal Oscillations, solar variability, and volcanic eruptions. Most of this natural variability is caused by nonlinear interactions between the atmosphere and oceans that create “noise” in the climate system (e.g. El Niño). The noise of natural variability delays the time it takes to rigorously detect anthropogenic climate signals [Weatherhead et al., 1998; Leroy et al., 2008] and can confuse the public (e.g. the so-called “hiatus” of warming from 1998 to 2013).”

With all that vital stuff brushed aside, they can then focus on their ‘Climate Observing System Simulation Experiments’ to show how many trillions can be saved as the planet warms, how wonderful.

A bit of research on sea surface temperatures I’ve posted before, but worth bearing in mind, especially considering the methodology of the so-called “Pause Buster” paper:

In the decades before the buoy networks achieved significant coverage of the oceans, ocean temperature data was acquired mainly from ships’ engine room water inlet temperature readings.

Ship’s engine cooling water inlet temperature data is acquired from the engine room cooling inlet temperature gauges by the engineers at their convenience, i.e. when they can be bothered.

There is no standard for the location of the inlets (especially their depth below the surface), for the position of the measuring instruments in the pipework, or for the time of day the reading is taken. The inlets can be anywhere from close to the point where the water comes into the ship to well inside the engine room, close to the machinery.

The instruments themselves are of industrial quality: their limit of error per DIN EN 13190 is ±2 °C for a class 2 instrument, or sometimes even ±4 °C, as can be seen in the tables here: DS_IN0007_GB_1334.pdf. After installation it is exceptionally unlikely that they are ever checked for calibration.

It is not clear how such readings can be compared with readings from buoy instruments specified to a limit of error of tenths or even hundredths of a degree C, or why they are considered to have any value whatsoever for the purposes to which they are put, which is to produce historic trends apparently precise to 0.001 deg C, upon which spending of literally trillions of £/$/whatever is decided.
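The statistical issue can be sketched quickly: averaging many readings shrinks the *random* part of a ±2 °C error roughly as 1/√N, but a systematic offset shared by the readings (say, engine-room warming of the inlet water) never averages away. The bias and temperature values below are hypothetical, purely for illustration:

```python
import random

random.seed(0)

TRUE_SST = 15.0  # hypothetical true sea-surface temperature, deg C
BIAS = 1.2       # hypothetical shared systematic offset (engine-room warming)
SIGMA = 2.0      # per-reading random scatter, matching a +/-2 C instrument

# 100,000 simulated engine-room inlet readings:
readings = [TRUE_SST + BIAS + random.gauss(0.0, SIGMA) for _ in range(100_000)]
mean = sum(readings) / len(readings)

# Averaging beats the random part down to ~2/sqrt(N) here,
# but the systematic 1.2 C offset survives untouched.
print(round(mean - TRUE_SST, 1))  # ~1.2: the bias does not average away
```

So a large archive of ship readings can yield a very *precise* mean that is still *inaccurate*, and no amount of averaging recovers accuracy the instruments never had.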

But hey, this is climate “science” we’re discussing so why would a little thing like that matter?