This concentration of power is simply antithetical to any free democratic society, especially when trillions of dollars are allocated based upon the data these organizations produce. The concentration creates the possibility of what economists call "regulatory capture," a threat serious enough that President Eisenhower warned the nation about it in his farewell address. The recent protests by EPA employees against President Trump's nominee to run the agency are a clear sign the organization no longer represents the will of the people. In another branch of the government, that kind of behavior would get you court-martialed. Much like the Trusts of the early 20th Century, it is time to break up the government "Robber Barons," starting with the EPA.

What is "Regulatory Capture"?

Regulatory capture is a theory associated with George Stigler, a Nobel laureate economist. It is the process by which regulatory agencies eventually come to be dominated by the very industries they were charged with regulating. Regulatory capture happens when a regulatory agency, formed to act in the public’s interest, eventually acts in ways that benefit the industry it is supposed to be regulating, rather than the public.

How then do we decide which temperature measurement is the best and most accurate? Real science already gives us the answer. In any real science, experiments are performed in a pretty standard manner.

The equipment used is state of the art

Measurements are taken in a very consistent, methodical manner

Confirmation of the measurements is made with other sources

Exogenous factors are controlled for

Raw data requires limited “adjustments”

Measure what is intended to be measured

The equipment used is state of the art:

The ground measurements use a variety of low-tech, highly inaccurate, easily distorted, unevenly distributed and widely different methods of temperature measurement. The recent NOAA Whistleblower case highlighted the problem with some of these data sources. Ground measurements are largely concentrated in the Northern Hemisphere, cover very little of the oceans, use various methods centered around the ancient technology of thermometers, are inconsistent, and suffer from a well-known anomaly called the "urban heat island effect." This video clip highlights the dynamic and inconsistent method by which ground temperature measurements are taken. Because of the lack of coverage, the ground measurements have to "extrapolate" temperatures to develop a global temperature map. An embarrassing flaw in the system was exposed when recent claims of record high temperatures were supported by "data" from areas where there are no ground measurements…ooops. It is also important to note that the Climategate emails and the NOAA Whistleblower exposed flaws in the ground measurements, not the satellite measurements.
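The extrapolation step described above can be illustrated with a toy sketch. This is not the code any agency actually uses (assumption: a simple inverse-distance-weighting scheme with made-up station readings); it just shows how a "temperature" can be reported for a grid cell containing no thermometer at all.

```python
import math

# Hypothetical station readings: (latitude, longitude, temperature in C).
# These values are invented for illustration.
stations = [(40.0, -75.0, 12.3), (42.0, -71.0, 10.8), (34.0, -118.0, 18.5)]

def idw_estimate(lat, lon, stations, power=2.0):
    """Inverse-distance-weighted 'temperature' for a point with no station.

    Any grid cell, no matter how far from a real instrument, still gets
    a number -- which is the point being made about extrapolation.
    """
    num = den = 0.0
    for slat, slon, temp in stations:
        d = math.hypot(lat - slat, lon - slon)  # crude flat-map distance
        if d == 0:
            return temp  # exactly on a station
        w = 1.0 / d ** power
        num += w * temp
        den += w
    return num / den

# A cell in the middle of nowhere, far from any thermometer:
print(round(idw_estimate(30.0, -100.0, stations), 2))
```

The estimate always lands between the warmest and coolest station values, no matter how unrepresentative those stations are of the empty cell.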


Satellite data, on the other hand, meet all the requirements of good scientific measurement techniques. You don't get more state of the art than NASA satellites. In fact, the people who run the NASA satellites are so proud of how well they run their system that they write articles about it.

An incredible amount of work has been done to make sure that the satellite data are the best quality possible. Recent claims to the contrary by Hurrell and Trenberth have been shown to be false for a number of reasons, and are laid to rest in the September 25th edition of Nature (page 342). The temperature measurements from space are verified by two direct and independent methods. The first involves actual in-situ measurements of the lower atmosphere made by balloon-borne observations around the world. The second uses intercalibration and comparison among identical experiments on different orbiting platforms. The result is that the satellite temperature measurements are accurate to within three one-hundredths of a degree Centigrade (0.03 C) when compared to ground-launched balloons taking measurements of the same region of the atmosphere at the same time.
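The cross-check described in the quote can be sketched in a few lines: compare two independent temperature series for the same region of the atmosphere and quantify their agreement. The numbers below are invented stand-ins, not real UAH or radiosonde data; the point is only the shape of the comparison.

```python
# Invented monthly anomalies (deg C) for the same atmospheric region,
# measured two independent ways.
satellite = [0.12, 0.15, 0.09, 0.21, 0.18]  # satellite retrievals
balloon   = [0.11, 0.17, 0.08, 0.22, 0.16]  # co-located balloon soundings

diffs = [s - b for s, b in zip(satellite, balloon)]
mean_abs_diff = sum(abs(d) for d in diffs) / len(diffs)
print(f"mean absolute difference: {mean_abs_diff:.3f} C")
```

With real data, a small mean difference like this is what backs a claim such as "accurate to within 0.03 C."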

Measurements are taken in a very consistent, methodical manner:

The ground measurements are taken using countless different instruments, differing locations and times, different staff with different skills (and sobriety, when taken on a college campus on a Saturday morning), and different methods of recording. The ground measurements use mercury, alcohol, bimetallic and other thermometers, located in various types of Stevenson screens, ocean buoys, and ships. Anthony Watts of the WattsUpWithThat blog maintains an ongoing weather station audit. I participated in a few weather station audits and was personally shocked to learn how these measurements are actually taken (someone literally goes out, looks at a thermometer, and records the temperature using a pencil). My conclusion was that these measurements were never intended to justify spending programs measured in the trillions of dollars. Such responsibility would never be entrusted to college kids just looking to make a couple extra bucks…or so I thought. This video clip highlights how inconsistent the ground measurement locations are, and the ship-borne measurements are even worse.

The satellite record, by contrast, measures the Earth's lower atmosphere from orbit, and these data are exceedingly precise, verified by multiple satellite observations and by balloon measurements taken in-situ.

Confirmation of the measurements is made with other sources:

There are three main measurement approaches: 1) ground and sea measurements, 2) balloon, and 3) satellite. As the quote above states, balloon and satellite measurements confirm each other; they do not confirm the ground measurements.

Exogenous factors are controlled for:

The surface station measurements are corrupted by a well-known problem called the "urban heat island effect," but I would argue that is just the start of it. The thermometers aren't standardized, the locations are often changed, the locations are often out of specification, the staff training for recording the measurements is highly variable, locations are added and dropped, coverage of the globe is only partial, data is collected at many, many independent sites and compiled at a few government-run centers, and the list goes on and on and on. Once that data is collected, a few "experts" make "adjustments" to the data to account for the known "errors." The recent NOAA Whistleblower case exposed the weakness of trusting "experts," whose job security is dependent upon the conclusions reached from the data, to make "adjustments." It turns out all the "adjustments" tend to favor the desired outcome.
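One simple way to probe a claim like "the adjustments tend to favor the desired outcome" is a sign count: if adjustments were unbiased error corrections, roughly half should warm the record and half should cool it. The adjustment values below are hypothetical, invented only to show the test.

```python
# Invented per-station adjustment values in deg C (NOT real NOAA numbers).
adjustments = [0.08, 0.05, -0.01, 0.11, 0.04, 0.07, -0.02, 0.09]

warming = sum(1 for a in adjustments if a > 0)
cooling = sum(1 for a in adjustments if a < 0)
print(f"warming adjustments: {warming}, cooling: {cooling}")
```

A lopsided split like this one would invite a formal binomial test before drawing any conclusion; a near 50/50 split would be consistent with unbiased corrections.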

Satellite data, on the other hand, is collected by relatively few highly standardized polar orbiting satellites, collecting highly standardized data, run by relatively few highly skilled professionals, and compiled in a rather transparent manner at a single location. The integrity of the satellite data is impeccable, whereas the integrity of the ground measurements is almost non-existent. Unlike the ground measurements, the accountability regarding the satellite measurements is clearly defined. If there is any monkey business regarding the satellite data, you can blame Dr. Christy and Dr. Spencer at UAH (the University of Alabama in Huntsville).

Every month, John Christy and I update global temperature datasets that represent the piecing together of the temperature data from a total of fourteen instruments flying on different satellites over the years. A discussion of the latest version (6.0) of the dataset is located here.
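The "piecing together" of successive instruments can be sketched in miniature: estimate the bias between two satellites from the period when both were flying, then shift the newer series onto the older one's baseline. This is a simplified stand-in for the UAH merge procedure, with invented numbers, not their actual algorithm.

```python
# Invented annual anomalies (deg C) from two hypothetical satellites.
sat_a = {2000: 0.10, 2001: 0.12, 2002: 0.15, 2003: 0.14}
sat_b = {2002: 0.25, 2003: 0.24, 2004: 0.30, 2005: 0.28}  # reads warm

# Estimate the inter-satellite bias from the overlap years.
overlap = sorted(set(sat_a) & set(sat_b))
offset = sum(sat_b[y] - sat_a[y] for y in overlap) / len(overlap)

# Splice: keep sat_a, and shift sat_b onto sat_a's baseline.
merged = dict(sat_a)
for year, value in sat_b.items():
    if year not in merged:
        merged[year] = value - offset

print(f"estimated offset: {offset:.2f} C")
print(sorted(merged.items()))
```

The real record chains fourteen instruments this way rather than two, which is why the overlap periods matter so much.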

Raw data requires limited "adjustments":

Ground and sea temperature measurements undergo not only adjustments to current data but, for some reason, appear to undergo continual "retroactive adjustments." The "adjustment" process is made in an extremely non-transparent and secretive manner, resulting in a cottage industry of temperature tampering detectives. These super sleuths make a mockery of the ground measurement data by documenting its progression. The Climategate emails and the NOAA Whistleblower exposed the corruption of the ground measurement "adjustment" process.
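What those detectives do is, at bottom, a diff: compare two archived snapshots of the same historical series and flag values that changed retroactively. The snapshots below are invented for illustration.

```python
# Two hypothetical archived versions of the same historical record
# (year -> reported annual temperature in C). Values are invented.
snapshot_2010 = {1935: 15.2, 1936: 15.0, 1937: 15.3, 1998: 15.9}
snapshot_2017 = {1935: 15.0, 1936: 14.9, 1937: 15.3, 1998: 16.0}

# Flag every year whose already-published value was later changed.
changed = {y: round(snapshot_2017[y] - snapshot_2010[y], 2)
           for y in snapshot_2010
           if snapshot_2017[y] != snapshot_2010[y]}
print(changed)
```

In this made-up example the past cooled and the recent year warmed; with real archives (e.g. Wayback Machine captures), the same diff shows exactly which historical values moved and by how much.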

Satellite data undergoes a rather uniform and transparent “adjustment” process which has remained relatively scandal-free. It, of course, has been attacked, but those attacks have been refuted.

“In particular, we’ve examined these two `breaks’ claimed by Hurrell and Trenberth. Even in these disputed intervals, we find excellent agreement between the two independent, direct atmospheric temperature measurements from balloons and satellites.”

Measure what is intended to be measured:

All this data is being used to push the false narrative that man-made CO2 is causing undesirable climate change/global warming, which is odd considering many of the "experts" pushing this agenda live on the coast in sunny, warm Malibu. Anyway, if one wants to measure the impact CO2 has on the temperature, one would focus on the areas of the globe and atmosphere where the impact of CO2 is isolated. Nearly 100% of ground and sea measurements are taken in the lowest 1 km of the atmosphere; that is why they are called ground and sea measurements. According to MODTRAN, the Air Force-developed program widely used for modeling the atmosphere, CO2 has absolutely no impact on the lowest 1 km of the atmosphere, at least up to twice the CO2 level we have today. Let me repeat that. CO2 has no measurable impact on the lowest 1 km of the atmosphere for a doubling of atmospheric CO2. In other words, the ground and sea measurements are located in the area of the atmosphere where CO2 has no measurable impact. Pay attention to the Upward IR Heat Flux value and CO2 (ppm).
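The band-saturation argument behind this claim can be illustrated with a toy Beer-Lambert calculation. This is NOT MODTRAN, and the optical depths are invented; it only shows the mechanism: when a band is already nearly opaque from a dominant absorber (standing in for H2O), doubling a minor absorber (standing in for CO2) changes the transmitted flux by very little in absolute terms.

```python
import math

def transmitted(flux_in, tau_dominant, tau_minor):
    """Beer-Lambert transmission through one absorbing layer."""
    return flux_in * math.exp(-(tau_dominant + tau_minor))

flux_in = 100.0   # arbitrary W/m^2 entering the band (invented)
tau_h2o = 5.0     # hypothetical: band nearly saturated by H2O alone
tau_co2 = 0.5     # hypothetical minor-absorber optical depth

before = transmitted(flux_in, tau_h2o, tau_co2)
after = transmitted(flux_in, tau_h2o, 2 * tau_co2)  # "doubled CO2"
print(f"before: {before:.4f} W/m^2, after doubling: {after:.4f} W/m^2")
```

Because the layer is already nearly opaque, both numbers are a tiny fraction of the incoming flux and the absolute change from "doubling" is small; a real calculation has to integrate over altitude and wavelength, which is what MODTRAN does.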


The first signs of a CO2 signature appear about 3 km up in the atmosphere, after H2O has started to precipitate out. The one area of the earth where ground measurements could identify a CO2 signature is the South Pole, where the air is extremely dry. The South Pole shows no warming over the past 50+ years, so in the one region where ground measurements could make the case for CO2-driven global warming, they show just the opposite. Satellite and balloon measurements, on the other hand, measure the layers of the atmosphere where CO2 would have an impact on temperature.

Where Am I Going With This?

One of the biggest mysteries I've been trying to solve is "how accurate are the ground measurements?" All the spending programs are based upon the ground measurements being accurate. All the IPCC models rely on the ground measurements. A whole lot is riding on the accuracy of those measurements. The problem is, events like the Climategate emails, the "Hockeystick," and now the NOAA Whistleblower all raise serious questions as to their reliability and accuracy. The complete and utter failures of the models that use these data sets don't help the case.

From the above analysis of the data sets, a simple solution to greatly improve the accuracy of the ground measurements should be apparent. Any real scientist facing these challenges would simply remove the questionable data sets. CO2 evenly blankets the globe; there is no need for temperature measurements to be concentrated in urban areas where countless factors other than CO2 impact the proximal temperature. CO2 is 400 ppm in the city and 400 ppm 10 miles away in the farmland. If one is trying to get an accurate reading of temperature and the impact of CO2, one would remove the urban temperature and use the temperature taken in the farmland. That is how a real scientist seeking the truth would "adjust" the data set. Unfortunately, the people in charge of the ground measurements did just the opposite: they dropped the non-urban thermometers, thousands of them. That isn't a joke. How this is not criminal data tampering is beyond me.
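The "adjustment" argued for above is easy to express in code: classify stations, then compute the trend with and without the urban ones. The stations and per-station trends below are invented to show the mechanics, not measured values.

```python
# Hypothetical station list with invented warming trends (C per decade).
stations = [
    {"name": "downtown", "urban": True,  "trend_c_per_decade": 0.35},
    {"name": "airport",  "urban": True,  "trend_c_per_decade": 0.28},
    {"name": "farmland", "urban": False, "trend_c_per_decade": 0.05},
    {"name": "mountain", "urban": False, "trend_c_per_decade": 0.02},
]

def mean_trend(group):
    return sum(s["trend_c_per_decade"] for s in group) / len(group)

all_trend = mean_trend(stations)
rural_trend = mean_trend([s for s in stations if not s["urban"]])
print(f"all stations: {all_trend:.3f} C/decade, rural only: {rural_trend:.3f} C/decade")
```

The gap between the two numbers in this toy example is, by construction, the urban contamination; with real station metadata the same comparison is how one would measure it.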

One of the most unheralded means by which this temperature “shaping” occurs has been the tendentious and wholesale removal of thousands of weather station land thermometers from remote, high altitude, and/or non-urban locations since the 1970s. These are stations which do not show the warming trends predicted by models, as they are not affected by proximity to artificial or non-climatic heat sources (pavements, buildings, machinery, industry, etc.) like urban weather stations are. (As detailed below, locating thermometers near urban heat sources can cause warming biases of between 0.1 and 0.4°C per decade.)


Fortunately, there are a few real scientists left out there, and one of them is Dr. “Willie” Soon from the Harvard-Smithsonian Center for Astrophysics. Recently he gave a talk to the American Freedom Alliance where he covered his research. Click here to watch the video. It is outstanding, entertaining, empowering and educational. What did Dr. Soon discover when the offending data was removed? Basically, all the warming demonstrated in the current ground measurement reconstructions mysteriously disappears…poof. Imagine that, applying sound scientific practices to the data sets magically makes the illusion of warming disappear. His research shows that temperatures are basically unrelated to CO2, and instead, are highly correlated with, you guessed it, the sun. Imagine that, the thing that warms the earth every morning, drives the seasons, and provides basically 100% of all incoming warming radiation, that giant nuclear reactor in the sky, is impacting global temperatures. Who wood’a thunk it?
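The comparison behind a claim like "highly correlated with the sun" is a Pearson correlation between a temperature series and a candidate driver. The series below are synthetic, constructed so that temperature tracks the "solar" input; they illustrate the computation only, not Dr. Soon's actual data.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Synthetic series, built so temperature follows the solar input.
solar = [1360.5, 1361.0, 1360.8, 1361.3, 1361.1]  # made-up TSI, W/m^2
temp  = [14.1,   14.4,   14.3,   14.6,   14.5]    # made-up temps, C

print(f"r = {pearson(solar, temp):.3f}")
```

With real datasets one would run the same function against both candidate drivers (TSI and CO2) and compare the two coefficients; correlation alone, of course, does not settle causation.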

Can you explain just what you mean in the MODTRANS figures? I am a geologist, and I need to know what you are saying here. The explanations are just a bit cryptic (for a geologist). Are you saying that in MODTRANS at 1km height, there are no absorption bands showing in the profiles, compared with, for example, heights of 70km. In turn, that means there is no downwards radiative effect of CO2 etc at 1 km height?

All help would be appreciated. My apologies for any ignorance displayed.

MODTRAN is the program people use to model the atmosphere. You can either look down at the earth or up to the sky. The earth emits a certain level of W/m^2; deviations from that show the amount of energy "trapped" in the atmosphere. If you notice, doubling CO2 has no impact on the W/m^2 looking down from 1 km.

“Are you saying that in MODTRANS at 1km height, there are no absorption bands showing in the profiles, compared with, for example, heights of 70km. In turn, that means there is no downwards radiative effect of CO2 etc at 1 km height?”

CO2 and H2O both absorb 13 to 18 micron IR. H2O is far more abundant and a far more potent GHG, so anywhere CO2 and H2O mix in the atmosphere, CO2 essentially becomes irrelevant. It is like putting an additional 20th layer of black paint on a window. With or without CO2, the W/m^2 in the lower 1 km of the atmosphere remains the same. Only at higher altitudes, after H2O precipitates out, do you start to see a CO2 signature. The best control for CO2 is Antarctica, where the air is very cold and very, very dry. When you isolate the impact of CO2, there is no evidence of warming over the past 50 years, even though CO2 has increased significantly. Hope that helps.
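The black-paint analogy can be put in numbers. Assume (purely for illustration) that each coat of paint transmits only 10% of the light hitting it; then the 20th coat changes the light getting through by a negligible absolute amount.

```python
# Toy version of the "20th coat of black paint" analogy. The 10%
# per-coat transmission figure is invented for illustration.
def through(coats, per_coat_transmission=0.10):
    """Fraction of light transmitted through a stack of identical coats."""
    return per_coat_transmission ** coats

print(through(1))    # one coat: 10% gets through
print(through(19))   # 19 coats: essentially opaque already
print(through(20))   # the 20th coat changes almost nothing in absolute terms
```

The analogy maps the dominant absorber (H2O) to the first 19 coats and CO2 to the 20th: in a band that is already saturated, the marginal absorber has little additional absolute effect.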