Thirty Years of Error? In 1988, the Intergovernmental Panel on Climate Change (IPCC) was created by the UN under the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP), pursuant to a resolution of the UN General Assembly, to address possible future, human-induced climate change. The reports of the IPCC support the United Nations Framework Convention on Climate Change (UNFCCC). The objective of the UNFCCC is “stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system.” [Boldface added.]

The US Senate ratified the treaty in 1992, making the US a party to the UNFCCC.
As readers of TWTW realize, the IPCC estimates future “anthropogenic interference with the climate system” by using complex mathematical models prepared by others. As discussed in past TWTWs, these models fail basic testing – they fail to describe what is occurring in the atmosphere as greenhouse gases change.
More particularly, as discussed in the September 15 and September 22 TWTWs, the publicly archived model runs from the 20 modeling groups participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) of the IPCC, as a group, greatly overestimate the warming trend occurring in the atmosphere. When specifically tested against the warming of a layer of the tropical troposphere at 200 to 300 millibar, about 30,000 to 40,000 feet (9,100 to 12,200 m), they greatly overestimate the warming trends shown by three different datasets from weather balloons taken over the past 60 years. The test can be referred to as the McKitrick-Christy Hypothesis Test.

This failure prompts the question: why? Given the extent of modeling expertise, the number of modeling groups involved, and the years of investigation, one must assume that the overestimate is not from a mathematical error in the models. Rather, it may be a systematic error in the thinking that goes into the formulation of these models.
The history of the IPCC by its first chairman, Swedish meteorologist Bert Bolin, gives a clue. Writing a chapter in the Encyclopedia of Life Support Systems after he left the IPCC, he states:

“The realization that human activities might change the global climate was not new. Already at the end of the nineteenth century Svante Arrhenius, professor of chemistry at Stockholm’s Högskola (University), deduced that the global mean temperature might increase by 5°C–6°C if the carbon dioxide concentration in the atmosphere were doubled.”

This claim is not correct. As discussed in last week’s TWTW, in his 1895 paper Arrhenius wrote:

“…temperature of the Arctic regions would rise about 8 degrees or 9 degrees Celsius, if the carbonic acid [CO2] increased 2.5 to 3 times its present value. In order to get the temperature of the ice age between the 40th and 50th parallels, the carbonic acid in the air should sink to 0.62 to 0.55 of present value (lowering the temperature 4 degrees to 5 degrees Celsius).”

In his later 1906 paper Arrhenius revised his estimates, writing:

“In a similar way, I calculate that a reduction in the amount of CO2 by half, or a gain to twice the amount, would cause a temperature change of – 1.5 degrees C, or + 1.6 degrees C, respectively.”

Since Bolin died in 2007, we may never know if he intentionally misrepresented the writings of Arrhenius, a fellow Swede. The difference between what Arrhenius first wrote and later wrote is significant.

Regardless, when Arrhenius wrote, the concept of the planet cooling by outgoing infrared radiation was not fully developed. His calculations did not have the benefit of 20th century research on the absorption and re-radiation effects of greenhouse gases together in the atmosphere and individually. It was pure speculation.

Further, Bolin worked with Jule Charney, the head of the group that produced the 1979 Charney Report, which claimed that the modest effect of carbon dioxide would be greatly amplified by the major greenhouse gas, water vapor. However, this was pure speculation because there were no comprehensive measurements of atmospheric temperature trends.
Also, Bolin discusses a second assessment report, produced in 1982 by the US National Research Council and headed by Joseph Smagorinsky, Director of the Geophysical Fluid Dynamics Laboratory of the National Oceanic and Atmospheric Administration (NOAA), which is located at Princeton. It found nothing wrong with the Charney Report, except that atmospheric greenhouse gases were rising more quickly than expected.

It found: “The 1979 Charney report estimated the equilibrium global surface warming from a doubling of CO2 to be ‘near 3°C with a probable error of ±1.5°C.’ No substantial revision of this conclusion is warranted at this time.”
The Smagorinsky report “validated its climate models [from] tests of the correctness of the models’ representation of the physical processes and from comparisons of the models’ responses to known seasonal variations.”

Interestingly, the report states:

“…Because decisions of immense social and economic importance may be made on the basis of model experiments, it is important that a comprehensive climate-model validation effort be pursued, including the assembly of a wide variety of observational data specifically for model validation and the development of a validation methodology.
“Validation of climate models involves a hierarchy of tests, including checks on the internal behavior of subsystems of the model. The parameters used in comprehensive climate models are explicitly derived, as much as possible, from comparisons with observations and/or are derived from known physical principles. Arbitrary adjustment or tuning of climate models is therefore greatly limited.

“The primary method for validating a climate model is to determine how well the model-simulated climate compares with observations. Comparisons of simulated time means of a number of climatic variables with observations show that modern climate models provide a reasonably satisfactory simulation of the present large-scale global climate and its average seasonal changes.

“More complete validation of models depends on assembly of suitable data, comparison of higher-order statistics, confirmation of the models’ representation of physical processes, and verification of ice models.

“One test of climate theory can be obtained from empirical examination of other planets that in effect provide an ensemble of experiments over a variety of conditions. Observed surface temperatures of Mars, Earth, and Venus confirm the existence, nature, and magnitude of the greenhouse effect.” [Boldface was italics in the original.]

The IPCC and its followers have not performed the rigorous testing required for model validation. The IPCC testing is limited to determining which better describes the data used: models with a calculated CO2 effect or models without one. This is hardly rigorous and involves the use of the same data as was used to tune the models. Such testing is a form of circular reasoning.

The McKitrick-Christy Hypothesis Test avoids using data that was used to tune the models, thus avoiding circular reasoning. Further, the test uses three datasets and shows that, whatever amplification is occurring, it is very modest. There is no empirical justification for the Charney Report estimate that the warming from a doubling of CO2 would be “near 3°C with a probable error of ±1.5°C.”

Modern atmospheric temperature trends include the total effect of greenhouse gases, including CO2 and water vapor. Perhaps this is why the IPCC and its followers stick with surface measurements starting in the 1880s. The warming of the atmosphere does not indicate a “dangerous anthropogenic interference with the climate system.”

The above analysis indicates that the IPCC and its followers, such as the US Global Change Research Program (USGCRP), rely on 19th century thinking and 19th century measurement techniques. As such, the IPCC has maintained erroneous scientific thinking throughout its thirty years of existence. See links under Defending the Orthodoxy, the September 15 & 22 TWTWs and https://unfccc.int/resource/docs/co…

Electricity Costs in Germany: According to reports, small and mid-sized businesses, as well as retail consumers, are paying the price for German policies of closing nuclear power plants and adding unreliable solar and wind power to the mix. According to the EU statistical arm Eurostat, as reported by Bloomberg, Germany has the highest electricity prices for household consumers (taxes included, second half of 2017) at €0.305/kWh; Denmark is slightly below that. The EU average is €0.205/kWh. For their environmental purity, the Germans and Danes are paying about 50% more for their electricity than the European Union average.
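As a quick check on the arithmetic, the reported Eurostat figures can be compared directly (a minimal sketch using only the two household prices quoted above):

```python
# Household electricity prices, taxes included, second half of 2017
# (EUR/kWh), as quoted from Eurostat via Bloomberg.
germany = 0.305
eu_average = 0.205

# Premium German households pay over the EU average.
premium_pct = (germany - eu_average) / eu_average * 100
print(f"Germany pays {premium_pct:.0f}% more than the EU average")
# Works out to about 49%, consistent with "about 50% more."
```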

Another report in Bloomberg states that the doubling of electricity prices in Germany since 2016 is causing problems for the “Mittelstand” – small and mid-sized companies, often family firms, which employ about 32 million people, over 60% of the German workforce, and almost all of which “have sales of less than 1 million euros.” The report notes: “While 2,000 corporate giants like Volkswagen AG and chemicals maker BASF SE have their own power plants and get exemptions from environmental tariffs, smaller companies pay more to absorb those costs.”

The same article gives a breakdown of German electricity prices: Network Costs (grid costs, including the costs of making electricity reliable) – 26%; Feed-in Tariff (payments for renewable (unreliable) electricity) – 24%; Power Generation – 19%; Sales Tax – 16%; Electricity and Other Taxes – 10%; and Concession Payment – 6%. Did California legislators and Governor Jerry Brown tell voters what to expect as they go 100% renewable, while shutting down nuclear?
Are the governors of New York, Virginia, etc. telling voters what to expect as they promote unreliable renewables? Thanks to hydraulic fracturing and horizontal drilling, the US has inexpensive natural gas, which many of the same politicians are trying to prevent utilities and consumers from receiving. See links under Energy Issues – Non-US, California Dreaming, and https://ec.europa.eu/eurostat/stati…
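The reported breakdown can be tabulated as a rough sanity check. Note the quoted shares sum to 101%, presumably from rounding; combining them with the Eurostat household price is this sketch's assumption, not the article's:

```python
# Reported percentage breakdown of German electricity prices (Bloomberg).
shares = {
    "Network costs (incl. reliability)": 26,
    "Feed-in tariff (renewables)": 24,
    "Power generation": 19,
    "Sales tax": 16,
    "Electricity and other taxes": 10,
    "Concession payment": 6,
}

# The quoted shares sum to 101%, presumably due to rounding.
print(sum(shares.values()))

# Illustrative EUR/kWh per component, assuming the 0.305 EUR/kWh
# household price quoted above (an assumption: the article does not
# itself combine the two figures).
household_price = 0.305
for component, pct in shares.items():
    print(f"{component}: {pct / 100 * household_price:.3f} EUR/kWh")
```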

Antarctic Melt: Writing in ScienceNordic, Antarctic researcher Valentina Barletta of DTU Space, Technical University of Denmark, describes what GPS sensors show is happening in the Amundsen Sea Embayment. This is part of the West Antarctic Ice Sheet, which the IPCC, NASA-GISS, and NOAA claim is collapsing.
Using these data, Barletta estimates the position of the ice sheet grounding line today, 10,000 years ago, and at the end of the last ice age, about 18,000 years ago. She estimates that the Earth's crust in this area is softer, more plastic, than data from other places such as North America suggest, and that it rebounds from the loss of ice weight faster than generally assumed.
Such research may pose problems for the IPCC, NASA-GISS, and NOAA, which claim that sea level rise is increasing exponentially and will continue to do so. Part of their claim is that the ocean basins are sinking, disguising actual water accumulation; thus, the tidal gauges along coastlines are giving an inaccurate reading of sea level rise, and only the modelers, adjusting the data, give a proper estimate of “true sea level rise.” TWTW considers the claim that ocean basin sinking is disguising sea level rise along the same lines as James Hansen telling Haapala that atmospheric warming is being hidden by the Southern Oceans. See links under Changing Cryosphere and Mr. Hansen’s web site, somewhere.

Sea Level Rise, Real or Imaginary? Writing in WUWT, James Steele, Director emeritus of San Francisco State University’s Sierra Nevada Field Campus, discusses some of the problems he has encountered in trying to prevent shoreline erosion. Similar problems occur when trying to prevent storm surges and flooding from rainfall. Steele’s residence is in Pacifica, along the Pacific coast on the peninsula south of San Francisco. The area is marked by high bluffs subject to coastal erosion from the sea.
The California Coastal Commission is using NOAA projections of exponential sea level rise of 3 to 10 feet by the end of the century to prevent any reasonable protections against coastal erosion. These projections are similar to the claims made by Bay Area municipalities in suits against oil companies for damages from the use of fossil fuels – claims ignored in the municipalities' own bond solicitations.
There is no excuse for the incompetence exhibited by NOAA, NASA-GISS, and others in producing reports of exponential sea level rise. These Federal government organizations are damaging the ability of communities to address practical problems, causing real harm.
CBS reports an interview with the “Dutch Water Ambassador,” who discusses protective measures against storms. In 1953, The Netherlands experienced a disastrous storm from the North Sea and committed to not letting it happen again, while maintaining one of the largest ports in Europe, Rotterdam. The cost is huge: according to the report, the Dutch allocate more than a billion dollars a year to manage their flood infrastructure. Like New Orleans, much of The Netherlands is below sea level, requiring constant pumping.
Unfortunately, the article fails to distinguish between problems created by storm surges, those created by floods from rainfall, and those created by both. The defensive measures differ, depending on the major threat. For example, with Sandy it was storm surge from the ocean; with Katrina it was storm surge through Lake Pontchartrain; with Harvey it was rainfall; and with Florence it was both, first storm surge, then rainfall.
Further, the article fails to discuss the intense objections raised by environmental organizations, invoking the National Environmental Policy Act (NEPA), against engineering measures such as those used by the Dutch. Using NEPA, environmental groups successfully stopped protective measures proposed by the Army Corps of Engineers, well before Katrina, to prevent storm surges from flooding New Orleans. See links under Change in US Administrations, Changing Weather, and Changing Seas.

Number of the Week: $90,000 to $95,000 per day. “The rate for vessels shipping LNG from the Atlantic Basin to Asia has jumped to $US90,000 to $US95,000 a day this week, from $US75,000 a day at the end of August, brokers and traders said.” Assume the cost is only $US60,000 per day for a $200,000,000 LNG tanker. If the shipper takes the promoted “soon to become” trans-Arctic route and is suddenly frozen in for 9 months (270 days), the shipper would face some $16,000,000 in leasing fees alone. The costs of provisions for the crew and the danger of ice crushing the hull are another matter. No wonder few shipping companies are betting on the Arctic becoming “ice-free.” See link under Energy Issues – Non-US.
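The Number of the Week arithmetic can be checked directly (a sketch using the deliberately conservative $60,000/day rate assumed above):

```python
# Assumed conservative lease rate from the paragraph above (well below
# the quoted $90,000-95,000/day spot rates).
day_rate_usd = 60_000
days_frozen = 9 * 30  # roughly nine months iced in, i.e. 270 days

# Leasing fees alone for a tanker stuck over an Arctic winter.
lease_cost = day_rate_usd * days_frozen
print(f"${lease_cost:,}")  # $16,200,000, i.e. "some $16,000,000"
```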

ARTICLES:
1. A Cornell Scientist’s Downfall
Brian Wansink’s tale shows that academic peer review is broken.
By David Randall, WSJ, Sep 25, 2018
https://www.wsj.com/articles/a-corn…
SUMMARY: The director of research at the National Association of Scholars writes:

“The irreproducibility crisis cost Brian Wansink his job. Over a 25-year career, Mr. Wansink developed an international reputation as an expert on eating behavior. He was the main popularizer of the notion that large portions lead inevitably to overeating. But Mr. Wansink resigned last week as head of the Food and Brand Lab at Cornell University and professor at the Cornell SC Johnson College of Business after an investigative faculty committee found he had committed a litany of academic breaches: ‘misreporting of research data, problematic statistical techniques, failure to properly document and preserve research results’ and more.

“Mr. Wansink defended himself to a reporter, claiming his work featured ‘no fraud, no intentional misreporting, no plagiarism, [and] no misappropriation.’ As of this week, however, he has ceased research, and he will retire at the end of the academic year.

“Mr. Wansink’s fall from grace began with a 2016 blog post in which he blithely confessed to using improper research techniques known as p-hacking and HARKing. P-hacking involves running statistical analyses until they produce a statistically significant result; HARKing stands for ‘hypothesizing after the results are known.’ The post prompted a small group of skeptics to take a hard look at Mr. Wansink’s past scholarship. Their analysis, published in January 2017, turned up an astonishing variety and quantity of errors in his statistical procedures and data.

“In April 2017, Columbia University statistician Andrew Gelman charged in his blog that Mr. Wansink was guilty of ‘serious research misconduct: either outright fraud by people in the lab, or such monumental sloppiness that data are entirely disconnected from context, with zero attempts to fix things when problems have been pointed out.’ Mr. Wansink has said he expects to be vindicated one day.

“Cornell’s public judgment on Mr. Wansink is a milestone in the campaign to change how science works. Academic institutions have been slow to accept the gravity of the so-called irreproducibility crisis – the wide use of faulty research techniques that regularly produce results other scientists can’t replicate. Cornell is a force in the science world. Its actions are a sign that other academic institutions may at last be willing to change the culture of scientific research.

“A generation of Mr. Wansink’s journal editors and fellow scientists failed to notice anything wrong with his research – a powerful indictment of the current system of academic peer review, in which only subject-matter experts are invited to comment on a paper before publication. Mr. Wansink’s resignation, on the other hand, points to the possibility of a cross-disciplinary approach to evaluating the reproducibility of scientific research. This new approach could even include criticism by nonscientists.

“P-hacking, cherry-picking data and other arbitrary techniques have sadly become standard practices for scientists seeking publishable results. Many scientists do these things inadvertently, not realizing that the way they work is likely to lead to irreplicable [sic. irreproducible] results. Let something good come from Mr. Wansink’s downfall.”