Paul Ehrlich predicted an imminent population catastrophe—Julian Simon wagered he was wrong.

By

JONATHAN V. LAST

WSJ.COM 8/30/13: It is difficult to comprehend the hysteria about overpopulation that once gripped America. In 1965, the New Republic, one of America's foremost journals of public affairs, wrote that "world population has passed food supply. The famine has started." The magazine was so convinced of the coming cataclysm that it proclaimed world hunger to be the "single most important fact in the final third of the 20th century." This period, in which large portions of America's intellectual and political elites took leave of their senses and predicted something like the literal end of civilization, is the subject of Paul Sabin's brief, but valuable, book "The Bet: Paul Ehrlich, Julian Simon, and Our Gamble Over Earth's Future."

The Bet

By Paul Sabin
Yale, 304 pages, $28.50


Mr. Ehrlich, a biologist specializing in butterflies, became famous in the 1970s after publishing "The Population Bomb" (1968), in which he updated the 19th-century projections of Thomas Malthus—people were overbreeding, the supply of food and resources couldn't possibly keep up—and dialed the calamity to 11. Within a few short years, hundreds of millions of people would starve to death as civilization unraveled. Or so predicted Mr. Ehrlich. "The Population Bomb" was reprinted 22 times in the first three years alone, and its author would appear as Johnny Carson's guest on "The Tonight Show" at least 20 times, becoming a national figure and an influential player in Democratic politics. Mr. Ehrlich's ideas attracted a remarkable number of passionate adherents. They also attracted the scornful criticism of a little-known economist named Julian Simon.

When he began exploring demographics, Simon, too, had been concerned about overpopulation. But the more he studied the subject, the more he became convinced that Mr. Ehrlich's thesis was fundamentally flawed. Mr. Ehrlich believed that the laws of nature that governed insects also applied to humans, that natural constraints created cycles of population booms and busts. Simon believed that man's rational powers—and the economies man constructed—made those laws nearly obsolete.

So in 1980 Simon made Mr. Ehrlich a bet. If Mr. Ehrlich's predictions about overpopulation and the depletion of resources were correct, Simon said, then over the next decade the prices of commodities would rise as they became more scarce. Simon contended that, because markets spur innovation and create efficiencies, commodity prices would fall. He proposed that each party put up $1,000 to purchase a basket of five commodities. If the prices of these went down, Mr. Ehrlich would pay Simon the difference between the 1980 and 1990 prices. If the prices went up, Simon would pay. This meant that Mr. Ehrlich's exposure was limited while Simon's was theoretically infinite.

Simon even allowed Mr. Ehrlich to rig the terms of the bet in his favor: Mr. Ehrlich was allowed to select the five commodities that would be the yardstick. Consulting two colleagues, John Holdren and John Harte, Mr. Ehrlich chose chromium, copper, nickel, tin and tungsten, each of which his team supposed was especially likely to become scarce. As they settled on their terms, Mr. Sabin notes, Messrs. Ehrlich, Holdren and Harte "felt confident that they would prevail."

They didn't. In October 1990, Mr. Ehrlich mailed a check for $576.07 to Simon. Mr. Sabin diplomatically reports that "there was no note." Although world population had increased by 800 million during the term of the wager, the prices for the five metals had decreased by more than 50%. And they did so for precisely the reasons Simon predicted—technological innovation and conservation spurred on by the market.
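The settlement arithmetic was simple: the $1,000 stake was notionally split across the five metals (commonly reported as $200 of each), and the loser paid the change in the basket's inflation-adjusted value. A minimal sketch of that calculation; the 1990 tranche values below are hypothetical stand-ins, not the historical prices:

```python
# Sketch of the wager's settlement arithmetic. The even $200-per-metal
# split is the commonly reported structure of the bet; the 1990 values
# below are illustrative placeholders, not the actual historical figures.

STAKE_PER_METAL = 200.0  # $1,000 basket split evenly across five metals

# hypothetical inflation-adjusted value in 1990 of each $200 tranche
value_1990 = {
    "chromium": 120.0,
    "copper":    90.0,
    "nickel":   100.0,
    "tin":       55.0,
    "tungsten":  60.0,
}

basket_1990 = sum(value_1990.values())
settlement = 5 * STAKE_PER_METAL - basket_1990  # positive => Ehrlich pays Simon

print(f"1990 basket value: ${basket_1990:.2f}")
print(f"Ehrlich owes Simon: ${settlement:.2f}")
```

With these placeholder numbers the basket loses a little over half its value, which is the shape of what actually happened in 1990.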

Mr. Ehrlich was more than a sore loser. In 1995, he told this paper: "If Simon disappeared from the face of the Earth, that would be great for humanity." (Simon would die in 1998.) This comment wasn't out of character. "The Bet" is chockablock with Mr. Ehrlich's outbursts—calling those who disagree with him "idiots," "fools," "morons," "clowns" and worse. His righteous zeal is matched by both his viciousness in disagreement and his utter imperviousness to contrary evidence. For example, he has criticized the scientists behind the historic Green Revolution in agriculture—men like Norman Borlaug, who fed poor people the world over through the creation of scientific farming—as "narrow-minded colleagues who are proposing idiotic panaceas to solve the food problem."

Mr. Sabin's portrait of Mr. Ehrlich suggests that he is among the more pernicious figures in the last century of American public life. As Mr. Sabin shows, he pushed an authoritarian vision of America, proposing "luxury taxes" on items such as diapers and bottles and refusing to rule out the use of coercive force in order to prevent Americans from having children. In many ways, Mr. Ehrlich was an early instigator of the worst aspects of America's culture wars. This picture is all the more damning because Mr. Sabin paints it not with malice but with sympathy. A history professor at Yale, Mr. Sabin shares Mr. Ehrlich's devotion to environmentalism. Yet this affinity doesn't prevent Mr. Sabin from being clear-eyed.

At heart, "The Bet" is about not just a conflict of men; it is about a conflict of disciplines, pitting ecologists against economists. Mr. Sabin cautiously posits that neither side has been completely vindicated by the events of the past 40 years. But this may be charity on his part: While not everything Simon predicted has come to pass, in the main he has been vindicated. As Nobel Prize-winning economist Robert Solow put it, "the world has been exhausting its exhaustible resources since the first cave-man chipped a flint."

Mr. Ehrlich may have been defeated in the wager, but he has continued to flourish in the public realm. The great mystery left unsolved by "The Bet" is why Paul Ehrlich and his confederates have paid so small a price for their mistakes, and have perhaps even been rewarded for them. In 1990, just as Mr. Ehrlich was mailing his check to Simon, the MacArthur Foundation awarded him one of its "genius" grants. And 20 years later his partner in the wager, John Holdren, was appointed by President Obama to be director of the White House Office of Science and Technology Policy.

—Mr. Last is a senior writer at the Weekly Standard and the author of "What to Expect When No One's Expecting."

Bottom graph shows temperature reconstruction over the past 563 years. Note the Medieval Warming Period was ~1000 years ago and is not shown in this reconstruction. Top graph shows reconstructed temperatures compared to actual observed temperatures.

Abstract

We developed a tree-ring chronology (AD 1446–2008) based on 75 cores from 37 Abies squamata Mast. trees from the Shaluli Mountains, southeastern Tibet Plateau, China, using signal-free methods, which are ideally suited to remove or reduce the distortion introduced during traditional standardization. This chronology correlates best with regional temperatures in June–July, which allowed us to develop a June–July temperature reconstruction that explained 51.2% of the variance in the instrumental record. The reconstruction showed seven cold periods and five warm periods. Cold periods were identified from AD 1472 to 1524, 1599 to 1653, 1661 to 1715, 1732 to 1828, 1837 to 1847, 1865 to 1876 and 1907 to 1926. Warm intervals occurred from AD 1446 to 1471, 1525 to 1598, 1716 to 1731, 1848 to 1864, 1877 to 1906 and 1927 to present. The reconstruction agrees well with nearby tree-ring-based temperature reconstructions. Spatial correlation analyses suggest that our reconstructions provide information on June–July temperature variability for the southeastern Tibetan Plateau and its vicinity. Spectral analyses revealed significant peaks at 2–6, 10.7, 51.2, 102.2 and 204.8 yr. The temperature variability in this area may be affected by ENSO, the Pacific Decadal Oscillation and solar activity.
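The abstract's "explained 51.2% of the variance" is the standard calibration statistic: the squared correlation between the reconstruction and the instrumental record over their overlap period. A toy sketch on synthetic data; both series below are invented stand-ins, not the paper's chronology:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic stand-ins for an instrumental June-July temperature series
# and a tree-ring-based reconstruction over a shared calibration period
years = np.arange(1951, 2009)
instrumental = 0.01 * (years - years.mean()) + rng.normal(0, 0.3, years.size)
reconstruction = instrumental + rng.normal(0, 0.3, years.size)  # noisy proxy

# variance explained = squared Pearson correlation over the overlap
r = np.corrcoef(instrumental, reconstruction)[0, 1]
variance_explained = r ** 2
print(f"variance explained: {variance_explained:.1%}")
```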

Please Join Us for the

Climate Change Reconsidered Conference Call Series

You are invited to join The Heartland Institute for an exclusive conference call series previewing the Nongovernmental International Panel on Climate Change’s (NIPCC) Climate Change Reconsidered II: Physical Science ahead of its September 17 release. Each Tuesday in September, this conference call series will bring together authors and contributors of the report to talk about its findings and to answer some of your questions.

This report is the result of collaboration among three organizations: Science & Environmental Policy Project, Center for the Study of Carbon Dioxide and Global Change, and The Heartland Institute. It was coauthored and coedited by Craig D. Idso, Robert M. Carter, and S. Fred Singer.

Like its predecessor reports, this volume provides the scientific balance that is missing from the overly alarmist reports of the United Nations’ Intergovernmental Panel on Climate Change (IPCC), which are highly selective in their review of climate science and controversial with regard to their projections of future climate change.

Join us on Tuesday, September 3, at 1:00 p.m. EST / Noon CST for the first NIPCC conference call of the series. Our special guests will be Joseph Bast, President and CEO of The Heartland Institute, and Craig D. Idso, author and editor of Climate Change Reconsidered II: Physical Science, who will give an overview and exclusive preview of this important report.

More problems for climate models: A new paper published in the Journal of Geophysical Research-Atmospheres finds that models must take into account not only the presence or absence of clouds but also how clouds are stacked vertically. The authors find that changes in vertical stacking of clouds can change radiative forcing assumptions by a factor of two [100%]. However, state of the art climate models do not take vertical stacking into consideration, and most global datasets of cloudiness also do not contain this information. "Clouds, which can absorb or reflect incoming radiation and affect the amount of radiation escaping from Earth's atmosphere, remain the greatest source of uncertainty in global climate modeling," and according to this paper, that uncertainty has just doubled from what was previously thought.
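The "vertical stacking" issue comes down to overlap assumptions: given per-layer cloud fractions, the total cloud cover a model column presents to the radiation scheme depends on whether layers are assumed to overlap randomly, maximally, or via the common maximum-random compromise. A small sketch of those three standard rules; the column of cloud fractions is hypothetical, and this illustrates the general technique rather than the paper's actual parameterizations:

```python
def total_cover_random(cf):
    """Random overlap: layers treated as statistically independent."""
    prod = 1.0
    for c in cf:
        prod *= (1.0 - c)
    return 1.0 - prod

def total_cover_maximum(cf):
    """Maximum overlap: all cloudy layers stacked directly atop each other."""
    return max(cf)

def total_cover_max_random(cf):
    """Maximum-random overlap: adjacent cloudy layers overlap maximally;
    blocks separated by clear layers combine randomly (the standard
    Geleyn-Hollingsworth form). Assumes no layer fraction equals 1."""
    prod = 1.0
    prev = 0.0
    for c in cf:
        prod *= (1.0 - max(c, prev)) / (1.0 - prev)
        prev = c
    return 1.0 - prod

# layer cloud fractions, top of atmosphere downward (hypothetical column)
column = [0.3, 0.3, 0.0, 0.4]

for name, f in [("random", total_cover_random),
                ("maximum", total_cover_maximum),
                ("max-random", total_cover_max_random)]:
    print(f"{name:>10}: {f(column):.3f}")
```

For the same per-layer fractions, the three assumptions give noticeably different total covers (maximum the smallest, random the largest, maximum-random in between), which is exactly the kind of spread that feeds through to the radiative calculation.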

"There are known knowns; there are things we know that we know. There are known unknowns; that is to say, there are things that we now know we don't know. But there are also unknown unknowns – there are things we do not know we don't know."

From today's AGU Journal Highlights:

Atmosphere's emission fingerprint affected by how clouds are stacked

Clouds, which can absorb or reflect incoming radiation and affect the amount of radiation escaping from Earth's atmosphere, remain the greatest source of uncertainty in global climate modeling. By combining space-based observations with climate models, researchers are able to derive baseline spectral signals, called spectral fingerprints, of how changes in the physical properties of the Earth's atmosphere, such as the concentration of carbon dioxide or the relative humidity, affect the amount of radiation escaping from the top of the atmosphere. Researchers can then use these spectral fingerprints to attribute changes in the observed top-of-atmosphere radiation to changes in individual atmospheric properties. However, recent research has shown that the way global climate models represent the interactions between clouds and radiation can complicate the process of making these spectral fingerprints. Researchers are finding that what matters is not only the presence or absence of clouds at each location represented in the model but also how the clouds are stacked vertically within each model grid.

Using a simulation experiment to mimic the future climate, Chen et al. tested how different approaches to parameterize cloud stacking affect the attributions of climate change signals in the longwave spectra recorded at the top of the atmosphere. The authors tested three approaches to parameterize cloud stacking and find that the differences in stacking assumptions affected the modeled global mean for outgoing longwave radiation by only a few watts per square meter. (The global average for outgoing longwave radiation at the top of the atmosphere is around 240 watts per square meter.) However, based on which parameterization is used, similar changes in the portion of the sky covered by clouds (especially the clouds in the middle and lower troposphere) can lead to spectral fingerprints that differ by up to a factor of two in amplitude.

Source: Journal of Geophysical Research-Atmospheres, doi:10.1002/jgrd.50562, 2013
http://onlinelibrary.wiley.com/doi/10.1002/jgrd.50562/abstract

Title: Non-negligible effects of cloud vertical overlapping assumptions on longwave spectral fingerprinting studies

Authors: Xiuhong Chen and Xianglei Huang, Department of Atmospheric, Oceanic, and Space Sciences, University of Michigan, Ann Arbor, Michigan, USA; Xu Liu, NASA Langley Research Center, Hampton, Virginia, USA.

ABSTRACT: In order to monitor and attribute secular changes from outgoing spectral radiances, spectral fingerprints need to be constructed first. Large-scale model outputs are usually used to derive such spectral fingerprints. Different models make different assumptions on vertical overlapping of subgrid clouds. We explore the extent to which the spectral fingerprints constructed under different cloud vertical overlapping assumptions can affect such spectral fingerprinting studies. Utilizing a principal component-based radiative transfer model with high computational efficiency, we build an OSSE (Observing System Simulation Experiment) with full treatment of subgrid cloud variability to study this issue. We first show that the OLR (outgoing longwave radiation) computed from this OSSE is consistent with the OLR directly output from the parent large-scale models. We then examine the differences in spectral fingerprints due to cloud overlapping assumptions alone. Different cloud overlapping assumptions have little effect on the spectral fingerprints of temperature and humidity. However, the amplitude of the spectral fingerprints due to the same amount of cloud fraction change can differ as much as a factor of two between maximum random versus random overlap assumptions, especially for middle and low clouds. We further examine the impact of cloud overlapping assumptions on the results of linear regression of spectral differences with respect to predefined spectral fingerprints. Cloud-relevant regression coefficients are affected more by different cloud overlapping assumptions than regression coefficients of other geophysical variables. These findings highlight the challenges in constructing realistic longwave spectral fingerprints and in detecting climate change using all-sky observations.

Related: Climate models have been falsified at a confidence level of >98% over the past 15 years, and falsified at a confidence level of 90% over the past 20 years.

That's what an interesting new article in Nature Climate Change points out. The article, "Overestimated warming over the past 20 years," by members in good standing of the "climate community" compares model simulations from 37 of the climate models being used by the Intergovernmental Panel on Climate Change to project future temperatures with the actual global temperature increase over the past two decades. The study (since it's behind a paywall I am linking to the version published online at the AGW skeptical website the Hockeyschtick) reports:

Global mean surface temperature over the past 20 years (1993–2012) rose at a rate of 0.14 ± 0.06 °C per decade (95% confidence interval)1. This rate of warming is significantly slower than that simulated by the climate models participating in Phase 5 of the Coupled Model Intercomparison Project (CMIP5). To illustrate this, we considered trends in global mean surface temperature computed from 117 simulations of the climate by 37 CMIP5 models. These models generally simulate natural variability — including that associated with the El Niño–Southern Oscillation and explosive volcanic eruptions — as well as estimate the combined response of climate to changes in greenhouse gas concentrations, aerosol abundance (of sulphate, black carbon and organic carbon, for example), ozone concentrations (tropospheric and stratospheric), land use (for example, deforestation) and solar variability. By averaging simulated temperatures only at locations where corresponding observations exist, we find an average simulated rise in global mean surface temperature of 0.30 ± 0.02 °C per decade (using 95% confidence intervals on the model average). The observed rate of warming given above is less than half of this simulated rate, and only a few simulations provide warming trends within the range of observational uncertainty... (emphasis added). ...

The inconsistency between observed and simulated global warming is even more striking for temperature trends computed over the past fifteen years (1998–2012). For this period, the observed trend of 0.05 ± 0.08 °C per decade is more than four times smaller than the average simulated trend of 0.21 ± 0.03 °C per decade. It is worth noting that the observed trend over this period — not significantly different from zero — suggests a temporary 'hiatus' in global warming. (emphasis added).

The article concludes:

Ultimately the causes of this inconsistency will only be understood after careful comparison of simulated internal climate variability and climate model forcings with observations from the past two decades, and by waiting to see how global temperature responds over the coming decades.

It seems to me that what the researchers are saying, in so many words, is that the current batch of climate models has not been validated against actual temperature trends. One possibility for the mismatch between actual temperature trends and the model projections is that the modelers have set climate sensitivity (the response of the climate to a doubling of atmospheric carbon dioxide) too high. As I have reported before, more recent research has significantly lowered estimates of climate sensitivity, which suggests that future warming will also be lower.
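The trend figures quoted above are least-squares slopes with 95% confidence intervals. A minimal sketch of that calculation on synthetic stand-in data; it ignores the autocorrelation corrections a real analysis would apply, so the interval here is only illustrative:

```python
import numpy as np

def decadal_trend(years, temps):
    """Least-squares trend in degC/decade with an approximate 95% CI,
    treating annual values as independent (real analyses must account
    for autocorrelation, which widens the interval)."""
    x = np.asarray(years, dtype=float)
    y = np.asarray(temps, dtype=float)
    n = x.size
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    se = np.sqrt(np.sum(resid**2) / (n - 2) / np.sum((x - x.mean())**2))
    return 10 * slope, 10 * 1.96 * se  # convert per-year to per-decade

# synthetic anomalies with a built-in 0.014 degC/yr trend (stand-in data,
# not an actual observational dataset)
rng = np.random.default_rng(1)
yrs = np.arange(1993, 2013)
anoms = 0.014 * (yrs - 1993) + rng.normal(0, 0.08, yrs.size)

trend, ci = decadal_trend(yrs, anoms)
print(f"{trend:+.2f} +/- {ci:.2f} degC per decade")
```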

The graphic above depicts the global lower-troposphere temperature projections from 73 CMIP5 models from 1979 to 2025, compared with the average of the satellite data from UAH and RSS (blue boxes) and weather-balloon data (green circles) for the global lower troposphere from 1979 to the present. Note nearly all the model runs project much warmer temperatures than the globe has recently experienced. The thick black line is the average projection of the 73 models.

Next month the Intergovernmental Panel on Climate Change is set to release its update on the physical science of climate change. It will be interesting to see how (or if) its authors try to explain the growing gap between the model projections and the actual climate.

A paper published today in Global and Planetary Change reconstructs sea levels in SE Vietnam over the past 8,000 years and finds sea levels rose at roughly 3.7 mm/yr on average from 8,100 to 6,400 years ago, to ~1.5 meters [5 feet] higher than the present, but dropped from 6,000 years ago to the end of the reconstruction 630 years ago.

Highlights

Slight sea-level drop at an average rate of 0.25 mm/yr after 5000 yr BP.

Abstract

Beachrocks, beach ridge, washover and backshore deposits along the tectonically stable south-eastern Vietnamese coast document Holocene sea level changes. In combination with data from the final marine flooding phase of the incised Mekong River valley, the sea-level history of South Vietnam could be reconstructed for the last 8000 years. Connecting saltmarsh, mangrove and beachrock deposits, the record covers the last phase of deglacial sea-level rise from -5 to +1.4 metres between 8.1 and 6.4 ka. The rates of sea-level rise decreased sharply after the rapid early Holocene rise and stabilized at a rate of 4.5 mm/yr between 8.0 and 6.9 ka. Southeast Vietnam beachrocks reveal that the Mid-Holocene sea-level highstand slightly above +1.4 m was reached between 6.7 and 5.0 ka, with a peak value close to +1.5 m around 6.0 ka. This highstand is further limited by a backshore and beachridge deposit that marks the maximum springtide sea-level just below the base of the overlying beach ridge. After 5.0 ka sea level dropped below +1.4 m and fell almost linearly at a rate of 0.24 mm/yr until 0.63 ka and +0.2 m as evidenced by the youngest beachrocks.

The Holocene sea-level fluctuations observed in Southeast Vietnam resulted from eustatic and isostatic processes. The sea-level rise up to the mid-Holocene highstand was provoked by the last melting phase of glacial polar ice-sheets. The sea-level drop after the mid-Holocene highstand was induced by the isostatic processes of Continental Levering with an uplift of continents in low latitudes and depression of adjacent flooded continental shelf areas and Equatorial Ocean Siphoning transferring oceanic waters from low latitudes to the increasing volume of oceanic basins in higher latitudes. The regional expression in terms of magnitude and timing of relative sea-level change might contribute to validation of geophysical model simulations.
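The quoted rates are easy to sanity-check: an average rate is just elevation change over elapsed time. A back-of-envelope sketch using the endpoints given in the abstract; these are two-point averages, so small differences from the paper's regression-based rates are expected:

```python
# Sanity check of the abstract's sea-level rates from its own endpoints.
# Two-point averages only; the paper's quoted rates come from fitting the
# full record, so minor disagreement is expected.

def rate_mm_per_yr(h0_m, h1_m, t0_ka, t1_ka):
    """Average sea-level rate from (t0, h0) to (t1, h1); ka = 1000 yr ago."""
    return (h1_m - h0_m) * 1000.0 / ((t0_ka - t1_ka) * 1000.0)

rise = rate_mm_per_yr(-5.0, 1.4, 8.1, 6.4)   # deglacial rise, 8.1-6.4 ka
fall = rate_mm_per_yr(1.4, 0.2, 5.0, 0.63)   # post-highstand fall, 5.0-0.63 ka

print(f"rise 8.1-6.4 ka:  {rise:+.1f} mm/yr")
print(f"fall 5.0-0.63 ka: {fall:+.2f} mm/yr")
```

The average rise works out to about +3.8 mm/yr and the fall to roughly -0.27 mm/yr, in the same ballpark as the abstract's 4.5 mm/yr (for the fastest sub-interval) and -0.24 mm/yr figures.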

Koomey says fighting it again now is pretty frustrating: “I’d rather not have to spend time rehashing this stuff.” But the claim is back. So Koomey is back, figuring out just how much electricity goes into making and using my smartphone.

By his calculation, it’s about 60 kilowatt-hours.

Mark Mills, a senior fellow at the Manhattan Institute, and the author of the phone-equals-refrigerator claim, estimates it’s closer to 700 kilowatt-hours.

Mills is author of a report called The Cloud Begins with Coal, sponsored by the mining and coal industries. He says he wants to get people thinking about how much electricity these devices use. And he doesn’t think the controversy around the refrigerator analogy distracts people from his bigger point.

He stands by his calculations and his main assertion: “It is accurate: it uses a lot of electricity. Now if someone were to say, it’s not equal to a refrigerator or equals half a refrigerator or a tenth of a refrigerator, that’s still a big number.”
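The gap between the two estimates is driven almost entirely by assumptions about network and data-center energy attributed to each phone, not by charging. A back-of-envelope sketch; every number below is an assumed round figure chosen only to show how the totals can diverge this widely, not a value taken from Koomey's or Mills's actual analyses:

```python
# Illustrative annual-energy arithmetic for a smartphone. All inputs are
# assumed round numbers, not figures from either published analysis.

def annual_phone_kwh(charge_w, charge_h_per_day,
                     gb_per_month, kwh_per_gb, embodied_kwh_per_yr):
    charging = charge_w * charge_h_per_day * 365 / 1000.0  # direct use
    network = gb_per_month * 12 * kwh_per_gb               # network share
    return charging + network + embodied_kwh_per_yr

# low-end assumptions: modest traffic, efficient network
low = annual_phone_kwh(5, 2, 1.0, 2.0, 30.0)
# high-end assumptions: heavy traffic times an energy-intensive network
high = annual_phone_kwh(5, 2, 5.0, 10.0, 100.0)

print(f"low estimate:  {low:.0f} kWh/yr")
print(f"high estimate: {high:.0f} kWh/yr")
```

Charging itself is only a few kWh a year under either set of assumptions; the order-of-magnitude swing comes from what you assume about data traffic, network energy intensity, and embodied manufacturing energy.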

Why use this analogy again? Why compare a phone to a fridge, when Mills got so blasted the first time?

“If I came up to you and remarked to you that there is a one-headed cat around the corner from your house you would be totally uninterested,” says Bruce Nordman, a research scientist at Lawrence Berkeley National Laboratory*, “but if I said there was a three-headed cat you’d be amazed that it exists and want to go see it; so these fantastical assertions naturally attract people’s attention, whether or not they are real.”

Nordman says the idea that our phones use as much energy as a fridge is basically that three-headed cat; it's not real. And still, these things get picked up and passed around.

Which raises another question -- why?

“Thinking about a smartphone, a tiny device that sits in our pocket, using the same amount of energy as a huge refrigerator seems so amazing that we just have to share it with someone else,” says Jonah Berger, a marketing professor at the Wharton School and author of "Contagious: Why Things Catch On." “It’s a neat little factoid that makes us look smart, even if in this case, it’s not actually true.”

He says the controversy around it helps make it sticky, and it taps into a broader conversation about the environment. “If everyone is talking about the environment, they are looking for something to add to that conversation,” Berger says. “We all know that gas prices are up; what’s there to say that’s new? But if I can plug in a new fact to that conversation, it’s going to get talked about a lot.”

An interesting week in climate change science and climate change politics - sometimes a little difficult to distinguish between them!

We have a new paper published in Nature which ties the current hiatus in global warming to cooling in the eastern equatorial Pacific, a sample area covering just 8.2% of the ocean surface. I quote: "Our results show that the current hiatus is part of natural climate variability, tied specifically to a La-Niña-like decadal cooling."

We also have another paper, published the same day in Nature Climate Change, seeking to explain the mismatch between observed and estimated warming over the last twenty years, specifically with reference to "some combination of errors in external forcing, model response and internal climate variability".

Judith Curry has examined both of these papers in her blog and I shall be making reference to her important comments on them.

We are also approaching crunch time in the Arctic, with alarmists' predictions of continuing catastrophic melt looking increasingly untenable. In fact, 2013 is shaping up to be the year when Arctic ice started to recover at record rates compared with the all-time (well, since 1978 anyway) low recorded in September 2012. No doubt, as with the global warming pause, we will soon be hearing from the Arctic alarmists that the 'death spiral' of sea-ice decline was never predicted to be continuous and that we should therefore expect interruptions (albeit massive, record-breaking ones) in the downward trend.

It looks to me like we are now seeing the beginnings of an acknowledgement by the wider climate science community that natural processes (in particular those mediated by the El Niño Southern Oscillation (ENSO) and the Pacific Decadal Oscillation (PDO)) can have a significant effect on our climate: they can interrupt the rapid CO2-mediated warming that was predicted, create a 15-year pause, and even set the planet cooling again. However, there is no accompanying acknowledgement of what is inherent in this argument: that natural climate forcings therefore have a far more significant role to play in climate change, comparable to, or even greater than, hypothesised anthropogenic influences. Which leads us to the obvious, and the ultimate climate change heresy, namely that 20th/21st-century warming trends have been contributed to significantly by natural influences at play in our coupled ocean/atmosphere climate system, even - perish the thought - that natural forcings have dominated the recent warming trend. We see this reluctance to take the next logical step very clearly in the following comment from the Nature paper referenced above: "Although similar decadal hiatus events may occur in the future, the multi-decadal warming trend is very likely to continue with greenhouse gas increase".

What we basically have at the moment is a PDO in negative phase, which is tending to put a lid on world temperatures and even drag them down. ENSO at present is fairly neutral, i.e. neither La Niña nor El Niño. ENSO is, to a large measure, constrained within the longer-period PDO and moderated by that longer cycle, but ENSO can, and does, flip polarity while the PDO remains either positive or negative, thereby either reinforcing warming/cooling or tending to act against the prevailing PDO trend. So if we get a strong La Niña soon, it will tend to exacerbate global cooling, and vice versa if ENSO goes into El Niño phase.

Hence, the anthropogenic global warming predicted by climate scientists is, for a variety of reasons, not showing itself, and this is placing global warming policy advocates and scientists in a tough position. They may argue that this is a temporary glitch, a minor interruption in an otherwise continual late 20th/21st-century 'catastrophic' warming trend. The problem is, it is neither minor nor particularly temporary, lasting anywhere between 12 and 20 years depending on how you look at the figures and which dataset you choose to use. Nor was it widely predicted, as is now claimed, by the climate models used by the IPCC, particularly those developed in the 1990s. Even now, a 15-year hiatus in global warming is a very rare occurrence in model runs, and let us not forget that it is the climate models, not observations, which still drive current thinking on the threat posed by hypothetical anthropogenic global warming. Natural variability was, and still is, regarded as minor compared to the effects of man-made CO2 on climate change according to IPCC scientists - though we wait with bated breath for the soon-to-be-published AR5.

Judith Curry quotes excerpts from the Nature Climate Change paper here. On the subject of the 15 year hiatus starting in 1998, the authors say:

"The inconsistency between observed and simulated global warming is even more striking for temperature trends computed over the past fifteen years (1998–2012). For this period, the observed trend of 0.05 ± 0.08 °C per decade is more than four times smaller than the average simulated trend of 0.21 ± 0.03 °C per decade. The divergence between observed and CMIP5-simulated global warming begins in the early 1990s, as can be seen when comparing observed and simulated running trends from 1970–2012.

The evidence, therefore, indicates that the current generation of climate models (when run as a group, with the CMIP5 prescribed forcings) do not reproduce the observed global warming over the past 20 years, or the slowdown in global warming over the past fifteen years. [S]uch an inconsistency is only expected to occur by chance once in 500 years, if 20-year periods are considered statistically independent." The authors conclude that some "combination of errors in external forcing, model response and internal climate variability" is to blame for these large discrepancies. They tentatively identify ENSO, volcanic activity and the AMO (Atlantic Multi-Decadal Oscillation) as playing their part, though admit that the amplitude of these natural influences is probably not sufficient to account for the gap between observation and model prediction, inviting the reader to perhaps conclude that the models themselves may be somewhat lacking.

The real eye-opener is the Nature paper which Judith talks about here. She points out that, from looking at the two separate graphs of simulated natural plus anthropogenic and natural internal variations only, in comparison with observations (which the former matches closely):

"What is mind blowing is Figure 1b, which gives the POGA C simulations (natural internal variability only). The main ’fingerprint’ of AGW has been the detection of a separation between climate model runs with natural plus anthropogenic forcing, versus natural variability only. The detection of AGW has emerged sometime in the late 1970s, early 1980s. Compare the temperature increase between 1975-1998 (main warming period in the latter part of the 20th century) for both POGA H and POGA C:

I’m not sure how good my eyeball estimates are, and you can pick other start/end dates. But no matter what, I am coming up with natural internal variability associated accounting for significantly MORE than half of the observed warming."

So, we have an anthropogenic effect, but it is outpaced by natural changes. As a rough calculation, we are looking at natural exceeding AGW by 40%, which is most definitely not the message that has been consistently drilled into us by the IPCC, who have always maintained that natural forcings are not significant and/or are surpassed by man-made climate change.
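The "40%" figure is just ratio arithmetic on the two warming estimates. A sketch using hypothetical eyeball readings of the POGA H (all forcings) and POGA C (natural-only) curves, chosen only to reproduce that rough ratio, not values taken from the paper:

```python
# Ratio arithmetic behind the "natural exceeds AGW by ~40%" estimate.
# Both degC figures below are hypothetical eyeball readings, assumed
# purely for illustration.

total_warming = 0.48   # degC, 1975-1998, all forcings (POGA H), assumed
natural_only = 0.28    # degC, same period, natural-only run (POGA C), assumed

anthropogenic = total_warming - natural_only
excess = natural_only / anthropogenic - 1.0

print(f"natural share of total warming: {natural_only / total_warming:.0%}")
print(f"natural exceeds anthropogenic by {excess:.0%}")
```

With these assumed readings the natural component accounts for just under 60% of the total and exceeds the anthropogenic remainder by 40%, which is the shape of the rough calculation above.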

So it would appear that, as sceptics have maintained all along, climate changes naturally and these natural changes outweigh any man-made influences upon our climate. In particular, ocean currents seem to be driving these changes over multi-decadal periods. Take another step then and ask the question: what is driving the ocean currents? The Sun, our very own star, that superheated ball of plasma 330,000 times as heavy as Earth, 109 times Earth's diameter, just 93 million miles away, is the most likely candidate. Theodor Landscheidt theorised that solar activity and PDO/ENSO were connected. He also predicted what has become known as the Landscheidt Minimum, the current decrease in solar activity, which is predicted to impact upon our climate in the coming decades by sending world temperatures down. Tallbloke's blog gives further info on Landscheidt's theories here and here and is well worth taking the time to read. In my opinion, climate science is about to move away from the current obsession with CO2 to a more balanced approach involving a holistic assessment of factors affecting climate variability, with the Sun sitting at the top of a pyramid of spreading influences and AGW confined, at best, to a relegated role sitting on the sidelines, watching the main players.

U.S. and European Union envoys are seeking more clarity from the United Nations on a slowdown in global warming that climate skeptics have cited as a reason not to “panic” about environmental changes, leaked documents show.

They’re requesting that more details on the so-called “hiatus” be included in a key document set to be debated at a UN conference next month that will summarize the latest scientific conclusions on climate change.

Including more information on the hiatus will help officials counter arguments that the slowing pace of global warming in recent years is a sign that the long-term trend may be discounted, according to Bob Ward, policy director at the Grantham Research Institute on Climate Change and the Environment at the London School of Economics.

“In the public debate, there are people who are using the slowdown to say global warming is less of a problem than thought,” Ward said in an interview yesterday. “It has to be fully explained in the summary.”

A draft of the summary and the underlying 2,200-page report from the UN Intergovernmental Panel on Climate Change were obtained by Bloomberg from a person with official access to the documents who declined to be further identified because it hasn’t been published.

Government envoys from around the world will debate the final wording of the summary at an IPCC meeting that starts in Stockholm on Sept. 23. That document, formally the Summary for Policymakers, is designed to be used by ministers working to devise by 2015 a global treaty to curb climate change.

‘Key Issue’

The current version of the summary needs more information about the hiatus, according to the EU and the U.S.

“The recent slowing of the temperature trend is currently a key issue, yet it has not been adequately addressed in the SPM,” the EU said, according to an official paper that includes all governmental comments on the draft report. The U.S. comment suggested “adding information on recent hiatus in global mean air temperature trend.”

Isaac Valero-Ladron, a spokesman for EU Climate Action Commissioner Connie Hedegaard, declined to comment, citing a confidentiality agreement with the IPCC and the lack of a finalized text.

Jonathan Lynn, a spokesman for the UN panel, and Nayyera Haq, a U.S. State Department spokeswoman, both declined to comment.

Addressing the hiatus is important because skeptics of man's influence on the planet's warming have seized on the slowing pace of temperature increase as evidence that scientists have exaggerated the impact of man-made greenhouse gases. That supports their assertion that there's less need for expensive policies to curb carbon emissions from factories, vehicles and deforestation.

Climate Sensitivity

“Some people have suggested that the slowdown means that climate sensitivity is lower,” said Ward from the Grantham Institute.

Climate sensitivity is the increase in temperatures resulting from a doubling of carbon dioxide in the atmosphere. In the latest draft, sensitivity is estimated at 1.5 degrees Celsius (2.7 degrees Fahrenheit) to 4.5 degrees Celsius. That compares with the estimate of 2 degrees to 4.5 degrees from the UN’s last major climate assessment in 2007.
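The definition can be turned into a rough calculation using the commonly cited logarithmic approximation, under which equilibrium warming is the sensitivity multiplied by the base-2 logarithm of the CO2 concentration ratio. The formula and the ~280 ppm pre-industrial baseline are standard textbook assumptions, not figures taken from the draft report:

```python
import math

# Equilibrium warming under the standard logarithmic approximation:
#   dT = S * log2(C / C0)
# where S is the sensitivity per CO2 doubling and C0 is the
# pre-industrial concentration (~280 ppm).
def equilibrium_warming(sensitivity_per_doubling, c_ppm, c0_ppm=280.0):
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# Span the draft's 1.5-4.5 C sensitivity range at today's ~400 ppm:
for s in (1.5, 4.5):
    print(f"S = {s} C/doubling -> {equilibrium_warming(s, 400.0):.2f} C")
```

At an exact doubling (560 ppm) the warming equals the sensitivity by construction, which is what makes the per-doubling definition convenient.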

The summary document notes that the rate of warming over the past 15 years “is smaller than the trend since 1951,” citing a rate of about 0.05 degrees Celsius per decade in the years 1998 through 2012. The rate was about 0.12 degrees per decade from 1951 through 2012.
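Per-decade rates like the 0.05 and 0.12 quoted here are ordinary least-squares trends fitted to annual temperature anomalies. A self-contained sketch with synthetic data (the anomaly series below is a placeholder rising at a fixed rate, not the actual observational record):

```python
# Fit an ordinary least-squares line to annual anomalies and report
# the slope in degrees C per decade.
def trend_per_decade(years, anomalies):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(anomalies) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(years, anomalies))
             / sum((x - mean_x) ** 2 for x in years))
    return slope * 10  # C/year -> C/decade

# Synthetic 1998-2012 series rising 0.005 C/year, i.e. 0.05 C/decade
years = list(range(1998, 2013))
anoms = [0.4 + 0.005 * (y - 1998) for y in years]
print(round(trend_per_decade(years, anoms), 2))  # -> 0.05
```

With real data the choice of start year matters a great deal over such short windows, which is one reason the 1998-2012 rate differs so sharply from the 1951-2012 one.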

Carbon Emissions

The slowdown came as emissions grew, with the concentration of carbon dioxide in the atmosphere this year exceeding 400 parts per million for the first time on record.

The draft report includes possible reasons for the slowing rate, including natural variability, volcanic eruptions and a drop in solar energy reaching the Earth.

“Much of the information is present but it requires a lot of effort on the part of the reader to piece it all together,” the 28-nation EU said in the comments document.

The U.S. requested clarity on the implications of the data, commenting “this is an example of providing a bunch of numbers, then leave them up in the air without a concrete conclusion.”

Norway, Denmark and China requested information on the role oceans have played in the slowdown. China cited three scientific papers, including a study in the journal Geophysical Research Letters in May that found deep ocean waters below 700 meters (2,300 feet) have absorbed more heat since 1999.

Ocean Temperatures

A separate study, published Aug. 28 in the journal Nature, linked the hiatus to a cooling of surface waters in the eastern Pacific. The cut-off date for papers to be considered in the UN report was March 15.

The UN World Meteorological Organization defines climate as the average weather over a 30-year period, and scientists say the 15-year slowdown isn’t long enough to mark a trend. Hungary and Germany, both EU members, cited this as a reason to delete any reference to the hiatus in the summary, while Japan questioned the purpose of using a 15-year average.

“A 15-years period of observation is not sufficient to give a qualified analysis of the global mean surface temperature trend in an assessment of climate change,” Germany said. It also said the use of the word “hiatus” is “strongly misleading” because “there is not a pause or interruption, but a decrease in the warming trend.”

“We never comment on the internal procedures of the IPCC,” Nikolai Fichtner, a spokesman for the German environment ministry, said in an e-mail.

Slowdown Acknowledged

The slowdown in warming has been acknowledged by the U.K. Met Office, which produces one of the world’s three main series of global temperature data, and James Hansen, the former NASA scientist who first brought climate change to the attention of Congress in the 1980s. They say the data is still compatible with humans being the main cause of warming.

Even with the slowdown, the decade of 2001 to 2010 was the warmest for both hemispheres and for land and sea, the WMO said July 3 in a report. The World Bank says the planet is on course to warm by 4 degrees Celsius by 2100 because of rising emissions.

That hasn’t stopped skeptics, from scientists to lawmakers and bloggers, from seizing on the issue.

The Global Warming Policy Foundation, a U.K.-based research group that describes itself as “deeply concerned about the costs” of climate change policies, said in a report in March that “we are on the threshold of global observations becoming incompatible with the consensus theory of climate change.”

Don’t ‘Panic’

The Wall Street Journal published in January 2012 an opinion piece signed by 16 scientists that cited “the lack of global warming for well over 10 years now” as a reason not to “panic” about climate change. They included professors at Princeton and Cambridge universities and the Massachusetts Institute of Technology, as well as former U.S. Senator and Apollo 17 astronaut Harrison Schmitt.

The comments on the slowdown are among 1,855 from governments around the world detailed in the document. The comments range from requests to spell out what acronyms stand for and eliminate scientific jargon to clarifying the likelihood of predictions and shuffling bits of text about.