History and the Limits of the Climate Consensus

Acknowledging the science of global warming does not require accepting that it is immune to criticism.

“Winter Landscape” by Joos de Momper the Younger, via the Walters Art Museum and Wikimedia Commons. Painted in the 1620s, in the heart of the Little Ice Age.

I began my career as a historian of the century following 1660, an era of harsh climatic conditions that often affected political and cultural history. Some periods in particular, especially the years around 1680 and 1740, stand out as uniquely stressful. Extreme cold led to crop failures and revolts, social crises and apocalyptic movements, high mortality and epidemics, but it also spawned religious revivals and experimentation. If you write history without taking account of such extreme conditions, you are missing a lot of the story. That background gives me an unusual approach to current debates on climate change, and leads me to ask some questions for which I genuinely do not have answers.

I believe strongly in the supremacy of scientific method: science is what scientists do, and if they don’t do it, it’s not real science. Based on that principle, I take very seriously the broad consensus among qualified scientific experts that the world’s temperature is in a serious upward trend, which will have major consequences for most people on the planet—rising sea levels and desertification are two of the obvious impacts. In many religious traditions, activists see campaigns to stem these trends as a moral and theological necessity. Personally, I love the idea of using advanced technology to drive a decisive shift towards renewable energy sources, creating abundant new jobs in the process.

Speaking as a historian, though, I have some problems with defining the limits of our climate consensus, and how these issues are reported in popular media and political debate.

Climate scientists are usually clear in their definitions, but that precision tends to get lost in popular discourse. To say that global warming is a fact does not, standing alone, mean that we have to accept a particular causation of that trend. Equally beyond dispute, the climate has changed quite radically through the millennia. Climate change of some scale has happened, is happening, and will happen, regardless of any human activity. The issue today is identifying and assessing the human role in accelerating that process.

This point comes to mind in popular debate when people who should know better denounce “climate change” as such. See for instance the recent Papal encyclical Laudato Si’, with its much-quoted statement that

Climate change is a global problem with grave implications: environmental, social, economic, political and for the distribution of goods. It represents one of the principal challenges facing humanity in our day.

Well, not exactly. “Climate change” is a fact and a reality, rather like the movement of tectonic plates, or indeed like evolution. Particular forms of climate change may be exceedingly harmful and demand intervention, but that is a critical difference. It’s interesting comparing the English phrase “climate change” with the French phrase that was used at the recent COP21 Paris meetings, the Conférence sur les Changements Climatiques 2015: changes, plural, not change. Do you want to see a world without a changing climate? Look at the Moon.

That then gets to the human contribution to current trends. The basic theory here is simple and (rightly) generally accepted: carbon emissions create a greenhouse effect, which increases planetary temperatures. It should be said, though, that the correlation between emissions and temperatures is none too close. Rising temperatures do not track overall levels of emissions with any neatness. That is especially true when we look at the phenomenal growth in emissions from India and China since the 1980s, which should in theory have caused a global temperature increase far above anything we actually see. Sure, the effects might be delayed, but the correlation is still not working too well.

That disjunction is particularly telling in the very recent era, from 1998 through 2012, when emissions carried on rising sharply but temperature rises were slow or stagnant. This was a hiatus or slowdown in global warming, and it remains controversial. Some recent studies challenge the whole hiatus idea. Others accept the hiatus, but offer different explanations for its cause. Now, the fact that scientists disagree strongly on a topic certainly does not mean that the underlying theory is wrong. Arguing and nitpicking is what scientists are meant to do. But that lack of correlation does raise questions about the assumptions on which any policy should proceed.

That also gets us into areas of expertise. Climate and atmospheric scientists are convinced not only that the present warming trend is happening, but that it is catastrophic and unprecedented. That belief causes some bemusement to historians and archaeologists, who are very well used to quite dramatic climate changes through history, notably the Medieval Warm Period and the succeeding Little Ice Age. That latter era, which prevailed from the 14th century through the 19th, is a well-studied and universally acknowledged fact, and its traumatic effects are often cited. The opening years of that era, in the early-to-mid 14th century, included some of the worst social disasters and famines in post-Roman Europe, which were in turn followed by the massacre and persecution of dissidents and minorities: Jews in Europe, Christians in the Middle East, heretics and witches in many parts of the world. A cold and hungry world needed scapegoats.

Contemporary scientists tend to dismiss or underplay these past climate cycles, suggesting for instance that the medieval warm period was confined to Europe. Historians, in their turn, are deeply suspicious, and the evidence they cite is hard to dismiss. Do note also that the very substantial Little Ice Age literature certainly does not stem from cranky “climate deniers,” but is absolutely mainstream among historians. Are we seeing a situation where some “qualified and credentialed scientific experts” stand head to head with the “qualified and credentialed social scientific experts” known as historians?

If in fact the medieval world experienced a warming trend comparable to what we are seeing today, albeit without human intervention, that fact does challenge contemporary assumptions. Ditto for the Little Ice Age, which really and genuinely was a global phenomenon. Incidentally, that era involved a drop in temperature of some 2 degrees Celsius, roughly the same as the rise that is projected for coming decades.

The 2015 Paris Conference declared a target of restricting “the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels.” It’s very important to set a baseline for such efforts, certainly, but what on earth is intended here? Which pre-industrial levels are we talking about? The levels of AD 900, of 1150, of 1350, of 1680, of 1740? All those eras were assuredly pre-industrial, but the levels were significantly different in each of those years. Do they want us to return to the temperatures of the Little Ice Age, and even of the depths of cold in the 1680s? The winter of 1683–84, for instance, remains the coldest ever recorded in England. Or are the bureaucrats aiming to get us back to the warmer medieval period, around 1150?

Seriously, does any climate scientist claim that “pre-industrial” temperature levels were broadly constant globally for millennia, in fact since the end of the last true Ice Age some 12,000 years ago, and that they only moved seriously upwards at the start of industrialization? Really? And would they try to defend that? In that case, we should just junk the past few decades of writing on the impact of climate on history, undertaken by first-rate scholars.

If pre-industrial temperature levels really varied as much as they actually did, why did the Paris conference so blithely incorporate this meaningless phrase into their final agreement? Did the participants intend “pre-industrial levels” to be roughly equivalent to “the good old days”?

I offer one speculation. Maybe the “Little Ice Age” was the planet’s “new normal,” a natural trend towards a colder world, and we should in theory be living in those conditions today. All that saved us was the post-1800 rise in temperatures caused by massive carbon emissions from an industrializing West. If that’s correct—and I say it without any degree of assurance—then I for one have no wish whatever to return to pre-industrial conditions. Climate scientists, please advise.

Historical approaches are also useful in pointing to the causes of these changes, and they remind us that much climate change originates quite independently of human action. One critical factor is solar activity, and historians usually cite the Maunder Minimum. Between 1645 and 1715, sunspot activity virtually ceased altogether, and that cosmic phenomenon coincided neatly with a major cooling on Earth, which then reached the depths of the Little Ice Age. In fact, we can see this era as an acute ice age within the larger ice age. If you point out that correlation does not of itself indicate causation, you would be quite right. But the correlation is worth investigating.

So do I challenge the global warming consensus? Absolutely not. But that does not mean that all critical questions have been satisfactorily answered, and many of them depend on historical research and analysis. Pace the New York Times and large sections of the media, there is no such thing as “established science,” which is immune to criticism. If it is “established” beyond criticism and questioning, it’s not science.


63 Responses to History and the Limits of the Climate Consensus

Ned, please note the difference between the 24/7 top-of-the-atmosphere solar energy flux, which is a little larger than you state, about 1,360 watts per square meter, and the surface flux, which averaged over day, night and latitude is about 240 watts.

As the primary concern is surface temperature rather than the temperature fifty miles up, what matters is radiative forcing at sea level, and aside from changes in aerosols and surface albedo arising from human action, that is the ~1.4 watts/m2 from CO2, plus the parallel forcings from other emitted gases and their feedbacks, divided by the average surface flux. The forcing is accordingly closer to 1% than the tenth of a percent you’d get by dividing by the dayside solar flux on the edge of space.
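This arithmetic is easy to check. A minimal sketch, using round figures close to those quoted in this thread (a solar constant of roughly 1,360 W/m², an average surface flux of about 240 W/m², and about 1.4 W/m² of CO2 forcing); these are the commenter's rough numbers, not authoritative values:

```python
# Rough check of the commenter's arithmetic. All inputs are the
# round numbers quoted in the thread, not authoritative measurements.
TOA_SOLAR_FLUX = 1360.0   # W/m^2, dayside top-of-atmosphere ("edge of space")
AVG_SURFACE_FLUX = 240.0  # W/m^2, averaged over day, night and latitude
CO2_FORCING = 1.4         # W/m^2, quoted for CO2 alone

# Forcing as a fraction of each reference flux
frac_vs_surface = CO2_FORCING / AVG_SURFACE_FLUX  # ~0.58%, "closer to 1%"
frac_vs_toa = CO2_FORCING / TOA_SOLAR_FLUX        # ~0.10%, "a tenth of a percent"

print(f"vs. average surface flux:     {frac_vs_surface:.2%}")
print(f"vs. top-of-atmosphere flux:   {frac_vs_toa:.2%}")
```

The choice of denominator is exactly the commenter's point: the same 1.4 W/m² looks five or six times larger when compared against the average surface flux than against the dayside flux at the top of the atmosphere.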

Henriksen – “We excavated the middens down to the bottom layers, which date from the time the settlers arrived. The sample we took from the bottom layer of a heap contained cereal grains. The grains had been close to a fire and were charred, which is what preserved them.”

From their shape and size, the grains were positively identified as barley and they came from local agricultural production.

“Wild barley is not strong enough to grow in Greenland”, says Henriksen, who also rules out imported barley, as even small quantities of grain would be too much for the cargo hold of the Vikings’ ships.

“If the cereal grains had been imported, it would have already been threshed, so finding parts of grains of barley is a very strong indication that the Vikings grew their own,” he adds.

Given the Hockey Stick and other statistical attempts to eliminate the Medieval Warm Period, the main point of this article is that there is a historical record that needs to be integrated into models of the climate. This is much like the statistical adjustments that cool the Dust Bowl period and make the warming trend look greater than it really is.

In some ways, the causes of climate change do not matter. If climate change is harmful to humans and biodiversity, then whether it is due to carbon emissions or not is not particularly relevant. What is concerning about the climate change being predicted is that it is expected to be unprecedentedly rapid: too rapid for humans, ecosystems and the biosphere to adapt effectively, resulting in human suffering and irreversible loss of biodiversity (and to me, maintaining biodiversity is an inherent good). Also of concern is that it may be irreversible, as the causes of global warming may create a positive feedback loop, where warming begets more and faster warming.

The argument that the Little Ice Age occurred naturally, and that we should therefore be comforted, does not hold water for me, since the Little Ice Age sounds like something I would not want to live through. So if we can avoid something similar from happening, we should, even if human activity is not to ‘blame’ for it. And if human industrial activity is to ‘blame’, that does not mean the solution must be a change in the nature of that industrial activity, at least not if there is some other cheaper, more efficient solution.

Taken day by day, weather is random. But if you average a large number of random weather readings together, you get a value with a Gaussian distribution around a non-random mean. For example, the average high temperature in January in any northern location in the US will be reliably colder than the average in July. The standard error of the two averages will be much smaller than the difference between them, making the difference highly significant.

These averages are climate, which unlike weather is quite predictable: we know, and plan on, it being substantially colder in the winter than in the summer. We know that going south for the winter will, on average, let you enjoy warmer temperatures. These decisions are made based on knowledge of climate, which is *not* random.

On the other hand, when scheduling outside events far in advance we often will make arrangements for an alternate date in case of rain, because you cannot predict the weather.
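The statistical point in this comment can be illustrated numerically. This is a toy simulation with invented means and spreads (January highs of 0 °C ± 8, July highs of 25 °C ± 5), not real station data:

```python
# Toy illustration: individual days are noisy ("weather"), but the
# long-run averages ("climate") are stable and clearly separated.
# The means and standard deviations below are invented for the sketch.
import random
import statistics

random.seed(0)
N = 900  # roughly 30 Januaries / 30 Julys of daily highs

january = [random.gauss(0.0, 8.0) for _ in range(N)]   # mean 0 C, sd 8
july = [random.gauss(25.0, 5.0) for _ in range(N)]     # mean 25 C, sd 5

diff = statistics.mean(july) - statistics.mean(january)
# Standard error of each mean: sample sd divided by sqrt(N)
se_jan = statistics.stdev(january) / N ** 0.5
se_jul = statistics.stdev(july) / N ** 0.5

print(f"difference of means: {diff:.1f} C")
print(f"standard errors: {se_jan:.2f} C and {se_jul:.2f} C")
```

With a few hundred samples the standard errors shrink to a fraction of a degree, while the January–July difference stays near 25 °C: the averages are "highly significant" even though any single day is unpredictable.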

It’s not as economical as elsewhere, and with today’s cheap transportation it is often cheaper to import. That wouldn’t have been the case for the Vikings, so if they were to have agricultural products they would have had to grow them themselves. The article I cited shows a field of barley in Alaska, so I would expect you could grow it in Greenland today. The Vikings apparently did.

Greenland was never really green, that was Viking hype. But it was possible to do some agriculture there a millennium ago just as you can today. So what?

TalkingPoints says the Global Cooling of the 1970s is “another myth,” then goes on to cite “studies of the literature of the time.” I’m guessing he is referring to modern studies by researchers looking back at what they can dig up about the 1970s. A better source would be contemporaneous material, not research undertaken lately by warming alarmists to spin away the embarrassment of Global Cooling from 40 years ago. Then decide what was what. And one might want to give some weight to the account of a contemporaneous source who doesn’t need the Internet to tell him what he clearly remembers.

I will say, though, and I think this was evident from what I wrote, that I saw the issue as a lay person, not someone particularly up on, nor interested in, the science of Cooling. If (IF) there was a concurrent scientific viewpoint that Cooling was completely wrong and we were in fact Warming, I would say it didn’t make it into the popular literature at the time. And I think the reason that Cooling didn’t catch on then was due to issues more related to politics (a lack thereof) than science. And that Warming today has benefited more from politics… than science.

Mike Alexander raises an important point: surface temperature varies with latitude as well as time. Temperatures fall roughly one degree C for every hundred miles you move north in America.

The Bad News is that human action is driving warmer climates towards the poles, dragging ecological change in their wake.

The Interesting News is that this may not be as newsworthy as some would like, because the rate of change, a scant microdegree an hour, means a snail can outrun the rate of temperature change on the ground.

What keeps it interesting is that the melting point of water hasn’t budged: the scant degree of warming since the inception of the Age of Steam has been gnawing inexorably at the frozen Arctic and the bright sea ice whose seasonal growth modulates Earth’s reflectivity and temperature equilibrium.

As with inflation, it’s the rate that creates the symptoms, and conservatives have as much of a duty to steer us clear of runaway climate feedback as of hyperinflation.
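The “microdegree an hour” and snail figures can be checked with back-of-envelope arithmetic. A minimal sketch, assuming the comment's round numbers (about 1 °C of warming per century, a gradient of 1 °C per hundred miles); the snail speed is a rough guess, not a measured value:

```python
# Back-of-envelope check of the "snail can outrun it" claim.
# All inputs are the round numbers from the thread, not measured values.
MILES_PER_DEG_C = 100          # N-S gradient: ~1 C per 100 miles
WARMING_C_PER_CENTURY = 1.0    # ~1 C since the Age of Steam
SNAIL_MM_PER_S = 1.0           # rough cruising speed of a garden snail

hours_per_century = 100 * 365.25 * 24
# Warming rate in microdegrees C per hour
rate_uC_per_hour = WARMING_C_PER_CENTURY / hours_per_century * 1e6

# Speed at which an isotherm migrates poleward, in mm per second:
# (degrees warmed) x (miles per degree), spread over a century
mm_per_mile = 1.609344e6
isotherm_mm_per_s = (WARMING_C_PER_CENTURY * MILES_PER_DEG_C * mm_per_mile
                     / (hours_per_century * 3600))

print(f"warming rate: {rate_uC_per_hour:.2f} microdegrees/hour")
print(f"isotherm migration: {isotherm_mm_per_s:.3f} mm/s "
      f"(snail: {SNAIL_MM_PER_S} mm/s)")
```

On these assumptions the warming rate comes out near one microdegree per hour, and the isotherms creep poleward at a few hundredths of a millimeter per second, well under a snail's pace.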

I remember the seventies and mention of a possible coming ice age. It made for interesting news stories in the popular media. Some scientists initially thought it would turn out that way. However, the majority of scientists in the field were not convinced of or committed to that scenario. Here is an article that conveys this:

Nobody is trying to spin America’s long tradition of wannabe rainmakers, or the 1950s Collier’s cover headlined “ARE ATOM BOMBS CHANGING THE WEATHER?”, into watersheds in climatology.

Scarcely a decade passed in the 20th century without some newspaper of record hyping the return of glaciers to Central Park, or of crocodiles and hippos to the Thames estuary, either by fast-forwarding Milankovitch orbital forcing or by extrapolating past post-glacial warming.

What’s new under the sun is that we have since come to accurately measure and monitor solar variability, or the meaningful lack of it, using instruments outside the atmosphere, and have been endowed with enough computational power to manage the geophysical information explosion that has resulted.

Having sorted that out, and the notion of climate forcing by cosmic rays as well, the broad sense of the science is that greenhouse gases remain the biggest change in radiative forcing in modern times.

The hard part is quantifying what that means relative to the future behavior of the Earth’s most complex dynamic system.

If you want some sense of how very hard that problem is to deconvolute, I can but offer this article from the American Geophysical Union journal, Earth’s Future.

Contemporary scientists tend to dismiss or underplay these past climate cycles, suggesting for instance that the medieval warm period was confined to Europe. Historians, in their turn, are deeply suspicious, and the evidence they cite is hard to dismiss.

Is there historical (or anthropological/archeological) evidence for the medieval warm period from locations outside of Europe?

You raised the objection that these climate cycles might have been regional rather than global but said nothing about why a historian would be “deeply suspicious”.

Would others agree that, because global warming (sic) is now suddenly upon us, the massive destruction of forests, habitats and (non-human) species can continue unchecked, and that the oceans are now the de facto rubbish dumps of the world?
It is really strange that this shift in general awareness of our responsibilities to this planet happened in the last 3–5 years. These issues are much more manageable than the climate, and they have been forgotten.