The recent US election has prompted cries that the decision on Earth’s climate has now been irrevocably made, that the US has unilaterally decided to scrap the Paris Agreement’s peak warming target of 1.5 °C. What do the numbers say? Is Earth’s climate now irrevocably fracked?

The short answer is that, strictly speaking, the future of global climate would have been fracked even had the election gone the other way, unless stronger action to cut CO2 emissions is taken very soon.

Here are some numbers. Carbon emissions from the United States have been dropping since the year 2000, more than on track to meet the country’s target for 2020. With continued effort and improving technology, emissions might have dropped below that target, let’s say to 5 gigatons of CO2 per year (5,000 megatons in the plot). Instead, let’s now say that removing restrictions on inefficient energy use and on air pollution leads to US emissions of about 7 gigatons of CO2 per year by 2020, assuming that emissions growth returns to the faster rates of the 1990s.

Maybe neither of these things will happen exactly, but these scenarios give us a high-end estimate for the difference between the two, which comes to about 4 gigatons of CO2 over four years. There will also probably be extra emissions beyond 2020 due to the lost opportunity to decarbonize and streamline the energy system between now and then. Call it 4-6 gigatons of Trump CO2.

This large quantity of gas can be put into the context of what it will take to avoid the peak warming threshold agreed to in Paris. In order to avoid exceeding a very disruptive warming of 1.5 °C with 66% probability, humanity can release approximately 220 gigatons of CO2 after January 2017 (IPCC Climate Change 2014 Synthesis Report, Table 2.2, corrected for emissions since 2011). The 4-6 gigatons of Trump CO2 will not by themselves put the world over this threshold. But global CO2 emissions are now about 36 gigatons per year, giving a time horizon of only about six years of business-as-usual (!) before we cross the line, leaving basically no time for screwing around. For the catastrophic threshold of 2 °C, about 1,000 gigatons of CO2 remain (about 20 years of business as usual). Note that these estimates were made before the global temperature spike that began in 2014; we are currently at 1.2 °C! So these temperature boundaries may be closer than was recently thought.
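As a rough check on those time horizons, here is a minimal arithmetic sketch (Python, using only the round numbers quoted above; it is illustrative, not a substitute for the IPCC budget accounting):

```python
# Rough time horizons implied by the carbon budgets quoted above, at the
# current global emission rate of ~36 Gt CO2 per year (all numbers as in
# the text; this is a back-of-the-envelope check, not the IPCC accounting).

EMISSION_RATE = 36.0  # Gt CO2 per year, business as usual

for label, budget_gt in (("1.5 degC (66% chance)", 220.0), ("2 degC", 1000.0)):
    years = budget_gt / EMISSION_RATE
    print(f"{label}: {budget_gt:.0f} Gt CO2 left -> ~{years:.0f} years at current rates")

# 1.5 degC: ~6 years. 2 degC: ~28 years at a constant rate; the rounder
# "about 20 years" in the text presumably allows for the emissions already
# made since the AR5 budget's 2011 starting point and for continued growth.
```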

An optimistic hope is that humanity may soon feel the need to clean up the atmosphere by direct CO2 removal. The American Physical Society estimates a cost for this of about $600 per ton of CO2. On that basis, cleaning up the carbon emitted by the US over the next four years would come to roughly $8-10 trillion, which amounts to about 14% of US GDP over that period. Even under the scenario that lost the election, some $6 trillion of clean-up costs would have been incurred (8% of GDP).
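The arithmetic behind these cost figures is straightforward; here is a minimal sketch (the $600 per ton figure is the APS estimate quoted above, while the four-year GDP total is an illustrative round number of roughly $18 trillion per year):

```python
# Back-of-the-envelope clean-up costs for direct CO2 removal at the APS
# estimate of ~$600 per ton of CO2 quoted above. The four-year GDP total
# below is an illustrative round number (~$18 trillion per year).

COST_PER_TON = 600.0        # dollars per ton of CO2 removed
GDP_FOUR_YEARS = 72e12      # rough US GDP summed over four years, dollars

def cleanup_cost(gigatons_co2):
    """Dollars needed to remove a given number of gigatons of CO2 from the air."""
    return gigatons_co2 * 1e9 * COST_PER_TON

# The extra 4-6 gigatons of "Trump CO2" alone:
for gt in (4, 6):
    print(f"{gt} Gt CO2 -> ${cleanup_cost(gt) / 1e12:.1f} trillion")

# The $8-10 trillion quoted for all US emissions over the next four years,
# expressed as a share of GDP over the same period:
for total_cost in (8e12, 10e12):
    share = 100 * total_cost / GDP_FOUR_YEARS
    print(f"${total_cost / 1e12:.0f} trillion is ~{share:.0f}% of four-year GDP")
```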

If you are in a new-found panic about the future of Earth’s climate, know that what you’re feeling now would still have been almost as appropriate had the election gone the other way. The fight to defend Earth’s climate would still be just beginning.

Global temperature goes from heat record to heat record, yet the sun is at its dimmest for half a century.

For a while, 2010 was the hottest year on record globally. Then it was surpassed by 2014. And 2014 was beaten again by 2015. Now 2016 is so warm that it is certain to be yet another record year. Three record years in a row: that is unprecedented, even across all those decades of global warming.

Strangely, one aspect of this gets barely mentioned: all those heat records have occurred despite a cold sun (Figs. 1 and 2). The last solar minimum (2008-2010) was the lowest since at least 1950, while the last solar maximum (2013-2015) can hardly be described as such. This is shown, among other indicators, by the sunspot data (Fig. 1) as well as by satellite measurements of solar luminosity (Fig. 2). Other indicators of solar activity likewise point to a decline (Lockwood and Fröhlich, Proc. Royal Society 2007).

Fig. 1: Time evolution of global temperature, CO2 concentration and solar activity. Temperature and CO2 are scaled relative to each other according to the physically expected CO2 effect on climate (i.e. the best estimate of transient climate sensitivity). The amplitude of the solar curve is scaled to correspond to the observed correlation of solar and temperature data. (Details are explained here.) You can generate and adapt this graph to your taste here, where you can also copy a code with which the graph can be embedded as a widget on your own website (as on my home page), so that it is automatically updated each year with the latest data. Thanks to our reader Bernd Herd, who programmed this.

As climate scientists we are by no means surprised at this development, as there has been clear evidence that the variations of the sun’s activity have played a completely subordinate role in climate change over the last 65 years. We’ve covered this issue many times, e.g. here, here and here. Global warming is driven by greenhouse gases, which is a long-standing consensus in science.

The current IPCC report, for example, limits the natural contribution to global warming since 1950 to less than ±0.1 °C (it might even have been negative, e.g. because of the fading sun). And some unsupported claims by “climate skeptics” about the importance of solar variability have now been clearly falsified.

“Climate skeptics” have repeatedly predicted imminent global cooling because of the weak sun. And attributing global warming to the sun has become untenable, because solar activity has not increased for the last 65 years. It has been essentially constant, except for the well-known 11-year Schwabe cycle (which also has little effect on global temperature) and a slight downward trend.

Misunderstood thermal inertia

The usual excuse of the skeptics here is that global warming is a time-delayed reaction to an increase in solar activity before 1950. The basic idea is not entirely wrong: the climate system has a certain inertia. If solar luminosity increased in a sudden step, the temperature would not rise immediately, as it takes a while to heat up the oceans. This inertia effect can be quantified with the help of model simulations. Caldeira and Myhrvold (ERL 2013) have shown that 60% of the temperature response occurs within the first 20 years.
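To get a feel for what that degree of inertia implies, here is a toy single-exponential step response (this is not the Caldeira and Myhrvold model; the e-folding time is simply chosen so that 60% of the equilibrium response is reached after 20 years, matching the figure quoted above):

```python
import numpy as np

# Toy illustration of the climate system's thermal inertia: a single-exponential
# response to a sudden step in solar luminosity. This is NOT the Caldeira &
# Myhrvold model; the e-folding time is simply chosen so that 60% of the
# equilibrium response is reached after 20 years, matching the figure above.

tau = 20.0 / np.log(1.0 / 0.4)   # ~21.8 years, so that 1 - exp(-20/tau) = 0.6

for t in (10, 20, 50, 100):
    realized = 1.0 - np.exp(-t / tau)
    print(f"after {t:3d} years: {100 * realized:.0f}% of the equilibrium warming realized")

# ~60% after 20 years and ~90% after 50 years: a delayed response to pre-1950
# forcing cannot explain warming that is concentrated after 1970.
```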

However, around 80% of global warming since the 19th century has taken place after 1970. It is therefore out of the question that the slight and gradual increase in solar activity before 1950 could have contributed significantly to the strong warming since the 1970s. Further evidence for this is the comparison of temperatures over land and sea. Everyone knows that when the sun rises in the morning, it takes only a few hours (certainly not decades) to heat the air strongly. Over 90% of the thermal inertia resides in the ocean, while the air over land heats up quickly. If the global warming since 1970 were a delayed response to an earlier increase in solar luminosity, we would now observe, above all, the oceans catching up. The opposite is the case: the continents are warming more quickly, and the ocean temperatures lag behind.

Another point: even if solar variability, for some magical reason, had a noticeable warming effect over the last decades, this would have to come in addition to the CO2 effect and would not call it into question. The warming effect of CO2 on climate is physically well understood, and the sensitivity of global temperature to CO2 is independently confirmed by paleoclimatic data, see e.g. Rohling et al. 2012 or the brand-new paper by Friedrich et al. 2016 (here is a nice write-up on this paper by Peter Hannam in the Sydney Morning Herald).

Wrong forecasts

Some “climate skeptics” have been courageous enough to make forecasts. A notable example is former German energy manager Fritz Vahrenholt (who once claimed in an interview that Greenland was nearly free of ice in the Middle Ages). In 2010 Vahrenholt (who was then in a leading position with the energy utility RWE, Europe’s largest CO2 emitter) published a newspaper article with the beautiful sentence:

The winters are becoming noticeably harsher. That worries all those who are concerned about why global warming is obviously pausing.

(Which it was not, but never mind.) He also knew the cause:

Of course, it’s the sun, stupid!

In his 2012 book Die kalte Sonne (co-written with Sebastian Lüning, also from RWE; the title literally translates as The Cold Sun), he then presented his own forecast for the evolution of global temperature until 2030. In Figure 3 we compare this to measured data. No comment required.

Figure 3: Measurements of global temperature (NASA GISTEMP, 12-month moving average) compared to the forecast of global temperature to 2030 by Vahrenholt and Lüning, after Figure 73 of their book. (Image by Stefan Rahmstorf, Creative Commons BY-SA 4.0.)

Vahrenholt and Lüning’s book does have one clear merit, however, and that is its title. The Cold Sun nicely sums up the fact that the sun is currently weak – good to know at a time of unprecedented global warming!

P.S. For comparison with Vahrenholt’s forecast, here are earlier model projections of global temperature made for the IPCC (the CMIP3 model ensemble used in the 4th IPCC assessment report, published in 2007) together with the observed changes in temperature (the four colored curves). Graph by Gavin. (The agreement with the most recent set of models, CMIP5, was recently discussed here by Gavin.)

Don’t make a choice that your children will regret

Dear US voters,

the world is holding its breath. The stakes are high in the upcoming US elections. At stake is a million times more than which email server one candidate used, or how another treated women. The future of humanity will be profoundly affected by your choice, for many generations to come.

The coming four years are the last term during which a US government still has the chance, jointly with the rest of the world, to do what is needed to hold global warming well below 2 °C and closer to 1.5 °C, as 195 nations unanimously agreed in the Paris Agreement last December. The total amount of carbon dioxide the world can still emit while keeping at least a 50% chance of stopping warming at 1.5 °C will, at the current rate of emissions, be used up in under ten years! This time can only be stretched out by making emissions fall rapidly.

Meltwater on the Greenland Ice Sheet. Photo with kind permission by Ragnar Axelsson.

In case you have any doubts about the science: in the scientific community there is a long-standing consensus that humans are causing dangerous global warming, reflected in the clear statements of many scientific academies and societies from around the world. None of the 195 governments that signed the Paris Agreement saw any reason to doubt the underlying scientific facts; the doubts about the science that you see in some media are largely manufactured by interest groups trying to fool you.

You have a fateful choice to make. The policies of the candidates and parties on climate change could hardly be more different. Hillary Clinton would continue to work with the international community to tackle the global warming crisis and help the transition to modern, clean and renewable energies. Donald Trump denies that the problem even exists and has promised to go back to coal and to undo the Paris Agreement, which comes into force today, the 4th of November 2016, as the culmination of over twenty years of negotiations.

Please consider this carefully. This is not an election about personalities, it is about policies that will determine our future for a long time to come. While the presidential race has gotten the most attention, voters should consider climate not just at the ‘top of the ticket’, but all the way down the ballot. Don’t make a choice that you, your children and your children’s children will regret forever.

Carbon storage in WA state forests is too small and too risky to play a serious role as a climate change mitigation tool

guest post by John Crusius, Richard Gammon, and Steve Emerson

The scientific community is almost universally in agreement that climate change and ocean acidification are severe threats that demand a rapid response, with putting a price on fossil fuel CO2 emissions being a top priority. Far and away the single biggest contributor to climate change is CO2 from fossil fuel combustion. Indeed, global CO2 emissions from fossil fuels in recent years have been roughly ten times higher than emissions from the next largest global source, land use change including deforestation (Le Quéré et al., 2015). Despite the comparatively small size of carbon fluxes from forests, enhancing carbon storage in forests is often discussed in WA state as a tool to fight climate change. One such claim appeared in the Seattle Times Op-Ed of October 21 by Mathew Randazzo. We challenge the claim that forest carbon sequestration in WA state can significantly help solve climate change. Randazzo does not spell out in any detail what he means, and as always, details matter in such discussions, as the science is complex. We focus here on some of the best available science on the climate and carbon storage impacts of forests, and provide references at the bottom of this article from some of the premier scientific journals in the world.

It is easy to understand why many would like carbon storage in WA state forests to be a viable tool to fight climate change, as forestry is an important industry in the state. Such a solution, at first glance, seems like it could support the local forestry industry and create local jobs. However, mitigating climate change requires responses that make scientific sense. Devoting resources to forest carbon sequestration is largely a distraction from the real work needed to mitigate climate change, which is to reduce emissions of greenhouse gases, most importantly CO2 from fossil fuel combustion. Before we explain the counterintuitive science, we wish to acknowledge at the outset that there are many excellent reasons to support planting trees in WA state and to support the local forestry industry. However, mitigating the threat of climate change is not among those reasons, based on the available science.

In temperate parts of the world (mid to high latitudes), such as the Pacific Northwest, the impacts of forests on climate are complex. Forest growth does take up CO2 from the atmosphere, which is the impact on climate most people think of. However, forests have other, lesser-known impacts on climate as well, including trapping moisture below the forest canopy and altering the way sunlight is reflected off the landscape (termed albedo). In temperate regions such as WA state, forests can actually warm the climate through these effects on moisture trapping and reflectivity (albedo) more than they cool the climate by taking up CO2. This was shown in a recent study of reforestation and forest management in Europe, which found that the changes of the last 250 years caused a net warming, not a net cooling (Naudts et al, 2016).

It is in the tropical and subtropical latitudes, far south of WA state, where the science indicates that carbon storage in forests could have the most beneficial effect on the world’s climate and could possibly help buy time until society reduces fossil fuel emissions substantially (Houghton et al, 2015). Even in the tropics, relying on forest carbon storage is risky. Carbon stores could be released back into the atmosphere at any point in response to fire or disease, each of which can be made worse by climate change. Indeed, one recent study of forests in the Amazon region concluded that forests there went from taking up CO2 to releasing it during one dry year (Gatti et al, 2014). Furthermore, there have been suggestions that tropical forests may become a net source of CO2 in response to greater extremes of rainfall (Gatti et al, 2014). In order for carbon storage even in tropical forests to be beneficial, the carbon must remain stored essentially permanently (for many hundreds to thousands of years). No one can guarantee that future climate change, disease, and/or land use change won’t cause release of this forest carbon back into the atmosphere, which would bring us back to the starting point, before any forest carbon storage efforts were even attempted.

It is urgent that society act quickly to minimize the risks posed by both climate change and ocean acidification. However, any solution must stand up to the rigorous test of the best available science. We quote from some journals cited below. “Considering carbon storage on land as a means to ‘offset’ CO2 emissions from burning fossil fuels (an idea with wide currency) is scientifically flawed” (Mackey et al, 2013). “Today’s forest management is more of a gamble than a scientific debate” (Bellassen and Luyssaert, 2014). “Above-ground carbon in forests represents a vulnerable pool of carbon, subject to droughts, fires, insects and other disturbances. Thus, the management of forests to accumulate carbon must not delay or dilute the phasing-out fossil fuel use. On the contrary, the deliberate accumulation of carbon on land may be of little long-term benefit” (Houghton et al, 2015). “Relying on biospheric sequestration is not without risk, because such sequestration is reversible from either climate changes, direct human actions, or a combination of both” (Pan et al, 2011). The best science tells us that relying on storage of carbon in WA state forests is risky at best, and quite possibly counterproductive. It is also in many ways a distraction from the essential efforts to reduce emissions of CO2 from fossil fuels.

There is an interesting news article ($) in Science this week by Paul Voosen on the increasing transparency of climate model tuning. (Full disclosure: I spoke to him a couple of times for this article and I’m working on a tuning description paper for the US climate modeling centers.) The main points of the article are worth highlighting here, even if a few of the characterizations are slightly off.

The basic thrust of the article is that climate modeling groups are making significant efforts to increase the transparency and availability of model tuning processes for the next round of intercomparisons (CMIP6). This partly stems from a paper from the MPI-Hamburg group (Mauritsen et al, 2012), which was perhaps the first article to concentrate solely on the tuning process and the impact that it has on important behaviour of the model (such as its sensitivity to increasing CO2). That isn’t to say that details of tunings were not discussed previously, but the tendency was to describe them briefly in the model description papers (such as Schmidt et al. (2006) for the GISS model). Some discussion has appeared in IPCC reports too (h/t Gareth Jones), but not in much depth. Thus useful information was hard to collate and compare across all model groups, and it turns out that matters.

For instance, if an analysis of the model ensemble tries to weight models based on their skill compared to observations, it is obviously important to know whether a model group tuned their model to achieve a good result or whether it arose naturally from the basic physics. In a more general sense this relates to whether “data accommodation” improves a model’s predictive skill or not. This is quite subtle though: weather forecast models obviously do better if they have initial conditions that are closer to the observations, and one might argue that for particular climate model predictions that are strongly dependent on the base climatology (such as for Arctic sea ice) tuning to the climatology will be worthwhile. The nature of the tuning also matters: allowing an uncertain parameter to vary within reasonable bounds and picking the value that gives the best result is quite different from inserting completely artificial fluxes to correct for biases. Both have been done historically, but the latter is now much rarer.
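To illustrate why that knowledge matters, here is a deliberately oversimplified toy example of skill-based weighting (this is not the procedure of any particular analysis, and the model names, fields and error levels are invented): a model that has been adjusted to match the target field earns a large weight whether or not that match would have emerged from its physics untuned.

```python
import numpy as np

# Toy skill-weighting of an ensemble: weights proportional to exp(-RMSE^2/sigma^2)
# against an "observed" field. Everything here is synthetic; the point is only
# that a model tuned towards the observations scores well regardless of whether
# the agreement reflects better physics.

rng = np.random.default_rng(0)
obs = rng.normal(size=100)                                  # stand-in observed climatology

models = {
    "tuned_to_obs": obs + rng.normal(scale=0.1, size=100),  # adjusted to fit the target
    "untuned_good": obs + rng.normal(scale=0.5, size=100),
    "untuned_poor": obs + rng.normal(scale=1.0, size=100),
}

sigma = 0.5                                                 # skill scale parameter
rmse = {name: np.sqrt(np.mean((field - obs) ** 2)) for name, field in models.items()}
raw_weight = {name: np.exp(-(err / sigma) ** 2) for name, err in rmse.items()}
total = sum(raw_weight.values())

for name in models:
    print(f"{name}: RMSE = {rmse[name]:.2f}, weight = {raw_weight[name] / total:.2f}")
```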

A recent summary paper in BAMS (Hourdin et al., 2016) discussed current practices and gave results from a survey of the modeling groups. In that survey, it was almost universal that groups tuned for radiation balance at the top of the atmosphere (usually by adjusting uncertain cloud parameters), but there is a split on practices like using flux corrections (two-thirds of groups disagreed with that). This figure gives some more details:

Summary results on tuning practices from the survey of CMIP5 modeling groups published in Hourdin et al. (2016).

The Science article though does make some claims that I don’t think are correct. I assume these are statements that are paraphrases from scientists that the writer talked to, but they would have been better as quotes, as opposed to generalisations. For instance, the article claims that

“… climate modelers [will now] openly discuss and document tuning in ways that they had long avoided, fearing criticism by climate skeptics.
…
The taboo reflected fears that climate contrarians would use the practice of tuning to seed doubt about models— and, by extension, the reality of human driven warming. “The community became defensive,” [Bjorn] Stevens says. “It was afraid of talking about things that they thought could be unfairly used against them.”

This is, I think, demonstrably untrue, since tuning has been discussed widely in papers and also here on RealClimate. Perhaps it does reflect some people’s opinion, but it is not true generally.

The targets for tuning vary across groups, and again, it matters which you pick. Tuning to the seasonal cycle, the climatological average, or the variance of some field (all of which can be well characterised from observations) is different from tuning to a transient change over time, which is often less well known. Indeed, many groups specifically leave transient changes out of their tuning procedures in order to reserve those trends for out-of-sample evaluation of the model (approximately half the groups, according to the Hourdin et al survey).

The article says something a little ambiguous on this:

Indeed, whether climate scientists like to admit it or not, nearly every model has been calibrated precisely to the 20th century climate records—otherwise it would have ended up in the trash. “It’s fair to say all models have tuned it,” says Isaac Held.

Does that mean the global mean surface temperature trends over the 20th Century, or just that some 20th Century data is used? And what does ‘precisely’ mean in this context? The spread of 20th Century trends (1900-1999) in the CMIP5 simulations, [0.25, 1.17] ºC, is clearly too broad to be the result of precisely tuning anything! On a similar issue, the article contains an example of the MPI-Hamburg model being tuned to avoid a 7ºC sensitivity. That is probably justified since there is plenty of evidence to rule out such a high value, but tuning to a specific value (albeit within the nominal range of 2 to 4.5ºC) is not justified. My experience is that most groups do not ‘precisely’ tune their models to 20th Century trends or climate sensitivity, but given this example and the Hourdin results, more clarity on exactly what is done (whether explicitly or implicitly) is needed.

One odd comment relates to the UK Met Office/Hadley Centre models:

Proprietary concerns also get in the way. For example, the United Kingdom’s Met Office sells weather forecasts driven by its climate model. Disclosing too much about its code could encourage copycats and jeopardize its business.

It would be worrying if the centers didn’t discuss tuning in the science literature through fear of commercial rivals, and I don’t think this really characterises the Hadley Centre position. Some groups’ code (incl. the Hadley Centre’s) is, however, restricted for various reasons, though I personally see that as an unsustainable position in the long term if groups want to partake in international model intercomparisons that will be used for public policy.

The article ends up on an interesting note:

Daniel Williamson, a statistician at the University of Exeter in the United Kingdom, says that centers should submit multiple versions of their models for comparison, each representing a different tuning strategy. The current method obscures uncertainty and inhibits improvement, he says. “Once people start being open, we can do it better.”

I think this is exactly right. We should be using alternate tunings to expand the representation of structural uncertainty in the ensemble, and I hope many of the groups will take this opportunity to do so.

Q & A about the Gulf Stream System slowdown and the Atlantic ‘cold blob’

Last weekend the Arctic Circle Assembly, the large annual conference on all aspects of the Arctic, was held in Reykjavik. One of this year’s topics was: what is going on in the North Atlantic? This referred to the conspicuous ‘cold blob’ in the subpolar Atlantic, on which there were lectures and a panel discussion (Reykjavik University had invited me to give one of the talks). Here I want to provide a brief overview of the issues discussed.

What is the ‘cold blob’?

This refers to exceptionally cold water in the subpolar Atlantic south of Greenland. In our paper last year we showed it like this (see also our RealClimate post about it):

Sometimes the term ‘cold blob’ is used not for this long-term trend but for a recent snapshot: 2015 was the coldest year in this region since records began in 1880, despite 2015 being globally the warmest year on record. In the ‘cold blob’ discussion, one must therefore keep in mind whether the topic is the long-term trend or a short-term anomaly. When we wrote our paper, of course, we did not know that its publication would coincide with record cold in the area.

You can see the current ‘cold blob’ when looking at maps of the sea surface temperature for example on Climate Reanalyzer.

Fig. 2: Anomaly of sea surface temperature (relative to the 1971-2000 base period) on 6 October 2016. Source: Climate Reanalyzer.

What is the cause of the cold blob?

In principle, there can be two reasons for a change in ocean temperature: heat exchange through the surface, or heat transport within the ocean. Halldór Björnsson of the Icelandic weather service showed in his lecture on Saturday that the short-term temperature fluctuations from year to year correlate with the heat exchange through the sea surface, but that this does not explain the longer-term development of the ‘cold blob’ over decades. He concluded that the latter is caused by changes in the North Atlantic ocean circulation, also called the Gulf Stream System. That is exactly what one expects: weather dominates the short-term fluctuations, but the ocean currents dominate the long-term development.

One suggestion made some years ago, that the cooling might be caused by aerosol pollution dimming the sunlight, did not come up in the discussion on Saturday. In the scientific literature that idea was quickly contradicted at the time, for good reasons (we discussed this in more detail in our paper).

What evidence speaks for a slowdown of the Gulf Stream System?

The basic problem is the lack of direct, continuous measurements of the key circulation in the Atlantic, the so-called AMOC (Atlantic Meridional Overturning Circulation). Such measurements are only available since 2004 through a series of moorings at 26°N (RAPID project). For the longer term development, one must therefore use indirect indicators of the flow.

In a 2010 study, my colleagues Mihai Dima and Gerrit Lohmann of the Alfred Wegener Institute in Germany analysed the patterns of change in global sea surface temperatures. They were the first to conclude that the AMOC has been weakening since the 1930s. The evidence for this is the cooling trend in the subpolar North Atlantic, which anti-correlates with temperatures in the South Atlantic (suggesting reduced heat transport from the South Atlantic to the North Atlantic). In addition, Dima and Lohmann found an anti-correlation with the temperatures off the US East Coast, to the south-west of the ‘cold blob’. This is not seen in Fig. 1 above, since the NASA data use a smoothing radius of 1200 km, but you can see it, for example, in the currently high temperatures in Fig. 2.

The latest high-resolution simulations of the GFDL in Princeton show precisely this pattern in response to a CO2 increase in the atmosphere (discussed more in this RealClimate post). In the model, the cause is a slowdown of the Gulf Stream System. There are also coral data from the Gulf of Maine off the US coast, which indicate that the water masses there have changed over time in a way similar to the ‘cold blob’ (discussed further in the same post).

Fig. 3: Index of the strength of the overturning circulation in the Atlantic (AMOC), calculated as the temperature of the subpolar Atlantic minus the mean temperature of the Northern Hemisphere (red and blue curves). The green curve shows the coral data of Sherwood and colleagues. Source: Rahmstorf et al., Nature Climate Change 2015.

For the most recent past, the Atlantic flow index we calculated from the temperature pattern is consistent with other data. For the time since 2004, for which there are direct measurements of the AMOC from the RAPID array, the downward trend of 3 Sv measured there agrees with our indirect estimate. The significant slowdown after 1970 and the subsequent recovery from about 1990 in our curve have been confirmed by other studies using other methods (see e.g. Haine 2016 and its schematic diagram).
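For readers who want to experiment with this themselves, the index behind Fig. 3 is conceptually very simple. Here is a minimal sketch (the temperature series are random placeholders; in the paper the index is computed from observed gridded surface temperatures, NASA GISTEMP):

```python
import numpy as np

# Conceptual sketch of the AMOC index of Rahmstorf et al. (2015): the mean
# temperature anomaly of the subpolar-gyre region minus the Northern Hemisphere
# mean anomaly. The two series below are placeholders for the real area-averaged
# observations.

n_years = 100
rng = np.random.default_rng(1)
subpolar_gyre_anom = rng.normal(size=n_years)   # area-mean anomaly, subpolar Atlantic
nh_mean_anom = rng.normal(size=n_years)         # area-mean anomaly, Northern Hemisphere

amoc_index = subpolar_gyre_anom - nh_mean_anom  # cooling relative to the hemisphere
# A downward trend in this index is read as a weakening of the northward ocean
# heat transport, i.e. a slowdown of the AMOC.
print(amoc_index[:5])
```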

What speaks against a slowdown of the Gulf Stream System?

As a counter-argument against a weakening of the Gulf Stream System, Steingrímur Jónsson on Saturday brought up the measurements of the so-called “overflow” from the Nordic Seas across the sills between Greenland, Iceland and Scotland, which do not show any trend. Here one must simply distinguish between different parts of the Atlantic ocean circulation. In our study, we argue that the AMOC in the open Atlantic has weakened, i.e. to the south of the ‘cold blob’, where the heat comes from. This is what is measured by the RAPID array. The overflows further north are (i) unlikely to have an influence on the temperatures in the ‘cold blob’, and (ii) largely independent of the AMOC in the open Atlantic; at least that is suggested by a model simulation of the Max Planck Institute for Meteorology in Hamburg, for which we show a correlation analysis in Fig. 2b of our paper.

Another counterargument (though not brought up in professional discourse but on a “climate skeptics” website) is that the measurements on the Oleander line across the Gulf Stream show no slowdown (Rossby et al. 2014). However, these cover only a 20-year period for which our AMOC index also does not show any slowdown. And as Tal Ezer showed in a study in 2015, these measurements of the Gulf Stream don’t correlate with the AMOC measurements of the RAPID array – which is not surprising because the AMOC is only a minor component of the mainly wind-driven Gulf Stream. Therefore these diverse measurements of other aspects of the complex Atlantic ocean circulation are by no means inconsistent with a general long-term slowdown of the AMOC as proposed by Dima and Lohmann.

In our paper, we argued that the meltwater input from the Greenland ice sheet could play a so far neglected role (but not the main role, as some have misunderstood). The standard IPCC models, for example, have not yet taken this meltwater input into account. A new study by Claus Böning and colleagues (Nature Geoscience, 2016) has specifically studied the effect of added meltwater in a high-resolution ocean model. It was assumed that the Greenland ice sheet begins to lose mass in 1990, starting with a melt rate of zero that increases linearly until 2020. The authors find little influence on the flow, but this is hardly surprising given the design of the experiment. The real Greenland ice sheet did not start losing mass only in 1990; it takes time for the meltwater to spread and accumulate, and the ocean circulation also reacts with delay and inertia. The prescribed cumulative meltwater amount in the model experiment is 7,500 cubic kilometers over the period 1990-2020, over half of which is added during the last ten years, so the AMOC has little time to react. In our paper, based on data from Jason Box of the Geological Survey of Denmark and Greenland, we estimated that the Greenland ice sheet has been out of equilibrium since the beginning of the 20th century and has since added about 13,000 cubic kilometers of meltwater to the ocean. The response of the AMOC could therefore be greater than in the model experiment.
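The numbers of that experiment are easy to check; here is a small sketch of the prescribed linear meltwater ramp (purely illustrative, following the description above):

```python
# The Böning et al. experiment prescribes a melt rate of zero in 1990 that
# increases linearly until 2020, with a cumulative total of 7,500 km^3.
# For such a ramp the cumulative input grows like t^2, so most of the water
# arrives late in the experiment.

TOTAL_KM3 = 7500.0
RAMP_YEARS = 30.0                                # 1990 to 2020

peak_rate = 2.0 * TOTAL_KM3 / RAMP_YEARS         # 500 km^3/yr at the end of the ramp
last_decade_share = 1.0 - (20.0 / 30.0) ** 2     # fraction added after 2010

print(f"peak melt rate: ~{peak_rate:.0f} km^3 per year")
print(f"fraction added in the last ten years: {100 * last_decade_share:.0f}%")  # ~56%
```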

In a further experiment, Böning and co-workers showed that a cumulative freshwater volume of ~20,000 cubic kilometers leads to a breakdown of deep convection and a slowdown of the AMOC by 5 Sv within a few years. The main conclusion of their study is not that Greenland melt has no influence on the AMOC; rather, as its title says, it shows an “Emerging impact of Greenland meltwater on deepwater formation in the North Atlantic Ocean”. It thus supports our argument that the contribution of Greenland melt should not be neglected.

Incidentally, the meltwater hardly has a direct cooling effect in the ‘cold blob’ region. Its effect comes rather from diluting the sea water with freshwater, which reduces its density and thereby hinders the sinking of water that drives the AMOC.

Is the cold blob caused by humans?

An important question, of course, is whether the changes in the subpolar Atlantic are caused by humans or are part of natural variability. In my opinion, this is a question of the time scale considered: the variations from year to year are obviously dominated by weather, and decadal variations, such as the warming (probably a strengthening of the flow) from 1990 to the middle of the 2000s and the subsequent cooling (a slowdown of the flow), are also likely to be mainly natural. In contrast, the long-term trend since 1930 identified by Dima and Lohmann is in my view largely anthropogenic. As the proxy reconstruction in our paper shows, it is probably unique in the context of the previous one thousand years. It is also predicted by climate models in response to the rising greenhouse gas content of the atmosphere.

Regarding the 2015 record cold in the subpolar Atlantic, the arguments are a mirror image of the discussion about the global heat record of 2015. For the latter, the question was: El Niño or global warming? The answer is the combination of both. If the natural variation goes in the same direction as the human-caused trend, then a new record can be set. If the natural variation goes in the opposite direction, it can more than offset the climate trend for a while.

The potential impacts are increasingly being studied; here, briefly, are a few examples. Haarsma et al. (2015) argue on the basis of model simulations that the weakening of the Gulf Stream System will in future be the main cause of changes in the atmospheric summer circulation over Europe. Jackson et al. (2015) found that a slowdown is likely to lead to increased storm activity across Britain and parts of mainland Europe. And a new study by Duchez et al. (2016) connects the ‘cold blob’ in the summer of 2015 to the heatwave across Europe that year, because the cold subpolar Atlantic favors a certain air pressure distribution.

A 25-minute video lecture by me, recorded in Iceland last May. Source: Earth101. This project has many more clips with Gavin, Mike, me and other climate scientists; subscribe to their YouTube channel (it’s free) to keep them coming.

Could the AMOC break down entirely?

This risk has been discussed since the 1980s, originally because of paleoclimatic data showing a number of abrupt AMOC changes in the course of Earth’s history. It is now well understood that there is a critical tipping point in the system. How far we are from it, however, is not known. Earlier model intercomparisons suggest that a freshwater flow on the order of 0.1 Sv (the equivalent of 3,000 cubic kilometers per year) could be critical. There are some arguments suggesting that models might systematically overrate the stability of the AMOC, which we summarized in PNAS in 2009. An assessment from 2011, commissioned by the European Environment Agency, concluded that the system may be more sensitive than suggested by earlier assessments.
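For reference, here is the unit conversion behind that equivalence (1 Sverdrup = 10^6 cubic meters per second); the exact value depends only on the length of year used:

```python
# Conversion behind "0.1 Sv is roughly 3,000 cubic kilometers per year".
SV_IN_M3_PER_S = 1.0e6                    # 1 Sverdrup = 10^6 m^3/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600     # ~3.16e7 s

flux_sv = 0.1
km3_per_year = flux_sv * SV_IN_M3_PER_S * SECONDS_PER_YEAR / 1e9   # m^3 -> km^3
print(f"{flux_sv} Sv is about {km3_per_year:.0f} km^3 per year")    # ~3,156 km^3/yr
```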

Postscript on the Arctic Circle Assembly. On Saturday, UN Secretary General Ban Ki-moon was awarded the Arctic Circle Prize for his long-standing commitment to the successful conclusion of the Paris Climate Agreement. He received standing ovations from the more than a thousand conference participants in the overcrowded hall. In his acceptance speech he emphasised the record-breaking speed with which the agreement has been ratified; after clearing the final hurdle a few days ago, it will come into force on 4 November! “What was once considered impossible has now become unstoppable,” he said. “We must now turn words into deeds and implement the Paris Agreement. We do not have a plan B, since we do not have a planet B.”

Ban also answered audience questions, and when asked to name a few key lessons he had learned during his time in office, he said: there are many key players, so just don’t leave it to governments! He called upon civil society and the business world to push for a low-carbon and climate-resilient economy.

For the evening, the President of Iceland, Guðni Jóhannesson, invited some Icelandic government members (including Prime Minister Sigurður Jóhannsson and Foreign Minister Lilja Alfreðsdóttir) and a handful of scientists to his mansion for a dinner in honor of Ban Ki-moon, where we further discussed the dramatic climate changes in the Arctic and the progress of climate policy.

Link

The Icelandic newspaper Morgunbladid has a special issue for the Arctic Circle Assembly with magnificent photos and two articles by Ragnar Axelsson and a series of interviews (e.g. with Mike Mann from page 20 and with me from page 16).

Nature today published a great new reconstruction of global temperatures over the past 2 million years. Snyder (2016) uses 61 temperature reconstructions from 59 globally diverse sediment cores and a correlation structure from model simulations of the last glacial maximum to estimate (with uncertainties) the history of global temperature back through the last few dozen ice age cycles. There are multiple real things to discuss about this (the methodology, the relatively small number of cores used compared to what could have been analyzed, the age modeling, etc.) and many interesting applications (constraints on polar amplification, the mid-Pleistocene transition, the duration and nature of previous interglacials), but unfortunately the bulk of the attention will be paid to a specific (erroneous) claim about Earth System Sensitivity (ESS) that made it into the abstract and was the lead conclusion of the press release.

The paper claims that ESS is ~9ºC and that this implies that the long term committed warming from today’s CO2 levels is a further 3-7ºC. This is simply wrong.

The Snyder (2016) reconstruction of global temperatures compared to Antarctic ice core temperature and CO2, and deep water temperatures.

I recently posted a summary of why you can’t constrain ‘Earth System Sensitivity’ (the long-term response of the climate system, including ice sheets, vegetation etc.) just by looking at the regression between global temperature and the forcing from CO2 (and other greenhouse gases) over the ice age cycles. That regression has been looked at before, and Snyder (2016) updates it with her new (and slightly higher amplitude) temperature reconstruction. Unfortunately, she then associates this regression with the Earth System Sensitivity (which it is not) to get a value of ~9ºC for a doubling of CO2.
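To see roughly where a number like 9ºC comes from, here is an illustrative back-of-the-envelope version of such a regression using round glacial-interglacial values (these are not Snyder’s data or her actual method, just a sketch of the arithmetic being criticized):

```python
import numpy as np

# Illustrative glacial-interglacial "regression" of the kind criticized above,
# using round numbers rather than Snyder's reconstruction: CO2 of ~190 ppm at
# the Last Glacial Maximum vs ~280 ppm pre-industrial, and ~5 degC of global
# glacial-interglacial temperature change.

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Approximate CO2 radiative forcing in W/m^2: 5.35 * ln(C/C0)."""
    return 5.35 * np.log(c_ppm / c0_ppm)

delta_forcing = co2_forcing(280.0) - co2_forcing(190.0)   # ~2.1 W/m^2
delta_temp = 5.0                                           # degC, glacial to interglacial

slope = delta_temp / delta_forcing                         # degC per W/m^2 of CO2 forcing
per_doubling = slope * 5.35 * np.log(2.0)                  # forcing per doubling ~3.7 W/m^2
print(f"{per_doubling:.1f} degC per CO2 doubling")         # ~9 degC

# The slope lumps the response to CO2 together with the (orbitally paced)
# ice-sheet and carbon-cycle feedbacks, which is exactly why it cannot be
# read as the Earth System Sensitivity.
```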

In the previous post, I outlined how the combination of carbon cycle feedbacks to the Milankovitch forcing and the climate system response to CO2 gives rise to this correlation, and that, by itself, it can’t be used to define the latter term. Furthermore, because the regression is defined over ice age cycles where the biggest changes are related to the (now disappeared) North American and Fenno-Scandinavian ice sheets, the regression might well be much smaller for situations where only Greenland and West Antarctica are “in play”.

So, what do we think the ESS is, and how does that impact our view of committed warming today? There have been a few papers on this: Hansen et al (2008), Lunt et al (2010) and Palaeosens (2012). They have focused on the warmer climates of the Cenozoic (the Pliocene, etc.) to avoid the confusion from the response of ice sheets to orbital forcing during the ice age cycles of the Pleistocene, but they obviously have significant uncertainties due to less precision about ancient greenhouse gas levels. The ESS values range from about 4.5ºC to 6ºC for a doubling of CO2. One other study came up with substantially higher numbers derived from Pliocene CO2 concentrations (Pagani et al, 2010), but a number of assumptions made there limit its applicability to today’s climate.

The current energy imbalance (just a little less than 1 W/m2) implies that the planet would need to warm by ~1 × S/3.7 ºC to restore equilibrium, which, using the standard climate sensitivity of S = 3ºC for a doubling of CO2, implies a committed warming of 0.8ºC or so. To leap from that to the claimed 3-7ºC warming requires that the changes in the ice sheets and vegetation arising from the current and (medium-term) committed warming would increase the radiative imbalance by another ~2 W/m2, and that is extreme (for reference, the change between the last ice age and now from these factors is only about 4 W/m2 (Kohler et al, 2010), with the huge impact of the N. American and European ice sheets included).
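Spelled out, the committed-warming arithmetic in the paragraph above goes like this (numbers as quoted in the text):

```python
# Committed warming implied by the current energy imbalance.
IMBALANCE = 1.0    # top-of-atmosphere energy imbalance, W/m^2 (a little less than this)
F_2XCO2 = 3.7      # radiative forcing of a CO2 doubling, W/m^2
S = 3.0            # standard equilibrium climate sensitivity, degC per doubling

committed = IMBALANCE * S / F_2XCO2
print(f"committed warming: ~{committed:.1f} degC")   # ~0.8 degC

# For an extra 3-7 degC to be "in the pipeline", slow feedbacks (ice sheets,
# vegetation) would have to add roughly another 2 W/m^2 of forcing, compared
# with ~4 W/m^2 for the entire glacial-to-present contribution of these factors
# (Kohler et al, 2010).
```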

Thus, Snyder’s claim of a committed warming of 3-7ºC is based on an incorrect method for defining ESS and an inappropriate application of it to the present, and it is inconsistent with current estimates of the radiative imbalance and plausible future changes in ice sheets.

For clarification, this is not an argument about what ESS is, nor is it a claim that we should ignore the long-term responses of the system to current forcing, nor does it mean that paleoclimate has nothing to tell us about future changes (au contraire!). The objection I have is only to this very specific calculation and its subsequent application.

I don’t want to speculate on how this situation came about. However, the maxim that “extraordinary claims require extraordinary evidence” suggests that it might have behoved the editors to require further checks of such a dramatic statement before going to press. As it is, many will claim that this is yet another example of a high-profile journal going for press attention over rigorous science. The net result of the conflicting media reports and criticism will likely be greater confusion about the relevant science, and an overshadowing of what is at heart a good contribution to understanding climate history, and that is a shame.