Paleoclimate: The End of the Holocene

Recently a group of researchers from Harvard and Oregon State University published the first global temperature reconstruction for the last 11,000 years – that’s the whole Holocene (Marcott et al. 2013). The results are striking and worthy of further discussion, even though the authors have already commented on their results on this blog.

A while ago, I discussed here the new, comprehensive climate reconstruction from the PAGES 2k project for the past 2000 years. But what came before that? Does the long-term cooling trend that ruled most of the last two millennia reach even further back into the past?

Over the last decades, numerous researchers have painstakingly collected, analyzed, dated, and calibrated many data series that allow us to reconstruct climate before the age of direct measurements. Such data come e.g. from sediment drilling in the deep sea, from corals, ice cores and other sources. Shaun Marcott and colleagues for the first time assembled 73 such data sets from around the world into a global temperature reconstruction for the Holocene, published in Science. Or strictly speaking, many such reconstructions: they have tried about twenty different averaging methods and also carried out 1,000 Monte Carlo simulations with random errors added to the dating of the individual data series to demonstrate the robustness of their results.
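The Monte Carlo part of this procedure is easy to illustrate. The sketch below is not the authors’ actual code: it uses a single synthetic proxy record and an assumed 150-year dating uncertainty, perturbs the age model many times, and stacks the results on a common time grid, which is how a dating-error envelope of this kind arises.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for one proxy record: depth-ordered samples with
# nominal ages (yr BP) and a temperature anomaly at each sample.
ages = np.linspace(0, 11000, 56)            # nominal age model, ~200-yr spacing
temps = 0.5 * np.sin(ages / 11000 * np.pi)  # made-up "hump"-like signal

grid = np.arange(0, 11001, 20)              # common 20-yr time grid
n_mc = 1000
stack = np.empty((n_mc, grid.size))

for i in range(n_mc):
    # Perturb each dated level by a random error (assumed sigma = 150 yr),
    # then re-sort so the age model stays monotonic before interpolating.
    perturbed = ages + rng.normal(0, 150, ages.size)
    order = np.argsort(perturbed)
    stack[i] = np.interp(grid, perturbed[order], temps[order])

mean = stack.mean(axis=0)                    # ensemble-mean reconstruction
lo, hi = np.percentile(stack, [2.5, 97.5], axis=0)  # uncertainty envelope
```

Dating errors mostly smear out short-lived features, which is why the envelope, not any single realization, is the meaningful result.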

To show the main result straight away, it looks like this:

Figure 1 Blue curve: Global temperature reconstruction from proxy data of Marcott et al, Science 2013. Shown here is the RegEM version – significant differences between the variants with different averaging methods arise only towards the end, where the number of proxy series decreases. This does not matter since the recent temperature evolution is well known from instrumental measurements, shown in red (global temperature from the instrumental HadCRU data). Graph: Klaus Bitterman.

The climate curve looks like a “hump”. At the beginning of the Holocene – after the end of the last Ice Age – global temperature increased, and subsequently it decreased again by 0.7 °C over the past 5000 years. The well-known transition from the relatively warm Medieval period into the “little ice age” turns out to be part of a much longer-term cooling, which ended abruptly with the rapid warming of the 20th Century. Within a hundred years, the cooling of the previous 5000 years was undone. (One result of this is, for example, that the famous iceman ‘Ötzi’, who disappeared under ice 5000 years ago, reappeared in 1991.)

The shape of the curve is probably not surprising to climate scientists as it fits with the forcing due to orbital cycles. Marcott et al. illustrate the orbital forcing with this graphic:

Figure 2 Changes in incoming solar radiation as a function of latitude in December, June and the annual average, due to the astronomical Milankovitch cycles (known as orbital forcing). Source: Marcott et al., 2013.

In the bottom panel we see the sunlight averaged over the year, as it depends on time and latitude. It declined strongly in the mid to high latitudes over the Holocene, but increased slightly in the tropics. In the Marcott reconstruction the global temperature curve is dominated primarily by the large temperature changes in northern latitudes (30-90 °N). For this, the middle panel is particularly relevant: the summer maximum of the incoming radiation. That reduces massively during the Holocene – by more than 30 watts per square meter. (For comparison: the anthropogenic carbon dioxide in the atmosphere produces a radiative forcing of about 2 watts per square meter – albeit globally and throughout the year.) The climate system is particularly sensitive to this summer insolation, because it is amplified by the snow- and ice-albedo feedback. That is why in the Milanković theory summer insolation is the determining factor for the ice age cycles – the strong radiation maximum at the beginning of the Holocene is the reason why the ice masses of the last Ice Age disappeared.

However a puzzle remains: climate models don’t seem to get this cooling trend over the last 5,000 years. Maybe they are underestimating the feedbacks that amplify the northern orbital forcing shown in Fig. 2. Or maybe the proxy data do not properly represent the annual mean temperature but have a summer bias – as Fig. 2 shows, it is in summer that the solar radiation has declined so strongly since the mid-Holocene. As Gavin has just explained very nicely: a model-data mismatch is an opportunity to learn something new, but it takes work to find out what it is.

Comparison with the PAGES 2k reconstruction

The data used by Marcott et al. differ from those of the PAGES 2k project (which used land data only) mainly in that they come to 80% from deep-sea sediments. Sediments reach much further back in time (far beyond the Holocene – but that’s another story), unlike tree-ring data, which are mainly suitable for the last two thousand years and rarely reach further back. However, the sediment data have poorer time resolution and do not extend right up to the present, because the surface of the sediment is disturbed when the sediment core is taken. The methods of temperature reconstruction are also very different from those used with the land data. For example, in sediment data the ratio of oxygen isotopes or the ratio of magnesium to calcium in the calcite shells of microscopic plankton is used, both of which correlate well with water temperature. Thus each sediment core can be individually calibrated to obtain a temperature time series for its location.
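To give a concrete flavour of such a calibration: Mg/Ca palaeothermometry typically assumes an exponential relation of the form Mg/Ca = B·exp(A·T), which is inverted to get temperature. The coefficients below (A = 0.09 per °C, B = 0.38 mmol/mol) are in the range published for planktonic foraminifera, but treat them as illustrative; real calibrations are species- and site-specific.

```python
import math

def mgca_to_temp(mgca, A=0.09, B=0.38):
    """Invert the exponential calibration Mg/Ca = B * exp(A * T).

    A (per deg C) and B (mmol/mol) are illustrative values; published
    calibrations differ by species, shell-cleaning method and core site.
    """
    return math.log(mgca / B) / A

# Example: a shell ratio of 3.0 mmol/mol maps to roughly 23 deg C
# with these particular coefficients.
print(round(mgca_to_temp(3.0), 1))  # prints: 23.0
```

Because the relation is exponential, a fixed analytical error in Mg/Ca translates into a roughly constant temperature error, one reason this proxy is popular for downcore work.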

Overall, the new Marcott reconstruction is largely independent of, and nicely complementary to, the PAGES 2k reconstruction: ocean instead of land, completely different methodology. Therefore, a comparison between the two is interesting:

Figure 3 The last two thousand years from Figure 1, in comparison to the PAGES 2k reconstruction (green), which was recently described here in detail. Graph: Klaus Bitterman.

As we can see, both reconstruction methods give consistent results. That the evolution of the last thousand years is virtually identical is, by the way, yet another confirmation of the “hockey stick” of Mann et al. 1999, which is practically identical to both (see the graph in my PAGES article).

Is the modern warming unique?

Because of the above-mentioned limitations of sediment cores, the new reconstruction does not reach the present but only goes to 1940, and the number of data curves used already strongly declines before that. (Hence we see the uncertainty range getting wider towards the end and the reconstructions with different averaging methods diverge there – we here show the RegEM method because it deals best with the decreasing data coverage. For a detailed analysis see the article by statistician Grant Foster.) The warming of the 20th Century can only be seen partially – but this is not serious, because this warming is very well documented by weather stations anyway. There can be no doubt about the climatic warming during the 20th Century.

There is a degree of flexibility on how the proxy data (blue) should be joined with the thermometer data (red) – here I’ve done this so that for the period 1000 to 1940 AD the average temperature of the Marcott curve and the PAGES 2k reconstruction are equal. I think this is better than the choice of Marcott et al. (whose paper was published before PAGES 2k) – but this is not important. The relative positioning of the curves makes a difference for whether the temperatures are slightly higher at the end than ever before in the Holocene, or only (as Marcott et al write) higher than during 85% of the Holocene. Let us just say they are roughly as high as during the Holocene optimum: maybe slightly cooler, maybe slightly warmer. This is not critical.
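The alignment described here is just a constant-offset match over a common window. A minimal sketch (hypothetical helper, made-up series – not the actual Marcott or PAGES 2k data):

```python
import numpy as np

def align_to_reference(proxy_t, proxy_T, ref_t, ref_T, lo=1000, hi=1940):
    """Shift the proxy series by a constant so that its mean over the
    window [lo, hi] (years CE) equals the reference mean over the same
    window. Only the vertical position changes, not the shape."""
    pm = proxy_T[(proxy_t >= lo) & (proxy_t <= hi)].mean()
    rm = ref_T[(ref_t >= lo) & (ref_t <= hi)].mean()
    return proxy_T + (rm - pm)

# Tiny demo on annual grids: a proxy sitting 0.3 K above the reference
# baseline comes out matched over 1000-1940 after alignment.
years = np.arange(0, 1941)
proxy = np.zeros(years.size) + 0.3
ref = np.zeros(years.size)
aligned = align_to_reference(years, proxy, years, ref)
```

This is why the choice of matching window shifts the whole curve up or down but cannot change whether the 20th-century rise is rapid.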

The important point is that the rapid rise in the 20th Century is unique throughout the Holocene. Whether this really is true has been intensively discussed on blogs since the publication of the Marcott paper, because the proxy data have only a coarse time resolution: would they have shown a similarly rapid warming if one had occurred earlier in the Holocene?

I think for three reasons it is extremely likely that there was not such a rapid warming before:

1. There are a number of high-resolution proxy data series over the Holocene, none of which suggest that there was a previous warming spike as strong as in the 20th Century. Had there been such a global warming before, it would very likely have registered clearly in some of these data series, even if it didn’t show up in the averaged Marcott curve.

2. Grant Foster performed the test and hid some “20th C style” heating spikes in earlier parts of the proxy data to see whether they are revealed by the method of Marcott et al – the answer is a resounding yes, they would show up (albeit attenuated) in the averaged curve, see his article if you are interested in the details. [Update 18 Sept: one of our readers has confirmed this conclusion with a different method (Fourier filtering). Thanks!]
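The logic of such a spike-detection test is easy to reproduce in miniature. The sketch below is not Foster’s actual method: it injects a synthetic 20th-century-style spike into a made-up series and smooths it to mimic coarse proxy resolution, showing that the spike survives, attenuated but clearly visible.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(0, 11000, 20)                        # years BP, 20-yr steps
base = 0.3 * np.sin(t / 11000 * np.pi) + rng.normal(0, 0.05, t.size)

# Inject a 20th-century-style spike: ~0.9 K up and back down within a
# couple of centuries, centered (arbitrarily) at 6000 yr BP.
spike = 0.9 * np.exp(-((t - 6000) / 80.0) ** 2)
series = base + spike

# Mimic coarse proxy resolution with a ~400-yr moving average.
win = 21                                           # 21 * 20 yr = 420 yr
kernel = np.ones(win) / win
smoothed = np.convolve(series, kernel, mode="same")
smoothed_base = np.convolve(base, kernel, mode="same")

# The spike survives smoothing: attenuated, but well above the background.
residual = smoothed - smoothed_base
print(round(residual.max(), 2))
```

The attenuation factor depends on the ratio of spike width to smoothing window, which is exactly what Foster’s more careful analysis quantifies.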

3. Such heating must have a physical basis, and it would have to have quickly disappeared again (had it lasted, it would be even more evident in the proxy data). There is no evidence in the forcing data that such a climate forcing could have suddenly appeared and disappeared, and I cannot imagine what the mechanism could have been. (A CO2-induced warming would persist until the CO2 concentration decays again over thousands of years – and of course we have good data on the concentration of CO2 and other greenhouse gases for the whole Holocene.)

Conclusion

The curve (or better curves) of Marcott et al. will not be the last word on the global temperature history during the Holocene; like Mann et al. in 1998 it is the opening of the scientific discussion. There will certainly be alternative proposals, and here and there some corrections and improvements. However, I believe that (as was the case with Mann et al. for the last millennium) the basic shape will turn out to be robust: a relatively smooth curve with slow cooling trend lasting millennia from the Holocene optimum to the “little ice age”, mainly driven by the orbital cycles. At the end this cooling trend is abruptly reversed by the modern anthropogenic warming.

The following graph shows the Marcott reconstruction complemented by some context: the warming at the end of the last Ice Age (which 20,000 years ago reached its peak) and a medium projection for the expected warming in the 21st Century if humanity does not quickly reduce greenhouse gas emissions.

Figure 4 Global temperature variation since the last ice age 20,000 years ago, extended until 2100 for a medium emissions scenario with about 3 degrees of global warming. Graph: Jos Hagelaars.

Marcott et al. dryly state about this future prospect:

By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean.

In other words: We are catapulting ourselves way out of the Holocene.

Just looking at the known drivers (climate forcings) and the actual temperature history shows it directly, without need for a climate model: without the increase in greenhouse gases caused by humans, the slow cooling trend would have continued. Thus virtually the entire warming of the 20th Century is due to man. This May, for the first time in at least a million years, the concentration of carbon dioxide in our atmosphere has exceeded the threshold of 400 ppm. If we do not stop this trend very soon, we will not recognize our Earth by the end of this century.

Any idea why error bars are getting substantially larger as the present is approached? What’s happened to the Roman and Medieval warm periods? What’s happened to the cooling in between, when 9 m high ice ridges, coming from the Black Sea, were towering over the walls of Byzantium (Istanbul now)?

@ Bouke
Fig. 1 of “Climate forcing reconstructions for use in PMIP simulations of the last millennium (v1.0)” (G. A. Schmidt et al., 2011; gmd-4-33-2011.pdf) shows much larger insolation changes around the equinoxes than the solstices (since CE 850).

Surely with measurement of temperatures, the error bounds must decrease?

ie. We know current temperatures accurately, but will be unsure of temperatures 10,000 years ago?

[Response: The error bars on the Marcott et al. work are because (unusually) there are fewer records included as you come up to the present. This is related to the difficulty in retrieving and dating the top of ocean sediment cores, while it is relatively easy to date the termination of the glacial period (and the beginning of the Holocene). – gavin]

You say that the increase in uncertainty at the far end of the graph is due to few samples available. The authors have indicated that portion of their study is “not robust”. So why use it?

When the results for the last few decades of a multi-millennial period are uncertain and not robust, including them risks the credibility of the entire study.

[Response: Our opinions on this will obviously differ, but including the later points along with the uncertainties allows people to assess this for themselves. Plus inclusion of data that overlaps in time but does not have the same uncertainty (i.e. PAGES-2K, HadCRUT4) provides some context. Pointing out where parts of a reconstruction have more uncertainties than others adds to its credibility in my opinion. – gavin]

That portion of the data is shown for comparison – of course the instrumental measurements (shown in red in the first graph) are much more reliable for the modern time, as I discuss in the article. -stefan

If you read the post carefully, both the reason for increasing error bars, and “what happened” to the Medieval Warm Period and the Little Ice Age are addressed — and they are in fact present in the graph (Fig. 1) — just not perhaps as you expected to see them, but as consistent with a long term trend.

I ran a Fourier filtering experiment using the Marcott et al reported frequency functions for their processing, and obtain results roughly identical to Grant Foster – any Holocene era temperature changes of a magnitude similar to current warming would indeed have shown up quite clearly in the Marcott analysis. (Not to mention that current warming simply won’t reverse/vanish on the same time-scale as it has ramped up, due to the persistence of CO2…)

Note: this may be a duplicate. I noticed comments making it through moderation time stamped after I’d asked my question. Please delete if my original post actually made it and just hasn’t been moderated yet. Thanks.

If I recall correctly, Marcott et al. used the high-frequency noise in the shorter Mann et al. 2008 reconstruction to estimate what kind of shorter-term variability might be removed over the whole Holocene due to coarse proxy resolution. This yields their Figure 3, where they argue that the decade 2000–2009 has not yet exceeded the warmest temperatures of the early Holocene (5,000 to 10,000 yr B.P.) but is warmer than 72 percent of the record.

I wonder if you could do a similar analysis, but look at a frequency density function of 100-year (or 50-year) trends instead of a frequency density function of anomalies. It’s not perfect, of course, as there is no reason to think that the Mann et al. 2008 reconstruction is really characteristic of shorter-term (< 400 year) variability throughout the Holocene, but the results might be interesting. I strongly suspect they would show the recent period as somewhat unique.
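For what it’s worth, the kind of analysis suggested here is straightforward to prototype. The sketch below uses a purely synthetic series (not the Mann et al. 2008 data) just to show the mechanics of building a distribution of 100-year trends and asking where a modern-style trend would fall in it.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 11000, 20)                        # 20-yr resolution, yr BP
recon = 0.3 * np.sin(t / 11000 * np.pi) + rng.normal(0, 0.08, t.size)

def rolling_trends(times, values, window_yr=100):
    """Least-squares slope (deg C per century) in each sliding window."""
    step = times[1] - times[0]
    n = int(window_yr // step) + 1
    slopes = []
    for i in range(len(times) - n + 1):
        x, y = times[i:i + n], values[i:i + n]
        slope = np.polyfit(x, y, 1)[0]             # deg C per year
        slopes.append(slope * 100)                 # per century
    return np.array(slopes)

trends = rolling_trends(t, recon)
modern = 0.8   # a 20th-century-scale trend, roughly 0.8 deg C per century
pct = (np.abs(trends) < modern).mean() * 100
print(f"{pct:.0f}% of synthetic 100-yr trends are smaller than 0.8 C/century")
```

With realistic low-frequency variability substituted for the synthetic noise, the same machinery would give the frequency density of trends the commenter asks about.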

I am a 64 year old man in fair health, with chronic progressive aches and pains. But if I take the average readings of my health from the recent thirty years of my life, one has to say that I am vital and strong. (cough, cough)

And just this morning, I was thrilled to notice the water in my ice cube trays had frozen solid. After I add a little ethanol, I expect to greatly enjoy myself later on. But I dare not use either piece of evidence to prop up a deluded notion that I remain a young man on a cooling planet.

“It appears from the graph (Figure 3 blue line) that the unique rapid rise started in the late 1800′s.
1.) Does this suggest that humans were already putting out enough CO2 to cause global warming at that time?”

Yes, but not enough for the resulting global warming to obviously jump out of the random variations.
For context, here are a couple of graphs featuring relevant estimates:
http://cdiac.ornl.gov/trends/emis/graphics/global.total.jpg
http://cdiac.ornl.gov/trends/co2/graphics/lawdome.gif
You should generally not trust random graphs. Even if there’s a .gov in the URL suggesting they’re not simply made up, you shouldn’t trust the impression you get from eyeballing graphs that may have been chosen to produce a particular impression. However, this being RealClimate, you may hopefully trust that misleading or badly outdated data will be shot down.

Note that the more CO2 there is in the atmosphere, the less impact every additional mole has.
Also note that deforestation had a large impact on atmospheric CO2 concentrations relative to the burning of fossil fuels before the end of the 19th century.

“2.) Even though the rapid rise is unique, it may have been natural (from the late 1800′s to 1940 – end of the blue line)?”

I’m not crazy about the word “natural” in this context because of the large impact of pre-industrial human activity.

Obviously temperatures were already affected by elevated levels of CO2.
But because there have been a lot of random variations, I’m not sure the circa 1875-1940 global warming was all that rapid or unique unless you consider it as part of the longer industrial-age warming trend which is of course remarkable indeed.
But no doubt other commenters would have something more cogent to say on the matter.

Tad Boyd @13 — Anthropogenic changes to so-called natural climate trends began to be quite sizable with the clearing of forests in Europe and North America for ever-increasing agricultural fields. Coal, Satan’s cobbles, started being burned in appreciable quantities by 1750 CE, and there was already appreciable heavy industry by about 1850 CE.

That the trend in global temperature only becomes positive around 1880 CE is an indication of what is called climate inertia combined with the declining orbital forcing which had to be overcome.

“What’s happened to the Roman and Medieval warm periods? What’s happened to the cooling in between, when 9 m high ice ridges, coming from the Black Sea, were towering over the walls of Byzantium (Istanbul now)?”
There were no ice ridges. And since the walls of Constantinople were 12 meters in height, a 9-meter ‘ice ridge’ wouldn’t ‘tower over them’. There were recorded episodes of the Marmara Sea icing over in the 1st century CE, the 4th century CE, and a succession of frozen-sea winters from 800 to 1300 – but no 9-meter ice ridges. Yavuz et al., “The Frozen Bosphorus”, 2007.


Yes, it occurred to me before that the Marcott reconstruction kind of puts the MWP into perspective. Prior to the last century it looks like we had a cooling trend of about 5k years, so it would be expected that temperatures in Medieval times would be warmer than in most of the last 700-800 years and, in the absence of modern anthropogenic warming, warmer than today.

So if there was a temporary but slight deviation from the long term trend, and throw in a bit of regional variation, then it could well be the case that NH temps in medieval times were particularly warm, even approaching today’s levels (or exceeding them in some locations). But this would not be evidence that warming to the extent we have seen over the last century occurred naturally at that time without any human influence. I guess to be fair that it also counters the argument that the MWP is evidence for high climate sensitivity.

Surely the one thing this post is at pains to point out is that it does not accurately pin down the level at which temperatures began to turn, and thus when “the unique rapid rise started” is also not on offer.
However, I think your point is that the driver of such a rise would have to be strong enough by 1940 to be responsible for the lion’s share of the 0.35C rise that we know from thermometers had occurred by 1940. (Note that HadCRUT4 shows (as graphed here) declining temperatures 1850–1908 and then a 0.47C rise to 1940.) Perhaps you are also wondering how far back such a driver would have a significant impact on temperature so as to start that “unique rapid rise.”

If CO2 is considered as that driver (not a ridiculous idea in principle, although it is more complicated than that as intimated @19), how much forcing and thus how much temperature rise could we expect from our CO2 emissions by, say, 1900 or 1940?
On the back of a fag packet: from the graph presented by the Scripps Institution, CO2 measured in ice cores had risen 20 ppm by 1900, or 16% of the CO2 rise to date. By 1940, 35 ppm = 30%. The forcing per ppm is greater at lower CO2 levels, and the slower rate of CO2 rise would mean more of the forcing was potentially balanced by temperature rise. But, ignoring that, a simple pro rata rise per ppm yields +0.13C (1900) and +0.24C (1940).
Again very simplistically, if the wobbles in the Marcott et al. reconstruction are seen as a possible size of natural low-frequency wobbles, of the order of 0.1C peak to peak, the idea that “the unique rapid rise” should have started before 1900 would fit with CO2 as the dominant driver. The size of natural high-frequency wobbles may or may not be why HadCRUT4 shows a minimum in 1908. Certainly by 1940 a CO2 forcing would be dwarfing those natural low-frequency wobbles and thus appearing as a significant feature on a graph of millennial temperatures.
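The “forcing per ppm is greater at lower CO2 levels” point can be made concrete with the standard simplified logarithmic expression ΔF = 5.35 ln(C/C0) W/m². The sketch below uses roughly the commenter’s ice-core numbers (280 ppm pre-industrial, ~300 ppm by 1900, ~315 ppm by 1940, ~395 ppm “to date” – illustrative round figures, not precise data):

```python
import math

def co2_forcing(c, c0=280.0):
    """Standard simplified expression: dF = 5.35 * ln(C / C0) W/m^2."""
    return 5.35 * math.log(c / c0)

# Back-of-envelope shares of today's CO2 forcing already present by
# 1900 and 1940, using the illustrative concentrations above.
f_1900 = co2_forcing(300)
f_1940 = co2_forcing(315)
f_now = co2_forcing(395)

# Because forcing is logarithmic, the early ppm count for more: the 1900
# share comes out near 20% rather than the linear 16%, and 1940 near 34%.
print(round(f_1900 / f_now, 2), round(f_1940 / f_now, 2))  # prints: 0.2 0.34
```

So the logarithmic form nudges the pro rata estimates up a little, in the direction the commenter anticipates.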

This lengthened reconstruction of global temperatures no longer forms the shape of a hockey stick. Given that hockey sticks are traditionally made from mulberry, and that mulberry has a characteristic of growing slowly when mature (at least under natural circumstances), would it be fitting to christen the lengthened reconstruction developed here by Marcott et al. the “Mulberry Bough”?
Although myself I would end the trace at 2012 with just a marker to show the 2100 projection, it surely would remain plain to any observer with half a brain that the spurt of growth we see on the end of that Mulberry Bough is not natural.

Are there any geologists claiming that the planet has been warming for the last 11,000 years? Were there ever any geologists making that claim? And have they revised their position recently, similar to the way the Geological Society revised their position statement on recent warming?

Found that Gavin had replied to my question but it was sent to the borehole [#1300].

[Response: Other people can chime in – but I think it’s more useful to think of the early 1800’s as being a time when shorter term natural forcings (particularly volcanoes and solar) were pushing climate cooler than the long term trend would imply. Without anthropogenic influences there would have been a recovery of sorts but only to the long term trend – what has actually happened is vastly different. – gavin]

Thanks all for your responses.
I’d had the idea that more recent (than pre-1900) very high human emissions of CO2 drove global warming (the SUV/Commercial air flight era). I was aware of the idea that human activity much earlier than that had started to bring about global warming but did not know if this was generally the view of actual climate scientists. I don’t usually comment because I’m not qualified to make a point, but occasionally do post a question to try to get clarification on what actual climate scientists believe vs. all the other ideas out there and have gotten good direction here at RealClimate (especially from Gavin).

From other commenters that were kind enough to respond to my question I’ve gotten – we (humans) were warming the planet (offsetting cooling) early on (1700’s and 1800’s). Gavin indicates we would have seen a recovery of sorts without human influences but not what has actually happened.

MARodger – You laid out the influences of the 20 ppm and 35 ppm CO2 increases. (Note – in my reading, when Stefan was talking about the unique rapid rise under Figure 3, I was focused on the Marcott et al. blue line and thought that was the context for his description, but using HadCRU as the context for the unique rapid rise Stefan is talking about works too, as the rapid rise pre-1940 in red is even more dramatic.) Your response helps a great deal to sharpen my clumsy question.

Stefan and/or Gavin – I was assuming that the 1940-and-before portion of the rise (Marcott et al. blue line, but the HadCRU red line works too) was at least part of what Stefan was referring to as the unique rapid rise, but MARodger has indicated my assumption is off in some manner, so any instruction regarding that is appreciated. If my assumption is valid in some sense, is the uniqueness of the pre-1940 rapid rise the result of the human-activity part of the 20 ppm and 35 ppm CO2 increases MARodger discusses?

Sorry if MARodger’s response should have sufficed. It is possible I am in the <50% brain category. I do appreciate that actual climate scientists are willing to interact at this level helping separate what the scientists believe vs. the rest.

Introduction to Geological Perspectives of Global Climate Change
Power Point Presentation

Lee C. Gerhard

History has seen many memorable public confrontations between belief systems and science data. Despite the scientific merit of the data, belief systems are powerful, endemic forces against which science must struggle. Some modern examples are evolution and global climate change.

Notes:

Two versions of the presentation are included, one for scientists, without on-slide comment, and one for the lay public with some additional notation.

These presentations will be updated as new information becomes available.

The notes for the scientist version contain many of the base references; a reading list, partly annotated, is appended to this introduction.

In Fig 2 I am having a hard time reconciling Mean Dec and Mean Annual. The Mean June makes perfect sense – high insolation in N. Hemisphere extending to Arctic. The Mean Dec does not show high insolation in S. Hemisphere or Antarctic yet the Annual Mean then shows high insolation in Antarctic.

Also, with the large solar forcing 8–10 thousand years ago (“more than 30 watts per square meter”, in summer I assume), I would have expected even higher temperatures than the modern 20th century. Why is that expectation wrong?

[the UN] Endorses the action of the World Meteorological Organization and the United Nations Environment Programme in jointly establishing an Intergovernmental Panel on Climate Change to provide internationally co-ordinated scientific assessments of the magnitude, timing and potential environmental and socio-economic impact of climate change and realistic response strategies

When I read these two things, I get the urge to add Lee C. Gerhard to my list of intellectually bankrupt people.

Bouke,
Care to explain this extract from the “history” page of the ipcc.ch site.
Nothing about NOT human-induced climate change…

“Today the IPCC’s role is as defined in Principles Governing IPCC Work, “…to assess on a comprehensive, objective, open and transparent basis the scientific, technical and socio-economic information relevant to understanding the scientific basis of risk of human-induced climate change, its potential impacts and options for adaptation and mitigation. IPCC reports should be neutral with respect to policy, although they may need to deal objectively with scientific, technical and socio-economic factors relevant to the application of particular policies.”

[Response: Understanding “the scientific basis of the risk of human-induced climate change” obviously requires knowing the context of natural climate change and variability – explaining the amount of space in the IPCC reports given to issues associated with paleo-climate, ENSO, the AMO, signal-to-noise calculations, detection and attribution etc. This idea that IPCC doesn’t look at internal variability or natural forcings is just bogus. – gavin]

we investigate organic carbon sedimentation patterns and surface water productivity changes in response to Holocene climate variability in northern high latitudes. We studied 197 surface sediment samples, 10 short (ca. 10-30 cm long) and 2 long (3-4 m) cores representing the last 10000 years in the western sector of the Barents Sea between Spitsbergen and the Norwegian mainland.

The Barents Sea is a shallow (mean depth ca. 200 m) Arctic Ocean shelf sea which is partially covered by sea ice in the northeast in the winter. Warm, saline Atlantic water (AW) enters the Barents Sea in the southwest and flows east- and northward until it is subducted under cold, fresh Arctic water (ArW) which enters the Barents Sea from the northeast and flows southwestward. The region where the AW is subducted under the ArW is called the Polar Front (PF). Its position is mainly topographically controlled but also depends on the relative strengths of the two water masses. This is also the region of the winter ice margin. Previous research has shown that the Barents Sea is one of the most productive shelf seas in the world.
…
We use OF-Mod 3D, an organic facies modeling software developed by SINTEF Petroleum Research, to reconstruct PP changes and organic carbon accumulation across the western Barents Sea throughout the past 10000 years. OF-Mod 3D is a predictive, process-based, forward-modeling tool used to calculate organic matter preservation in a 3D grid throughout the modeled domain. The surface sample data set is used to calibrate the model to the present-day situation and we will show the spatial distribution of modeled PP throughout the Barents Sea region. We include the position of the PF in the model as a northern boundary for PP during the winter and compare organic matter accumulation in the presence and absence of sea ice.

Keying off 26 and 27:
Gerhard attempts a smear with “The mission of the Intergovernmental Panel on Climate Change (IPCC), a United Nations organization, is not to study causes of climate change, but to document only one cause, human impacts on climate.”
In a lay-down, he’s right. Our climate would not have nearly the focus, the research, the reviews, and the controversy, if it was just ‘understanding it all’. But while Gerhard thinks he’s exposing a secret agenda scam, he’s actually admitting that he can’t join the dots. He doesn’t understand that rising GHG pollution is contributing to a disruption of the climate so geologically sudden and ubiquitous in extent, that the parallels best matching it are extinction events. That’s exactly why the IPCC synthesizes a global state of the change every X years. Gerhard displays a kind of Victorian attitude that somehow if the IPCC just stopped talking about it, it would be less of a problem.

Gavin,
“This idea that IPCC doesn’t look at internal variability or natural forcings is just bogus.”
Yes it’s bogus. Of course they look at it. Whether or not they FOCUS on it is the question.
What the quoted paragraph says, read at face value, is that this internal variability is just background (the proof being that it is not even mentioned). They say they assess impacts, mitigation and adaptation of HUMAN-induced climate change – implying that the IPCC is there to confirm a preconceived theory/idea: HUMAN-induced climate change is the most important thing to care about. The IPCC is not there to care about the impacts of natural variability. The scope of work is clear.
I guess this is because internal variability had already been found to have a negligible impact compared to the human-induced kind by the time the IPCC was created. But then I hear people saying: it’s hard for IPCC people to question the premises on which it was created.

[Response: This is just very strange. The IPCC is indeed designed to focus on the potential for anthropogenic impacts on climate and I don’t see what is so bad about that. The answer might well have been that there weren’t any important ones. Other review panels (on the Supersonic Stratospheric Transports etc.) come up with not much and are generally set aside – this happens all the time. The IPCC premise is that there is a potential issue worth investigating (a true statement), not that climate sensitivity is some fixed number or that natural variability is negligible. The idea that scientists would work on an assessment that they knew to be pointless is nuts. It’s not like these things are fun. – gavin]

Can anyone point me to articles discussing the accuracy of the various proxies themselves? I apologize for my ignorance on this, but it seems to me that our understanding of the relationship between the proxy data and temperature should be calibrated and tested against the observed temperature record. The error bars seem to indicate that we know the exact global temperature of 1000 years ago to within 0.2 °C, but where we have actual temperature data, the error margin is more than a degree (which more than covers the observed warming).

In short, do the error bars indicate that science is 95% confident of the real global temperature to within less than 1 °C for the last few thousand years, or that it is 95% confident that the proxies, using the best available techniques to calibrate against global temperature, are accurate to within 1 °C?
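For intuition (this is not from the paper, just a generic illustration with entirely made-up numbers): proxy calibration is typically a regression over the period where the proxy and the instrumental record overlap, and the fitted relation is then applied to the proxy values from before the instrumental era. A toy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a hypothetical proxy (say, an isotope ratio) that
# tracks temperature linearly plus measurement noise, over a 50-year
# instrumental overlap period. All numbers are invented.
temp = np.linspace(13.0, 14.0, 50)                  # observed temperatures, deg C
proxy = 2.0 * temp + 1.0 + rng.normal(0, 0.1, 50)   # made-up proxy values

# Calibrate: fit a linear map from proxy to temperature over the overlap...
slope, intercept = np.polyfit(proxy, temp, 1)

# ...then a reconstruction applies that map to older proxy values.
reconstructed = slope * proxy + intercept
rmse = np.sqrt(np.mean((reconstructed - temp) ** 2))
```

The calibration residuals (the `rmse` here) are one ingredient of the error bars; dating uncertainty and spatial sampling are others, which is part of why the question above is not trivial.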

You do present your enquiries in an interesting way, so I don’t know but perhaps this might help. (Note that I am by no means a “climate scientist,” actual or otherwise.)

Because Marcott et al only shows long-term fluctuations, a robust comparison between the blue uptick at the end of the Fig 3 trace and HadCRUT4 is always going to be difficult, especially with the Marcott et al error bars flying off so rapidly. Note also the comment in the post about the “flexibility” of such a “join”, although this does not affect the size of the Marcott and the HadCRU rises.
Perhaps to compare apples with apples, HadCRUT4 should be represented far less precisely, say by just 3 × 55-year averages: 1877 -0.32 °C, 1932 -0.19 °C, 1987 +0.16 °C. The BEST land temperature record suggests that HadCRUT4 is not showing all of this “strong rising trend”, because in the BEST data it appears to start earlier than 1850. So, like the 5 final green dots from PAGES2K, the 1800-1940 rise of 0.25 °C in the Marcott et al blue trace is not far from what we see in the thermometer records.
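Those 3 × 55-year block averages are easy to reproduce mechanically; a minimal sketch (the anomaly series below is a made-up placeholder trend, not the real HadCRUT4 data):

```python
import numpy as np

# 165 years of annual anomalies, 1850-2014 (placeholder linear trend
# standing in for the real HadCRUT4 series)
years = np.arange(1850, 2015)
anoms = np.linspace(-0.4, 0.3, years.size)

# Coarsen to non-overlapping 55-year blocks, like a low-resolution proxy
block = 55
n = (years.size // block) * block
block_years = years[:n].reshape(-1, block).mean(axis=1)
block_means = anoms[:n].reshape(-1, block).mean(axis=1)
print(block_years)  # midpoint years of the three blocks: 1877, 1932, 1987
```

The block midpoints come out at 1877, 1932 and 1987, matching the three dates quoted above; only the anomaly values would differ with the real data.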

Thus far this analysis is fuzzing out the short term wobbles. But you perhaps allude to wondering how much of the pre-1940 temperature rise (presumably the 1910-40 period) is wobble and how much is “strong rising trend.”

The climate forcings that lie behind the recent wobbles and the “strong rising trend” of HadCRUT4 can be calculated with varying accuracy. The graphs here by Sato et al present forcings from 1880 to date. Note the green plot – it is mostly, but by no means all, CO2, and it is not flat back in 1880 (my point @26 with the +20 ppm and +35 ppm). This green plot is the big positive driver behind the “strong rising trend”, but there are also very significant negative drivers.
It is a major task of climatology to establish these forcings more accurately and to marry the forcings with the temperature record. While this work remains incomplete, there is room for speculation about natural oscillations – speculation that deniers love to over-amplify into reasons for ignoring that green plot in the Sato et al graph. But my take-away from the Sato et al graph is that the green plot is a little too scary to be ignored.

For most of the time covered by these graphs, the thermal mechanism would appear to be more-or-less in a steady state, just quietly following various changes in the forcings. We do not appear to be in a steady state at the moment.
If we assume the CO2 concentrations stay at the current levels, and if we ignore any positive feedback from methane and so on, is it possible to calculate what temperature we might get to, given enough time?

Peter Smith #45,
That’s the big question; what the sensitivity metric was created for.
Assuming all forcings (methane, smog and such) except CO2 at pre-industrial levels and atmospheric CO2 at the current level, the “fast” feedbacks (such as snow cover, but excluding the very slow melting of the large ice masses) should get the average global temperature to around 0.52 × sensitivity above the pre-industrial average (if I recall correctly).
Now the sensitivity is likely to turn out to be around 2 to 4 °C, so (ignoring the many caveats) the answer you’re looking for would be between +0.7 and +1.8 on the graph at the top of the page.
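That 0.52 factor is roughly ln(C/C₀)/ln 2 – the fraction of a CO2 “doubling” represented by going from pre-industrial to current concentrations. A minimal check of the arithmetic (the ~280 ppm and ~400 ppm values are my assumptions, not stated in the comment):

```python
import math

C0 = 280.0  # assumed pre-industrial CO2, ppm
C = 400.0   # assumed current CO2, ppm

# CO2 radiative forcing scales with ln(C/C0), and sensitivity S is
# defined per doubling, so the equilibrium fast-feedback warming is
# S * ln(C/C0) / ln(2).
doublings = math.log(C / C0) / math.log(2.0)
print(f"{doublings:.2f}")  # ~0.51, close to the 0.52 quoted above

for S in (2.0, 3.0, 4.0):  # plausible sensitivity range, deg C per doubling
    print(f"S = {S:.0f} C -> +{doublings * S:.1f} C above pre-industrial")
```

With S between 2 and 4 °C this gives roughly +1.0 to +2.1 °C above pre-industrial; the comment’s +0.7 to +1.8 is the same range expressed relative to the baseline of the graph at the top of the page.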

Re Peter Smith @ #45, it is instructive to look at what earth’s climate was like the last time its atmosphere contained 400 ppm CO2, which would be around 3 million years ago during the mid-Pliocene. Global mean temperature then was 2-3C higher than today (and much higher towards the poles), it’s thought there was a near constant El Nino state, and the West Antarctic ice sheet had collapsed, raising sea level by as much as 25 meters. That said, at about this time the Panama seaway closed, which dramatically changed ocean circulation patterns, and uplift of the Rocky Mountains was also taking place, so the mid-Pliocene is far from a perfect analogy.

All: keep an eye out for Bill Ruddiman’s Earth Transformed, which is supposed to be out in October (after some confusion about dates). I think this sheds more light on the last ~8,000 years, relevant both to Marcott et al and to PAGES2K. Compared to past interglacials (when well-aligned, which took a lot of work), the usual slow CH4/CO2 downtrends had been slowed well before the Industrial Revolution started.

The cover illustrates the part of the story that comes from Chinese rice paddy archaeology and methane curves compared to other interglacials.

Considering that humans have now emitted about 550 billion tonnes of CO2 to date, that 1 trillion tonnes emitted brings on the 2 °C danger threshold (often spoken of by those concerned about such things), and that we now emit around 33 billion tonnes a year, rising at 2-3% per annum, the trillion tonnes would be passed just a few decades from now.
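A back-of-the-envelope sketch of that cumulative arithmetic. Two caveats, both my own: the 2.5% growth rate is an assumption within the quoted 2-3% range, and the trillion-tonne threshold is usually framed in tonnes of *carbon* (currently emitted at roughly 10 Gt C per year), not of CO2 – the choice of unit changes the answer considerably:

```python
def years_to_threshold(emitted, threshold, annual, growth):
    """Years until cumulative emissions cross the threshold,
    with annual emissions growing at a fixed fractional rate."""
    years = 0
    while emitted < threshold:
        emitted += annual
        annual *= 1.0 + growth
        years += 1
    return years

# Taking the comment's round figures at face value, units as written:
print(years_to_threshold(550.0, 1000.0, 33.0, 0.025))  # -> 12

# With the annual figure in Gt of carbon (~10 Gt C/yr, my assumption),
# which is the framing that yields the "few decades" timescale:
print(years_to_threshold(550.0, 1000.0, 10.0, 0.025))  # -> 31
```

Either way the remaining headroom is consumed quickly once emissions growth is compounded.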

Is it all bad? Probably not – but it’s not all good either, and some will definitely suffer from increased flooding and droughts, food shortages, and more than likely war as a consequence of such things occurring. It might already be happening in Syria and Egypt due to food pressures caused by ACC.

The higher the temperature gets, the worse it will probably be – although I am sure that places such as southern England can continue to reap the rewards of growing pinot noir and chardonnay grapes for sparkling fizz.