An Unpublished Law Dome Series

Oxygen isotope series are the backbone of deep-time paleoclimate. The canonical 800,000 year comparison of CO2 and temperature uses O18 values from Vostok, Antarctica to estimate temperature. In deep time, O18 values are a real success story: they clearly show changes from the LGM to the Holocene that cohere with glacial moraines.

On its face, Law Dome, which was screened out by Gergis and Karoly, is an extraordinarily important Holocene site: it is, to my knowledge, the highest-accumulation Holocene site yet known, with accumulation almost 10 times greater than the canonical Vostok site. (Accumulation is directly related to resolution: high accumulation enables high resolution.) The graphic below compares glacier thickness for some prominent sites over three periods: 1500-2000, 1000-1500 and 0-1000. Law Dome's resolution in the past two millennia is nearly double that of the Greenland GRIP and NGRIP sites, which have been the topic of intensive study and publication.
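The accumulation-resolution link is simple arithmetic: the more ice deposited per year, the fewer years each cut sample spans. A back-of-the-envelope sketch, where the accumulation figures are illustrative round numbers rather than measured values from the cores:

```python
# Illustrative accumulation rates (m ice-equivalent per year) - round numbers
# for the sketch, not measured values from either core.
LAW_DOME_DSS = 0.7   # assumed high-accumulation site
VOSTOK = 0.02        # assumed low-accumulation site

SAMPLE_LENGTH = 0.05  # a hypothetical 5 cm cut along the core

def years_per_sample(accumulation_m_per_yr, sample_m=SAMPLE_LENGTH):
    """Years of deposition spanned by one core sample (ignoring layer thinning)."""
    return sample_m / accumulation_m_per_yr

print(years_per_sample(LAW_DOME_DSS))  # well under a year: many samples per annual layer
print(years_per_sample(VOSTOK))        # 2.5 years: one sample smears several years together
```

Layer thinning at depth makes the real calculation harder, but the order-of-magnitude contrast is the point: a high-accumulation site supports sub-annual sampling where a low-accumulation site cannot.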

Given the high reliance on O18 series in deep time, one would think that paleoclimatologists would be extremely interested in a publication of the Law Dome O18 data and be pressuring Tas van Ommen on this point.

But despite the apparent opportunity offered by Law Dome, there has been virtually no technical publication of a high-resolution O18 or delD isotope series. In 1997, Morgan and van Ommen (here) published a sketchy diagram based on a core, the upper part of which had some technical problems. Although the core offered more resolution than at other locations where annual dating was done, Morgan and van Ommen only showed results back to 1304. They provided this data to Phil Jones, who used it in Jones et al 1998.

The Australians supplemented the long core with two short cores in 1997 and 1999 but did not publish a 1000-year isotope series.

By 2003, they had calculated a 2000-year series at 4-year intervals, but again did not publish this. The unpublished data was again provided to Phil Jones, who used it in Mann and Jones 2003 and plotted it in Jones and Mann 2004. (It showed surprisingly high values at AD1000 and before.) I sought the data from van Ommen as early as 2003; he put me off saying that he planned to publish these results. In 2006, after putting me off again, van Ommen sent me the data, again expressing his intent to publish the results, but, for one reason or another, he didn’t get around to it.

As I’ve mentioned before, the Law Dome series was discussed by IPCC authors in the preparation of AR4. Their Southern Hemisphere graphic showed two proxies: Cook’s Tasmanian and Oroko Swamp NZ tree ring chronologies. As noted a few days ago, these two proxies are the only two proxies in the medieval portion of the Gergis et al network. So despite its claims to novelty, there is nothing new in its medieval portion.

A Climategate email shows that Phil Jones asked about the omission of the Law Dome series from the IPCC illustration in the AR4 First Draft. I asked the same question about the AR4 Second Draft. They realized that the Law Dome graphic had an elevated medieval period and thus that including it in the graphic would – to borrow a phrase from the preparation of AR3 – “dilute the message” and perhaps provide “fodder to skeptics”. CRU’s Tim Osborn, expert in such matters, proposed that they discuss Law Dome in the running text (thus providing themselves deniability) but not illustrate it in the graphic (since a picture was worth a thousand words). CLA Overpeck endorsed Osborn’s sly ‘solution’, sneering at the supposed lack of expertise of anyone even raising the “ambiguity” in the first place:

Hi Tim, Ricardo and friends – your suggestion to leave the figure unchanged makes sense to me. Of course, we need to discuss the Law Dome ambiguity clearly and BRIEFLY in the text, and also in the response to “expert” review comments (sometimes, it is hard to use that term “expert”…). Ricardo, Tim and Keith – can you take care of this please. Nice resolution, thanks.

In making this proposal, Osborn observed (CG2 3092, 2006-07-18):

(2) Goosse et al. showed Deuterium excess [for Law Dome] as an indicator of Southern Ocean SST (rather than local temperature). Goosse et al. also showed a composite of 4 Antarctic ice core records (3 deuterium, 1 O18). Neither of these comes up to the 20th century making plotting on the same scale as observed temperature rather tricky!

I’d followed Law Dome fairly closely but was unaware of any archived deuterium (delD) or deuterium excess data for Law Dome. Re-examining Goosse et al, 2004 (coauthors of which included AR5 CLA Valerie Masson-Delmotte and her husband as well as Law Dome’s Tas van Ommen) in light of this comment, Goosse et al Figure 3b did indeed contain a graph showing deuterium excess from Law Dome as shown below (green):
Original caption to Goosse et al Figure 3. …(b) [Annual mean temperature averaged over the region 45–60S, 90–130E in an ensemble of 10 simulations (grey) and their mean (red)] and the deuterium excess measured in the Law Dome ice core, 67S–113E (green curve, left vertical axes).

In their text, Goosse et al described the development of the deuterium data for Law Dome as follows:

In addition to some previously published data sets, water stable isotopes (d18O, dD) measured at approximately annual resolution along the DSS (Dome Summit South) ice core drilled on Law Dome (66.46S, 112.48E, 1390 m above sea level) have been used for model-data comparison. The measurement technique and analytical precision are described in Delmotte et al. [2000] and Masson-Delmotte et al. [2003], but we provide here a longer time series. The deuterium excess is a second order isotopic parameter (d = dD – 8*d18O) significantly driven by the sea surface temperature at the oceanic moisture source of the precipitation which, for Law Dome, is mainly located in the Indian sector of the Southern Ocean. If no major change occurs in the location of the moisture source, deuterium excess is thus related to change in sea surface temperature there [Delmotte et al., 2000; Masson-Delmotte et al., 2003].
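The deuterium-excess definition quoted by Goosse et al is a one-line calculation. A minimal sketch, using hypothetical per-mille values chosen purely for illustration:

```python
def deuterium_excess(dD, d18O):
    """Second-order isotopic parameter d = dD - 8*d18O (per mille)."""
    return dD - 8.0 * d18O

# Hypothetical per-mille values, for illustration only:
print(deuterium_excess(-150.0, -19.5))  # -> 6.0
```

Because d is a small difference between two large negative numbers, measurement errors in either dD or d18O propagate into it amplified, which is relevant to the precision discussion later in the thread.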

I recently wrote to Tas van Ommen seeking the unarchived delD data. He said that I would have to get it from its originator, Valerie Masson-Delmotte, who had, in connection with Gergis, expressed her hope that AR5 would be based on publicly available data:

We are aware that not all funding agencies or publishers follow a consistent strategy with regard to the public release of data associated with published articles. Regarding your specific concerns, we are confident that the next draft of our chapter will be based on new publications associated with publicly available datasets.

I haven’t heard from her on the Law Dome deuterium series yet.

But aside from my scuffles with IPCC authors over archiving, I wish to express my surprise over the extraordinarily dilatory publication of the near-time Law Dome isotope data. I chided van Ommen about this – van Ommen is pretty reasonable about answering questions. He explained:

Our research focus has been on trace chemistry work, snowfall rates and also for me an excursion into ice sheet work. This will shortly change with papers either just submitted or about to be in coming weeks. These will benefit from improved dating arising from the trace chemistry studies and annual values back to ca. AD170. As soon as any of the publications are accepted we will be archiving the corresponding data set.

On a personal level, I understand that people can’t do everything in the world. But nonetheless, the deep Law Dome hole was drilled between 1987 and 1992. It provides the highest resolution ice core for the two-millennium period. And remains unpublished. Amazing.

Postscript: Here is a reminder of what the 2003 O18 version looked like. An annual version for two millennia was provided to Gergis (who screened it out). delD and O18 are closely related, and presumably the unarchived delD series will look somewhat similar.

Update June 13: In a 1985 article on an earlier core (also showing elevated MWP values), Morgan commented on potential aliasing of the data by changes in accumulation (and thus elevation). Slight and feasible increases in elevation could cause the observed secular decline in d18O values. An increase in accumulation would, of course, be of interest for potential sea level changes.

The present day d18O versus elevation relation for the Law Dome is d18O = -0.0054 E(m) – 15.3 (Morgan, 1980) so a 0.1‰ aliasing, which is just significant in the d18O profile, requires an elevation change of 19 m. If such changes are climatically induced, their effect is mainly to reduce the amplitude of the d18O changes, because an increase in temperature gives increased accumulation which, after some time, produces a higher ice cap surface with correspondingly reduced surface temperature.
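Morgan's figure can be checked directly: under the quoted linear relation, the elevation change needed to produce a given d18O shift is just the shift divided by the slope magnitude. A sketch (the slope is from the Morgan 1980 relation quoted above; the rest is arithmetic):

```python
SLOPE = -0.0054  # per mille of d18O per metre of elevation (Morgan, 1980)

def elevation_change_for_shift(d18O_shift_permille):
    """Elevation change (m) producing a given d18O shift under the linear relation."""
    return d18O_shift_permille / abs(SLOPE)

# A 0.1 per-mille shift requires roughly the 19 m Morgan quotes:
print(round(elevation_change_for_shift(0.1), 1))
```

0.1 / 0.0054 comes out at about 18.5 m, consistent with the "19 m" in the quoted text.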

Update June 13: Deuterium excess data for Law Dome from 890-1841 was archived at Hugues Goosse’s website this morning (http://www.elic.ucl.ac.be/repomodx/users/hgs/Law_Dome_deutxs.dat). Here’s what it looked like. The shift around AD1000 in this series is quite pronounced. The Goosse et al 2004 graphic cuts off at AD1000; looking closely, it shows a downspike at the start, but it’s a much more prominent feature in the version shown below.

A number of issues are related to the quantitative interpretation of water stable isotopes in relationship with precipitation intermittency and moisture sources, in addition to the direct impact of condensation temperature on distillation.

I’ve asked them to also archive the underlying deuterium series, and asked why there are no values after 1841.

Yup, doing a quick visual “categorization” of the individual graphs, I count:

6 show a general warming trend
11 show a general cooling trend
10 show a general flat trend

On RealClimate, one of the moderators admitted today that their arguments are all based on statistics (not science).

Q: “how do you know the relatively few proxies used are valid and not just more random noise?”

A: “The answer to the question you are trying to ask is the one I already answered: the mean interseries correlations and their relationship to local temperature are waaaaaaaaaaaaaaaaaay too strong and too frequent to be explained by chance.”

To Just some Guy (12.47): when the RC moderator (‘Jim’) goes on to state that “It’s not even a topic for legitimate consideration frankly; indeed, it’s game-playing–”, one’s suspicions may be aroused that an attempt is being made to distract attention from an issue. Jim sounds desperate.

As a newbie seeking knowledge, I am curious as to what comparison was banned. What did Salamano say? Seems this topic has been discussed before? Can someone point out to me where?

At RealClimate, they are convinced that the statistics are on their side: “correlations and their relationship to local temperature are waaaaaaaaaaaaaaaaaay too strong and too frequent to be explained by chance”.

I am not convinced, especially after looking at the messy, random noise being passed off as data. What is so strong about it? Is there some extra data on local instrumental records we are missing, which perhaps shows an extremely strong correlation to the local proxies?

(Unfortunately, it’s impossible to have an actual debate at RealClimate because of the way the moderators abuse their power to block comments.)

Unclear what your level of knowledge or interest is. Click my name for general intro. Check “favorite posts” and google “categories” on the sidebar here. True relationship of proxies to local temperatures at another place (Yamal, Siberia) shown here.

Wow, that’s a lot of very useful information. Looking at the graph called “Salehard + all 12 treering records”, I see no correlation at all! That seems to shoot a very large hole in the RC claim that “…their relationship to local temperature are waaaaaaaaaaaaaaaaaay too strong …. to be explained by chance.”

I do, however, like to try and see both sides of the debate. RC is horrible in the way they block contrarian comments, so I have very little success in posting questions over there. However, it appears we do now have a longer answer to the question I asked above in the same thread I linked above.

Comment #125, between multiple sneers, “This is so ludicrous that it hardly deserves mention…you have to close your eyes to the mountains of evidence at these levels of analysis to make those arguments…”, etc blah blah..

We do get some more actual points of argument, “Even with extremely high levels of auto-correlation in each such resulting series (much higher than typically observed), what is the chance of getting this robust mean to correlate at say p = .05 or .10 over a typical 100 year instrumental calibration period, with the climate variable of interest?”

I am no statistician, so I do not know what p=.05 or .10 means. So I bring two more questions to this blog (if someone would kindly take the time to respond):

Is p=.05-.10 a remarkable correlation which could not be explained by random coincidence?

Are there indeed many examples of tree ring correlations to local temps in the above range?

It’s difficult to find the statistical argument supporting the “waaaay” statement (and I looked through the RealClimate post you linked). If one makes a selection to find a correlated variable, then bragging about correlations is pretty meaningless.

I’m afraid I stopped posting at Realclimate since there was too much abuse. Incidentally, when I post about things here which are unpopular to the CA crowd (such as data retention for certain experiments) I tend to get debate.

However, for anyone who wishes to do so, a good question to ask the Realclimate folk is how much collaboration they have with mainstream statisticians. We (collider particle physicists) regularly have workshops and conferences with statisticians to try to keep abreast of the latest developments and to make sure we’re not making a pig’s ear of our data analysis. When was the last such workshop for climatologists?

“If one makes a selection to find a correlated variable then bragging about correlations is pretty meaningless.”

I think I understand this better now.

I just looked up p value on wikipedia. It’s the probability of obtaining a test statistic at least as extreme as the one that was actually observed. So as I understand it, if one took 100 samples and plotted them against local temps, they should expect to find about 10 of them meeting a p value of .1, or 5 meeting a p value of .05, etc. (please correct me if I have this wrong.)

So I think the “waaaaaaaaaaaaaay” argument is only valid if the number of tests passing the target exceeds what the p value alone would predict, meaning they have actually beaten the odds. (Again, correct me someone if I have this wrong.)

I just tried posting a revised question on CA @11:22, we’ll see if it gets blocked / answered.

The p value is the probability, under the null hypothesis and the sampling procedure, of arriving at results at least as extreme as those observed. The choice of experimental design, sampling procedure, and null hypothesis is very important so as not to arrive at an apparent statistical rejection of the null hypothesis that lacks meaningful scientific value. The convention of using null hypotheses and p values is already quite imperfect; when the procedure includes a step that selects samples based on the dependent data, the impact on the probabilities is ill-defined.
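The selection-on-the-dependent-data point can be illustrated with a small Monte Carlo sketch. This is my own illustration, not any team's actual procedure: screen pure-noise "proxies" against a random "temperature" series and count how many clear a nominal p = 0.05 correlation threshold purely by chance.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series = 100, 2000

temperature = rng.normal(size=n_years)                # stand-in instrumental record
noise_proxies = rng.normal(size=(n_series, n_years))  # pure-noise "proxies"

# Two-sided 5% critical |r| for n=100 (from the t distribution, df=98): ~0.197
R_CRIT = 0.197

r = np.array([np.corrcoef(p, temperature)[0, 1] for p in noise_proxies])
passed = np.abs(r) > R_CRIT
print(passed.mean())  # close to 0.05: screening "finds" ~5% correlations in pure noise
```

The retained 5% look individually "significant" even though nothing real is there, which is why a screened network needs the screening step itself to be accounted for in any subsequent significance claim. (Autocorrelation in real proxies, mentioned in the RC comment, inflates this false-pass rate further.)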

I can sympathize with both you and the people who collected it. They obviously want to get as much out of the sample as possible. Measuring just about everything, 2H, 18O, CO2, Ar, SO4, NOX, dust, Ca, Na, K and so on will take a long time if you do them on individual slices; but one can have all the parameters together. The problem with most cores is that you cannot plot deuterium and CO2 against each other. The sampling rates of different isotopes, trace chemicals and dust in the published cores are all over the place.
It would be nice to do low resolution 2H/18O measurements along the length and then do very high resolution when there are rapid heating/cooling periods.
If you analyze samples for everything you can think of, but with very high sampling where changes occur, you have the best of all worlds. I don’t think two decades to do this is all that long. They probably gave Jones, and you on request, their low-sample-rate series, which constitutes a rough draft.

In most scientific fields, once you get your data, it’s Katy bar the door – a race to publish. In climate science, publish or perish must not hold. I hope there were no graduate students involved in doing those cores. They’d be flipping burgers by now for lack of publications to show for their efforts.

Doing things right is very, very important. Issuing a rough draft, which has been done, and releasing updated drafts to people to use in bundled studies is fine.
In the background you work to get the highest resolution in all dimensions that you can.
It is better to publish slowly, and correctly, than to shotgun.

DocM: I doubt the work schedule accompanying the grant proposal for Law Dome proposed a two-decade time period for completing and archiving all of the basic analyses. (However, additional studies involving high-resolution analysis of periods of rapid change, new analytical methodology, or new questions that might be addressed using the cores certainly could take this long.) The policymakers paying for these studies and making decisions at Kyoto and Copenhagen are being told they can’t wait decades before taking action.

If climate science were a highly competitive area where a low percentage of grant proposals get funding, things would be different. Then, to paraphrase your words: “Getting results that have an impact would be very, very, important”. An “impact” would be definitive conclusions about whether the MWP, the LIA, and the CWP penetrated south to Antarctica. If various proxies disagree (as seems likely), which ones should we believe and why? Other researchers need access to all of the Law Dome data so that they can test their hypotheses about why proxies may disagree.

It really doesn’t take too much to figure out why the series hasn’t been published.

It would be scientific suicide for any scientist who did publish the result. They would be vilified and at risk of ruining their career.

There would be a hue and cry against them. No matter how careful and exacting their work, it would be labelled rubbish. Even if they could get it published, dozens of counter arguments would be widely published before the ink was dry.

So, when faced with the reality, the result are tucked away in the closet and the politically correct answer is submitted instead.

We like to think that we live in an enlightened age. However, human beings are no different now than in the time when people feared for their lives if they published any hint that the earth was not the center of the universe.

History repeats itself for good reason. Our passions and motivations are largely unchanged by the passage of centuries, thus our actions and the results are largely unchanged.

History never repeats itself, not ever, except when the ‘beholder’ deliberately seeks readings of the data that suit his or her own interests in the present. These readings tend to tell us more about the beholder’s perspective than the phenomenon beheld (ring any bells?). So for instance, when you look at primary documentary evidence (or historical accounts well grounded in it), you find that the early modern martyrs for scientific thought mostly turn out to have been persecuted for theological positions that most modern people would be surprised to find they held at all, not their views on the disposition of the solar system. If you want lessons from history, look for them in the ways in which we choose to make or accept perceived patterns on the basis of whatever orthodoxies or heterodoxies to which we choose to yoke ourselves. /That’s/ a more reliable constant.

It’s the other way around from the oceans – for ice/snow trapped on land, cold periods have lighter oxygen and warmer periods heavier. The Law Dome data plotted by Steve suggests cooling over the past 1000 years – an inconvenient data-set to be sure if you are looking for a warmer 20th century.

I too would like to understand the justification/reasoning for screening this out. If it has the highest relative resolution and appears to be an excellent proxy then I cannot at this stage understand the decision.

When I look at the C20th warming trends of the 27 proxies, they are all over the place.
My calculations might be rough, but for instance:-
1) Vostok – Same period, same study, two methods. One over 5 times greater than the other.
2) Rarotonga. Two studies that cannot be more than 15km apart. One shows twice the C20th warming rate of the other.
3) Northern Papua New Guinea. Two studies about 150km apart. One shows 2.5 times the C20th warming rate of the other.

I might not be a climate scientist or a statistician, but 27 proxies over nearly 10% of the globe, with massive variations in similar areas would suggest that the screening was somewhat inadequate on such a tiny sample size.

I saw your comment to Andy Revkin on his Karoly-Gergis posting. Did you notice that the item, and the quotes from Retraction Watch, were subtly wrong about the events? They seem to be predicated on Climate Audit and Jean S. having revealed the problem. Karoly carefully denies that. However, when he is quoted in the Revkin piece,
“This is a normal part of science. The testing of scientific studies through independent analysis of data and methods strengthens the conclusions. In this study, an issue has been identified and the results are being re-checked,”
the “this” referred to is open review. No. He is claiming that Karoly-Gergis et al discovered the problem.

Roun’ these-here parts, that isn’t given much credence, but that is the claim.

This paper had been the whole peer-review route, was ready for publication, and even more significantly: “The study published today in the Journal of Climate will form the Australasian region’s contribution to the 5th IPCC climate change assessment report chapter on past climate.” Every Team and IPCC hurdle had been jumped.

Karoly is holding onto a hand-grenade without the pin. What he has claimed – even in his own dubious version – is that it is a “normal part of science” for the journal peer review process, and the IPCC peer review process, to get it wrong. Think about it.

This is just a technical comment concerning the deuterium excess data shown in the first figure above. It is at annual resolution, has a degree of fine structure and an amplitude of about 2 per mille. What the plot does not show is the measurement error for deuterium excess, which is typically between 0.6 and 1 per mille. I don’t have access to the Delmotte et al (2000) paper with the technical details in it. However, I’m well aware from their later work on EPICA-Dome C in 2004 that the measurement precision for deuterium excess in their lab is on the order of 0.6 to 0.7 per mille. Given this, I wouldn’t put too much faith in the small-amplitude signal shown in the figure.

I’ll check with the lead author, Liz Thomas, about the full data set. However, I’m more than happy to either email you a copy of the full data set, or better still post it on my web page so you can download it.

P. Solar, the H2O2 data were produced as part of the earlier publication by Liz Thomas on accumulation rates at Gomez and you will do well to ask Liz for that data. I didn’t do the H2O2 analyses. However, in case you don’t I’ve already emailed her and asked for a copy so that I can post it alongside the isotope data. I hope to have it posted for you in the next day or so.

Note the data are not monthly values. The annual means are determined by averaging between the winter-winter minima in H2O2. The minima in H2O2 also generally coincide with minima in the oxygen isotope composition of the ice. More recent years have considerably more points in them due to the compaction of samples with age.
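The dating scheme described above — winter H2O2 minima delimiting years, with the isotope values between successive minima averaged to one annual figure — can be sketched as follows. The data here are toy values, not Gomez measurements, and the minimum-finder is deliberately naive:

```python
def local_minima(values):
    """Indices of simple local minima in a depth-ordered series (naive picker)."""
    return [i for i in range(1, len(values) - 1)
            if values[i] < values[i - 1] and values[i] < values[i + 1]]

def annual_means(h2o2, d18O):
    """Average d18O between successive winter H2O2 minima: one value per year."""
    minima = local_minima(h2o2)
    return [sum(d18O[a:b]) / (b - a) for a, b in zip(minima, minima[1:])]

# Toy series: two H2O2 winter minima (indices 1 and 5) delimit one "year".
h2o2 = [5.0, 1.0, 6.0, 7.0, 6.0, 1.0, 5.0]
d18O = [-20, -22, -19, -18, -19, -22, -20]
print(annual_means(h2o2, d18O))  # -> [-19.5]
```

A real implementation would need smoothing and a minimum-separation constraint before picking minima, since noise in the H2O2 record would otherwise split years; the point of the quoted comment is that sample compaction with depth also leaves fewer points per year in the older ice.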

The Gomez H2O2 data are now on the website (url above). The data are in two columns: depth (metres) and H2O2 (ppb). Note the resolution of the H2O2 data is several times greater than that of the oxygen isotopes with some 6,600 H2O2 analyses covering the 134m core length, and in excess of 2000 oxygen isotope analyses.

Thanks very much for such a speedy and efficient handling of this request. I see the related H2O2.csv is now added to the site you referred to.

This certainly makes a refreshing change from the all too frequent situation where requests for the data necessary for verification of a published result are either refused or do not even get the courtesy of a refusal and are quite simply ignored.

This is a world away from “why should I give it to you, you only want to find something wrong with it”, as though such critical verification were not an essential part of scientific validation that ultimately strengthens any meaningful results.

If we could move to a world where this was the norm rather than the exception, our understanding of climate would be streets ahead of where we currently are.

Bruce Bauer at NOAA now has the raw data files for Gomez (isotopes and H2O2) and they should be posted at URL: hurricane.ncdc.noaa.gov/pls/paleox/f?p=519:1:3650996493246150::::P1_STUDY_ID:12543 very soon. These will be supplemented by our delta D and deuterium excess data as soon as we have submitted our paper for publication.

I do remember a study not that long ago (Steig e.a.?, but around 2008, see http://nora.nerc.ac.uk/10657/ ) for the past several hundred years, based on several coastal ice cores of Antarctica. These reflect the ocean temperatures of the nearby Southern Ocean, as the source of the isotopic changes. The interesting point was that the Peninsula reacted in the opposite way to the rest of the Antarctic coastal ice cores, reflecting changes in atmospheric pressure and/or seawater temperature influenced by the Southern Annular Mode (SAM) and ENSO. Something similar to the NAO, which makes Greenland’s temperatures swing opposite to NW Europe’s.

No, it does not! In the textbooks and several studies you will find that the major drivers of 18O fractionation (d18O) are the temperature at condensation and the raining-out, or Rayleigh, processes.

So what is that cloud temperature? Isn’t that the dewpoint, which is primarily a function of absolute humidity? And of course, most of the time there is a correlation between temperature and humidity, but not always, as in the desert. The correlation between humidity/precipitation and isotope ratios has been researched by Michiel Helsen.

But the Rayleigh process may be an even bigger problem: if the prevailing weather patterns change for some reason – and everything was changing between glacial and interglacial periods – the rain-out rate could also change considerably.

“On a personal level, I understand that people can’t do everything in the world. But nonetheless, the deep Law Dome hole was drilled between 1987 and 1992. It provides the highest resolution ice core for the two-millennium period. And remains unpublished. Amazing.”

Whilst I agree with your sentiments – in light of its obvious importance, surely the task could at least have been passed on to someone else? – can we not go a stage further and offer to help? Surely crowdsourcing is actually rather a good way of analysing ice cores?

Thanks for the comment from DocMartyn. Also thanks, Steve, for recognizing in this forum the value of Law Dome (not to mention acknowledging that I am reasonable!).

I just wanted to draw your readers’ attention to a misperception. The Law Dome record is not unpublished, or unarchived. Rather, it is published, and archived – the data illustrated in this post is publicly available (and versions have been so for years). It is progressively improving, and with each stable development in dating and calibration, it gets used and openly archived. As we approach a phase of diminishing refinements, and a converged record with the best dating and calibration, we will of course lead a detailed publication ourselves, but at no stage have we held back on putting out the best snapshots that were available for collaborative use.

Please note that the large task of making the most of a highly resolved core, dating it, not just on the basis of one parameter, but indeed multiple streams of data, sampled many times per year, is a progressive task (over 20 thousand measurements across multiple trace elements, plus the water isotopes, registered in several cores, just to get the last 2000 years). The readers here will appreciate, I’m sure, an effort to ensure maximum quality in dating, measurement and calibration.

I personally want to see the Ar/N2, dust and CO2 levels prior to and through a glacial termination.
The Ar levels should be a proxy for ocean temperature, dust is aerosol cooling and CO2 the signal of biotic productivity and mineralization.
My guess is dust, Ar then CO2.

It may be my stupidity, but I cannot find Law Dome data where I would expect to find it. The website ITASE IceReader has a list of some 200 ice core data sets, of which only about a quarter (various named sites and all the US ITASE sets) have hyperlinks to data sets. These hyperlinks are not available on any of the three listed Law Dome core data sets.

Perhaps you can help? This would seem to be the de facto place to look, certainly it is what Wikipedia appears to recommend.

On the positive side, it is good to see the US (NOAA and NSIDC) taking a clear lead in making data publicly accessible here. I hope this work continues and is emulated by everyone else.


Let me re-iterate a standing warning to readers not to assume that a given proxy is “RIGHT” because they like the result. Readers are typically quick to spot confounding factors in series with big sticks, as I am myself. But it’s not just Stick series that have confounding factors.

In a 1985 article on an earlier core (also showing elevated MWP values), Morgan commented on potential aliasing of Law Dome d18O data by changes in accumulation (and thus elevation.) Slight and feasible increases in elevation could cause the observed secular decline in d18O values.

The present day d18O versus elevation relation for the Law Dome is d18O = -0.0054 E(m) – 15.3 (Morgan, 1980) so a 0.1‰ aliasing, which is just significant in the d18O profile, requires an elevation change of 19 m. If such changes are climatically induced, their effect is mainly to reduce the amplitude of the d18O changes, because an increase in temperature gives increased accumulation which, after some time, produces a higher ice cap surface with correspondingly reduced surface temperature.

An increase in accumulation would, of course, be of interest for potential sea level changes. Even if the Law Dome d18O series is not a thermometer, it is such an important d18O record that it deserves careful and prompt analysis.

The height of the present collar is the result of many, many centuries of competition between accumulation and deflation (sublimation, plastic flow off the Dome, etc). The collar height cannot initially be assumed constant over the claimed 2,000 years of measurement. At one extreme, it could have been at bedrock.
On the added deuterium excess graph, the dip at year 1000 looks like an unconformity might look, so this places stress on other measurements to show or to discount that possibility.
I’m sorry to ask such amateur questions, but we have little ice here and are not well versed in the literature – but logic is global if ice is not. So yes, this record does need careful & prompt analysis, looking for the unexpected as much as the expected.

Sorry, SMc, but the critical point to the concept of unconformity is that there is a very large time lag between the now-existing lower and upper rock suites … sufficiently large to have allowed significant surface erosion of the lower rocks prior to the re-start of sedimentation.

Many, many examples world-wide; one here will suffice: in central-north Siberia, the now-exposed surface rocks of the Jurassic directly overlie dated Cambrian rocks. Almost the entire Palaeozoic was eroded off prior to the start of Jurassic sedimentation. This is an enormous time-gap, and a very obvious unconformity.

Geoff is suggesting that the deuterium excess graph may well show a large time lag at the AD1000 level.

Sorry, having trouble visualising this. Does that mean that the apparent discontinuity in the graph indicates a missing segment that would sit just before the 1000AD mark? And that the apparent high point at, say, 900AD actually represents the value for 100AD?

Couple of comments – ‘geochemist’, not ‘geologist’ is better. ianl8888, there is no time limit to an unconformity or its related type, the disconformity. All that is needed is that accumulation, often sedimentation, ceased for a recognisable term, with or without erosion of the then top surface. The layers below and above can be angular to each other (makes it easier) or simply missing parts of a parallel sequence. I’m probably referring to the latter. It could arise, for example, through melting or sublimation or any form of loss of ice at 1000 years (in the Law Dome case), or by cessation of snowfall for a while. The importance is that stationarity cannot be assumed; it needs to be sought in all proxies.
If you look at an airborne radar profile of a section of ice near Law Dome, you will see some of the headaches involved in interpretation of a core or a few cores from the region. http://www.antarctica.gov.au/about-us/publications/australian-antarctic-magazine/2006-2010/issue-17-2009/seeing-through-deep-ice I see some lines that are not continuous, but specialist interpretation at much better resolution than in this image is needed.
Law Dome is about 1390 m ASL and about 1200m of core were recovered before silt inclusions were found, when the basal temperature was a mere -7 deg C. The possibility of past melting of ice at Law Dome is sensibly 100%, because there is silty bedrock below that once lay exposed to the weather. We have to establish that the present ice column evolved smoothly and not by a series of fits and starts.
BTW, this basal -7 deg C also asks serious questions about borehole temperature inversion methods.
Look, we’d all appreciate a set of proxy methods that remove doubt about past climate. They don’t exist. We have to do the best we can. At this time, we can’t do much. Even the delta O18 results are shaky. You can’t make a mathematical equation that fits all sizes when isotope fractionation happens at different rates in the source area(s) of evaporation, in the precipitation area in the opposite fractionation sense, when the distance separation of source and precipitation areas is uncertain, when the transport gases might be well mixed or poorly mixed at both ends, when different fractionations happen at different deposition altitudes/temperatures …..
Much more scientific study is in progress, but that means that much more careful audit is needed.

This 1997 report has good background on Law Dome ice core drilling history, and in Fig 7 has a long O18 temperature plot going back to the last glacial maximum, where the temperature just plummets. Temperature spike at 1000 AD. Oops. I am developing a lot of respect for the ice guys who are down in the south pole drillin’ ice.

I noted the following on another thread here at CA but I think it bears repeating here. The authors of the paper linked below were not specifically attempting to show any problems with using O18 ratios in coral proxies for temperature, but in the paper it is obvious that the Sr/Ca ratio follows the SST better than the O18 for the coral proxies studied (which were also included in Gergis (2012)). The O18 proxies show more of an upward trend in the modern period. The authors attribute the O18 signal in these proxies as being additive for SST and what they refer to as seawater O18. They also evidently have more faith in the Sr/Ca proxy for measuring the temperature of interest to them. I found this paper quite revealing as to Gergis’s possible motivations for not using or testing Sr/Ca proxies. I plan to look at these Sr/Ca proxies for the Australasia region, as there are a number listed with data in the NOAA repository. The authors show a good high-frequency correlation of the Sr/Ca proxy with SST going back to 1970. That the Sr/Ca proxy gives a lower modern trend in the cases shown in this paper does not demonstrate that it is a better or valid proxy for temperature; it simply shows a difference with O18 proxies. Both proxies could be wrong.

The possible problem with O18 in coral proxies would not translate to ice cores, but ice cores are an accumulation of precipitated snow that was evaporated from source waters at locations that are, I think, not entirely understood.

Kenneth Fritsch (Jun 13, 2012 at 9:15 AM)
I don’t think it’s fair to state that Gergis et al. did not examine the Australasian Sr/Ca proxies. There are five such in the Neukom & Gergis 2011 listing. While we don’t know which ones were considered (that is, which of the 62), it seems more probable that the Sr/Ca proxies were considered and screened out.

HaroldW, you may be correct in your surmise but I am currently in the process of looking at these proxies used in Gergis and not used. I’ll have a better picture of what was or was not done by Gergis then.

Kenneth –
I’d be interested in your analysis. My quick check of the N&G inventory showed 30 other proxies likely to have been considered (not counting those which passed screening), taking into consideration that Gergis’s use of Palmyra, Cook Islands, and Antarctica indicates that the selection was not confined to the 0-50S, 110-180E region. There are 4 others which were further afield. Which still leaves me one short, as there are 35 rejected proxies. Very strange that the rejected proxies weren’t even listed in the SI.

Steve: The proxy box is (90E–140W, 10N–80S), containing 62 monthly–annually resolved climate proxies from approximately 50 sites. I made a list of N+G candidates at http://www.climateaudit.info/data/gergis/gergis_2012_prescreen.csv but only had 57. The list was semi-manual and I may have missed some that you spotted. Instead of providing this sort of information in a digital format, they provide it as pdf, and sometimes as photo-pdf. Very annoying.

“Very strange that the rejected proxies weren’t even listed in the SI.”

I found a relatively large number of proxies from Australasia are in the NOAA repository of temperature proxies. I cannot see how Gergis was able to obtain SONDJF data for all the coral proxies, so I have been using annual coral proxy data for both the Gergis and non-Gergis coral proxies for comparison. The GHCN SONDJF and annual temperatures correlate very well for the Australasia region. The SONDJF and annual proxy series in Gergis have reasonably good correlations, and the trends in the instrumental period match well, with the annual trend always being larger.

For the sake of the authors of Gergis, what I hope I do not find is non-detrended correlations between the non-Gergis proxy series and instrumental temperature that pass the p ≤ 0.05 selection test. I will be looking at four non-Gergis proxies during my next gardening break.

I have not seen an answer to the following question: is there enough carbon dioxide in the ice core layers, and does it diffuse slowly enough through the ice, to make it possible to use either C14/C12 or C13/C12 isotope ratios for core age calibration? I suspect there is not enough C14 to do radiodating, but perhaps enough C13 for MS analysis?

CO2 migrates from higher to lower levels, not the reverse, as far as it migrates at all, which is extremely slow in the Siple Dome and other “warm” cores (including Law Dome) and completely unmeasurable in the “cold” cores like Vostok and Dome C. But all ice cores show the same low CO2 levels for the same periods of time. And lucky for land plants, the average CO2 level over land is higher than in the bulk of the atmosphere, at least during the first hours of sunlight during the day…

For recent years, the ice age can be established by simply counting the layers: the snow accumulation at Law Dome is extremely high, 1.2 m ice equivalent per year for the two cores near the summit and 0.6 m ice equivalent for the third core, which is taken more downslope (but goes further back in time). When the layers become smaller with depth, H2O2 levels, conductivity and other techniques can be used. The problem is more in the gas age, as that requires modelling how fast air is exchanged between the atmosphere and the depth of the firn, with decreasing pore size, until closing depth. The latter depends on temperature and snow accumulation, which are interdependent.
But several techniques can help, including (bomb) 14CO2, 13CO2, CFCs, CH4, 15N, 40Ar and other isotopes of gases, partly of recent human origin.
For a background see:

In deep time, O18 values are a real success story: they clearly show changes from the LGM to the Holocene that cohere with glacial moraines.

Don’t get too excited about the precision of this correlation, since part of the Vostok dating works by identifying a wiggle in the d18O curve with a known glacial event. I have no reason to think this isn’t valid, but still it doesn’t provide independent evidence of the ability of ice cores to cohere with glacial events.

Steve: there are dozens of O18 series that clearly distinguish the LGM from the Holocene.

“Steve: there are dozens of O18 series that clearly distinguish the LGM from the Holocene.”

Finding a relatively huge temperature swing related to a glacial event is not the same as accurately finding a degree or less in a temperature series. O18 is a much more straightforward and physically understood potential thermometer than other proxies like TRW. I think with ice cores a potential problem would be in knowing the source of the precipitated snow (and its O18 ratio) and that the source on average did not change over time.

The deuterium excess is a second order isotopic parameter (d = dD 8*d18O)

I think there must have been a minus sign that got lost somewhere:
d = dD – 8*d18O would make more sense, since according to Eric Steig on an earlier thread, dD is ordinarily about 8 (or 8.5) times d18O, plus a small constant. Any local deviation from this rule then might be of some significance.
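With the minus sign restored, the definition is a one-liner; a minimal sketch (the illustrative numbers below are mine, not from any Law Dome record):

```python
def deuterium_excess(dD, d18O):
    # Conventional (Dansgaard) definition, with the minus sign restored:
    #   d = dD - 8 * d18O
    return dD - 8.0 * d18O

# Illustrative values only: a sample lying exactly on the global
# meteoric water line dD = 8 * d18O + 10 has an excess of 10 permil,
# so any local deviation from 10 reflects a departure from that rule.
d18O = -22.0               # permil (hypothetical)
dD = 8.0 * d18O + 10.0     # -166.0 permil, on the meteoric water line
print(deuterium_excess(dD, d18O))  # 10.0
```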

Argh, Steve, there is an expression that an engineer can do for a penny what any fool can do for a pound.
You have made this expression yours, albeit slightly modified:
McIntyre can do for a dollar what any climate scientist can do for a million dollars, but he also gets it right. :-)

Steve McIntyre
Posted Jun 13, 2012 at 6:20 AM | Permalink
Fred, I’ve spent quite a bit of time looking at data from Palmyra. It is the location of SB17, the “most influential coral” in the world. SB17 is the primary evidence for the “cool medieval Pacific”.
============
Palmyra is probably not well suited for such studies. It does not have typical Pacific weather.

Palmyra lies very close to the equatorial counter current, which I can attest from personal experience is COLD when you sail over it. Interestingly, we caught a baby blue marlin as we sailed over the counter current. One of the mysteries in science is where baby blue marlins mature.

Also, most of the time Palmyra lies under the ITCZ with lots of cloud and rain. The ITCZ regularly cycles north and south of the island, which may explain why it is uninhabited.

There is no shortage of freshwater, so that cannot be the reason it is uninhabited. Corals do not like fresh water, which in itself might skew the results, though the island does have a healthy reef. It is heavily shark-infested and the sharks are surprisingly aggressive.

It can be very hard to locate the island without GPS because of limited visibility; it is very often hidden within a cloud. Thus it is a perfect candidate for “Gilligan’s Island”, lying 1000 miles south of Hawaii.

We have had boats trying for days to locate the island, all the while within 15 miles (VHF radio limits). Other boats have simply disappeared en route, never to be heard from again. Kingman Reef is 36 miles from Palmyra, which was always a worry before GPS.

… It is entirely under water at high tide, and but a few coral heads project here and there above the surface at low water.

After 140 years of sea level rise:

There are two small strips of dry land composed of coral rubble and giant clamshells on the eastern rim, with areas of 2 acres (8,000 m2) and 1 acre (4,000 m2), having a coastline of 3 kilometres (2 mi). The highest point on the reef is less than 5 feet (1.5 m) above sea level and is wetted or awash most of the time, making Kingman Reef a maritime hazard.

Nice to see the mention of Morgan and van Ommen 1997. I’m not sure how many people read it all the way to end, however, so I thought I’d help out.

“Mapped in this way, the complex temporal nature of the late Holocene fluctuations becomes more apparent. The smoothing emphasizes the multidecadal and longer timescale fluctuations, but significant structure is also seen at shorter timescales if less smoothing is used, albeit in a much noisier record. This complexity of climate anomalies is also seen in other records (Bradley and Jones 1993) and here the seasonal dependence provides a clear indication of why different proxies can tell different stories. These data indicate that for much of this century, there has been significant warming at the DSS site, predominantly during the winter and into the spring seasons. In the average record, however, this effect is considerably offset by the summer cooling in the early part of this century.”

So even though the “red curve” (for lack of a better name) O18 isotope data plot indicates cool conditions at the end of the record, i.e., the past century or so, the seasonal analysis described in this paper indicates, quoting the above, “significant warming at the DSS site”. So the impression received from a first-look examination of that vital figure appears not to match the interpretation provided by a detailed in-depth analysis. Perhaps Tas van Ommen would care to comment — or perhaps we shall await further publications, which would be most welcome to elucidate these points of interest.

The team has 600 dice and rolls them all. Around 100 of the dice come up sixes, in accordance with natural probability.

The team then anoints these 100 dice which rolled sixes as having special powers to reveal the past. They justify this by citing their high-level knowledge of the correlation between dice rolled in the 20th century and the number 6.

So these special dice, now deemed special, are rolled hundreds of additional times, representing past rolls, with the results plotted on graphs. Since these dice were, in reality, just behaving in accordance with natural probability, the additional rolls are just random numbers, averaging about 3.5.

The Team plots a graph, showing the random averages of around 3.5 as historical values, but all the 6′s rolled in the first step are used to show 20th century warming. To further strengthen their case, unrelated actual data from the 20th century is grafted on at the end.

In some cases, as time goes by and more years pass, they roll the dice some more to check whether they still get sixes. They don’t, of course, in accordance with natural probability. These results are called the unexplained divergence problem and are hidden from publications (since they are unexplained).
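The analogy is easy to simulate; a minimal Monte Carlo sketch in Python (the 600 dice and the one-in-six selection are the commenter’s; the number of “past” rolls per die is an arbitrary choice of mine):

```python
import random

random.seed(0)  # reproducibility

N_DICE = 600
N_PAST = 50  # hypothetical number of "past" rolls per selected die

# Step 1: the "calibration" roll. Keep only the dice that come up six.
calibration = [random.randint(1, 6) for _ in range(N_DICE)]
selected = [i for i, v in enumerate(calibration) if v == 6]
print(len(selected))  # roughly 100 = 600 / 6

# Step 2: roll only the selected dice again to "reconstruct the past".
# Passing the screen says nothing about fresh rolls, so the "past"
# averages ~3.5, while the calibration score is 6 by construction.
past_means = [
    sum(random.randint(1, 6) for _ in range(N_PAST)) / N_PAST
    for _ in selected
]
grand_mean = sum(past_means) / len(past_means)
print(round(grand_mean, 2))  # close to 3.5, not 6
```

The jump from a ~3.5 “historical” level to a calibration-period score of 6 is pure selection artefact, which is the point of the analogy.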

The dice analogy brings up a point that consistently bothers me when I see the word “significant” used in conjunction with arbitrary levels like 0.05 or 0.01. I once threw 9 sixes with a single, honest die. That has odds of something like 1 in 10 million of occurring by chance. Nevertheless, such things happen. When a sampling procedure is loaded to select “sensitive” series, hmmm.
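The quoted odds check out for a fair die:

```python
# Probability of nine sixes in nine throws of a fair die:
p = (1 / 6) ** 9
print(round(1 / p))  # 10077696, i.e. about 1 in 10 million
```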

there is no time limit to an unconformity or its related type, the disconformity. All that is needed is that accumulation, often sedimentation, ceased for a recognisable term, with or without erosion of the then top surface

I’ve had this discussion many times. The time lag, or cessation of sedimentation, cannot but leave an erosional surface, as weathering simply does not cease. Also, I did not state that there is a time limit; there is a significant time lag.

Whether the Law Dome records show this is in your camp, and I’m interested in the development of your argument here. It goes to the stability of deuterium accumulation.

@ianl8888 – See Steve’s intro: “The present day d18O versus elevation relation for the Law Dome is d18O = -0.0054 E(m) – 15.3 (Morgan, 1980) so a 0.1‰ aliasing, which is just significant in the d18O profile requires an elevation change of 19 m.” It’s not only deuterium that can be affected by what we might call a “disturbance to the rhythm of annual deposition”; delta O18 can be also. It’s rather a stretch to assume that in this region of high accumulation, the collar altitude of the drill hole stayed within +/- 20 m of the present elevation.
Suppose there was an unconformity. By eyeball match, to stitch the deuterium graph to a smooth appearance, you’d need to insert a gap of a few hundred years before AD 1000. Unlikely; so you don’t do it that way. You look at other components in the ice to see if there are sudden changes in them. Some of this work is in progress, so there’s not much more I can say.

[…] The Law Dome ice core, drilled in Antarctica between 1987 and 1992, is the highest resolution proxy record (Oxygen-18) for paleoclimate. It is so inconvenient that it was “screened out” of the now withdrawn Gergis et al paper which claimed to show rampant Thermageddon occurring in the Australasian temperature record. Apparently such a good record so close to Australia wasn’t as good as a partial one from much further away. […]