I had a quick look at the paper, SI, and the code. What seems to have been done this time is that the proxy network of Mann et al (2008) is processed with a slightly modified version of the screening of Mann et al (2008), and then the reconstruction is done with a slightly modified RegEM CFR of Mann et al (2007)! Now to answer the question that seems to be on everyone’s lips: yes, the Tiljander series are still used as inverted. This can be seen from the positive screening correlation values reported in the file 1209proxynames.xls. In fact, going quickly through the screening code, it seemed to me that they have really “moved on” from the screening employed in Mann et al (2008): only a “two-sided test” is used!

This means that if a proxy has a strong inverted correlation to the (two-pick?) local temperature, it gets picked – no matter what the physical interpretation is! Since RegEM doesn’t care about the sign, the sign truly does not matter to them anymore. Anything goes!
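A toy illustration of why a two-sided screen is sign-blind (my own sketch, not the paper’s code; the 0.162 threshold is just used as an example here): a proxy strongly anti-correlated with local temperature passes exactly as easily as one strongly correlated with it.

```python
# Toy illustration (not the paper's code) of a sign-blind two-sided screen.
import numpy as np

rng = np.random.default_rng(0)
n = 146                                   # e.g. annual values, 1850-1995
temp = rng.standard_normal(n)

proxy_right_way_up = temp + 0.5 * rng.standard_normal(n)
proxy_inverted = -temp + 0.5 * rng.standard_normal(n)   # physically upside down

def passes_two_sided(proxy, target, r_crit=0.162):
    """Keep the proxy if |correlation| exceeds the critical value."""
    return abs(np.corrcoef(proxy, target)[0, 1]) > r_crit

print(passes_two_sided(proxy_right_way_up, temp))  # True
print(passes_two_sided(proxy_inverted, temp))      # True: sign ignored
```

The screen cannot distinguish a physically sensible response from an inverted one; that distinction has to come from outside the statistics.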

I notice that the instrumental series is shown without a 95% error interval. Yet the Hadley series shows about +/- 0.1 degrees and it also shows the temp going down at the end. Where is their instrumental series from? What year does it end?

The CRU CI changes as a function of time, as I recall. Early in the series some portion of the error is allocated
to spatial coverage. A couple of questions for Jean S and RomanM.

1. Does anyone recall how Jones et al calculated the error due to spatial coverage? (OK, maybe I should reread the paper.)

2. Does anyone believe a proxy CI of plus or minus 0.5C? Or is this just prima facie stupid? I mean, seriously, a tree ring
estimating temperature down to 0.5C? Especially when the correlations with the instrumental record are low and
temp explains at best 40-50% of the variance?

3. How many of you read the mail from CRU that threw the BS flag on Mann when he had a CI that was narrower in the proxy period than the thermometers had in the instrumental period?

This reminded me about the skewed review process of Juckes et al … it didn’t feel that strange at the time (as the journal was a minor one) as we had already seen what happened with the major ones (Nature, Science). But I think in the light of CRU letters it would be an interesting story to tell to now widely expanded audience. Has anyone posted a short review of the incident, or is someone willing to do that? Willis?

Also this reminded me about Eduardo Zorita, who earned my 110% respect around those times at CA. Not only was he very straightforward and honest with his comments, but he also had the courage to state his opinions here (it would be interesting if he could share the “feedback” he received from the Team because of that). In my opinion he is a real SCIENTIST with capital letters, and the climate science community has a lot to learn from people like him. I remind you of the fact that he withdrew his name from the final Juckes et al paper, a small thing that tells a lot about his character. All the best to you, Eduardo, if you are reading this!

Would it be possible for all CA authors to move to camirror? Now everything seems a bit chaotic, with the slow CA still receiving new contributions and Steve in fact writing at another blog at the same time.

[Jean S: I don’t have rights to do posts in the mirror, and I wanted to get this out before North-America wakes up 😉 It’s perfectly fine with me if Steve wants to move this to the mirror or even delete it.]

I have discovered a novel proxy for your long-term Temperature Anomaly reconstruction efforts. It is pictured here, 1801-2009. These data faithfully reflect the oft-noted 1929 cooling episode. With effort, the series can be extended back to Medieval times.

While RegEM is too subtle to be subject to simple interpretation (its output sometimes appears bizarre to the unsophisticated), visual inspection strongly suggests that my nominee will pass the 1850-1995 screening.

In coming weeks, I shall be submitting similarly insightful proxies for inclusion into your model. To the trained eye, they are surprisingly abundant.

Re: Bishop Hill (#9),
I don’t see anything there related to correcting the orientation of the Tiljander series. They only added a plot in which they didn’t use the Tiljander series at all. The actual correction is about “splicing” different steps together in a CPS reconstruction, and presumably they used the Tiljander series wrongly in those updated versions.

More to the point, if Tiljander is used, doesn’t that mean they didn’t truncate before the artificial downtick? I thought that truncating the proxy would remove it from the calibration period otherwise.

Not only that, but the large 20th century “cooling trend” (or “warming trend” as Mann would call it) was postulated by the original authors to be an effect of road building etc. affecting lake sediments, and nothing to do with temperature…
I’d love to hear Prof. Korhola’s response to finding out the proxy has been used incorrectly *AGAIN*!

We have now had decades of trees being used as temperature proxies. I assume, since these are world-class scientists, that they have ongoing experiments to measure the effects of the environment on trees. You know, plant a bunch of trees and measure everything you can: temperature, precipitation, humidity, sunlight, soil composition, etc. Do the same with existing trees of various ages and you might have a clue as to how to extract temperature signals, or even whether it is possible. Long-term experiments such as this are not uncommon in science; look at physicists and the time it has taken to fund and build the LHC.

Do climate scientists perform any experiments (other than computer models)?

Given that Mann says the “upness” of the Tiljander series doesn’t make any difference to the end result, I don’t suppose it will matter here either. My only question is why, if it doesn’t make a difference to the end result regardless of orientation, it’s included at all!

Can we assume for once that Mann isn’t acting totally out of bad faith, and that if he says he thinks the quantities in his model are scalar rather than vector, then he might be right? Instead of saying ‘that’s not logical’, would anyone like to explain why it should matter?

Re: Dave (#17), Dave, Mann is not saying that the Tiljander series is a vector property. What he is doing is taking a scalar, multiplying it by -1, and saying this does not matter. The scalar is sediment X-ray density. The series was drilled, collected and analyzed by a group of scientists who interpreted the data in a particular way, having full regard for the site and its local hydrology, geomorphology, etc. Along comes Mann and flips this interpretation upside down by multiplying by -1. Thus a cooling trend becomes a warming trend. Neat ‘trick’, but totally erroneous.

Are you writing to Science to point out the error that has not been noticed by nine authors and got through the peer-review process but was spotted by an independent scientist within minutes of publication?

Re: David Jay (#24),
In their next paper they will show the 2000-2009 instrumental data, but they will use a GCM run grafted onto the end in thick red to “hide the decline”. So clever. So many tricks. None of them foolish or incorrect.

I do not understand your final statement: “I’m speechless.”
If the Hockey Team believes that it is perfectly acceptable to adjust raw data up & down to hide certain warm periods and to exaggerate others, then how is inverting raw data any different? A plus here, a minus there, upside down here, truncated there, omit this, cherry pick that — Isn’t that what data manipulators do?

You seem to use vector/scalar the opposite way to how I thought they were: a vector is a magnitude with direction, but a scalar solely has magnitude.

As such, it doesn’t matter whether a scalar quantity is multiplied by +1 or -1 because the answer is the same – or rather, +/-1 is a vector quantity, but if you multiply it by a (directionless) scalar, the sign becomes irrelevant.

To give an example:

Professor Smith hypothesises that the number of goals scored in football matches is directly related to the location of the ground at which the match is played and more northern grounds have more goals. He collects data on the number of goals scored, and relates it to the distance north (+ve) or south (-ve) from Birmingham – a vector quantity.

Doctor Jones then comes along with an alternate theory: that the number of goals simply increases with the difference in longitude from Birmingham. He can legitimately take Prof Smith’s vector data and discard the +ve/-ve.

Re: Dave (#26), Dave I like your example but it is irrelevant in this discussion.

We have a hypothesis that x-ray density is linearly proportional to temperature. Both scalar properties. The constant of proportionality between these two is either positive or negative but cannot be both! Mann has inverted the sign of the constant of proportionality between x-ray density and temperature. i.e. he has changed the nature of the physical relationship between both.

Statistically one might argue that the correlation coefficient between the two hasn’t changed but the physical meaning certainly has.
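That last point can be made concrete with a minimal numerical sketch (my own illustration, with made-up numbers): multiplying the series by -1 leaves the correlation magnitude untouched while reversing the sign of the fitted constant of proportionality.

```python
# Sketch: sign-flipping a series preserves |r| but reverses the fitted slope.
import numpy as np

rng = np.random.default_rng(1)
temp = rng.standard_normal(200)
xrd = -2.0 * temp + rng.standard_normal(200)   # density falls as temperature rises

def slope_and_r(x, y):
    slope = np.polyfit(x, y, 1)[0]             # fitted constant of proportionality
    r = np.corrcoef(x, y)[0, 1]
    return slope, r

s_orig, r_orig = slope_and_r(xrd, temp)
s_flip, r_flip = slope_and_r(-xrd, temp)       # series multiplied by -1

print(np.isclose(abs(r_orig), abs(r_flip)))    # True: |r| unchanged
print(s_orig * s_flip < 0)                     # True: physical relationship reversed
```

The screening statistic is identical for both orientations; only the physical meaning changes.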

Re: Dave (#26),
A scientist knowledgeable about lake sediments (Tiljander) collects some data, and interpreting the data states that at that date there, this data says it was warmer at that date. Along comes another person not knowledgeable about that particular physical location but with some statistical routines(Mann) and uses the data in an aggregation of information. The effect of including the data set in question should be to make the overall aggregate warmer at that date. Using the Mann math process inclusion of the Tiljander data makes the overall aggregate cooler at that date. Wrong, right?

Do we know when the paper would have been submitted/accepted in relation to early September, when Kaufman et al realised there was an error? I can’t see the dates at the website. Some of the SI files have recent creation dates.

He took the proxy data that correlated to temp in a local gridcell (threw out the rest) and RegEM’d the whole thing on the grid.

This is an absolute mess. How much worse can it get? I mean after throwing out the inconvenient data as before he’s again able to make a hockey stick as before. So I’m pissed off by this horsecrap that passes for science — as before.

Re: Dean P (#32), DeanP, that paper was corrected. Michael Mann is using a different algorithm. He either thinks that his algorithm doesn’t do what is claimed, or he has a different interpretation from Tiljander, or something else.

After the “liberation” of information and e-mails from CRU over the past 10 days or so is it safe to consign the “hockey stick” arguments for AGW into the category of scientific fads? Proponents may yet seize on a future TBD argument in an attempt to salvage their reputations but I cannot imagine this line of argument carries any weight amongst those not previously committed to the outcome. Does anyone disagree and if so, why?

I recently returned from a trip to Ruritania. While there, I met with the Abbott of the Jarvykortta Monastery, who told me that the monks have kept a continuous record of the ice breakup date of the adjacent Jarvykortta River since before 400 AD.

It struck me that this could be a useful proxy for the Northern Hemisphere Temperature Anomaly, so I asked if I might have access to these records, and any other information the monastery might have. The Abbott graciously agreed.

The date of the Jarvykortta River breakup has been as early as February 17 (in 1229 and 1234) and as late as June 20 (in 1967). For graphing purposes, I have taken the 11-year rolling average of this series, much as Tiljander et al (2003) did for their varve X-Ray Density data from Lake Korttajarvi, Finland.
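For reference, an 11-year centered rolling mean of the sort described is a one-liner (a sketch with synthetic data; the monastery series is, of course, fictional):

```python
# Sketch: 11-year centered rolling mean of an annual series (synthetic data).
import numpy as np

years = np.arange(1801, 2010)
rng = np.random.default_rng(5)
# Synthetic stand-in for an annual ice-out day-of-year series
ice_out_doy = 100 + 10 * np.sin((years - 1801) / 30.0) + 5 * rng.standard_normal(years.size)

def rolling_mean(x, window=11):
    """Centered moving average; drops (window - 1) edge values."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

smoothed = rolling_mean(ice_out_doy)
print(smoothed.size == ice_out_doy.size - 10)  # True
```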

I should stress that it was the Finnish Lake Korttajarvi XRD series (Column C in the Excel file) and not this newly-discovered Ruritanian Jarvykortta River series (Column D) that was used by Mann et al (PNAS, 2008) and now, apparently, by Mann et al (Science, 2009).

As with many proxies, there are some potential complications. The Abbott related the following three points to me:

1. There is a natural hot spring that empties into the Jarvykortta River, about 1 kilometer upstream of the Monastery. Its flow appears relatively constant, year-to-year. This addition of hot water would make the ice on the river break up earlier than would otherwise be the case.

2. Around 1720, a few local farmers began piping some of the hot spring’s output to their homes. As the population of the area grew in the 18th and 19th Centuries, this practice became more widespread. This likely led to increasing delays in the timing of the Spring breakup of the river ice by the monastery.

3. For much of the 20th Century, the nearby town maintained a skating rink for the winter and spring, just upstream of the monastery. Cooling coils were placed in the river to keep the rink ice solid, well into the spring. In the late 1920s/early 1930s, and again in 1967, hockey playoffs went into late May.

Also at BitBucket, I have placed a chart that displays this Jarvykortta River proxy (green line) and the Mann et al (2008) Northern Hemisphere CPS Temperature Anomaly (red line). This JPEG file can be downloaded as Jarvykortta-Ice-Out-Proxy.jpg

It is difficult to determine how the Ice-Out signal should be related to regional warming or cooling. The Abbott suggested that earlier thaws might generally correlate with milder winters, while late thaws could be due to harsh winters–aside from human effects.

However, a computer program that interpreted rising numbers from 1850 to the present as a signal of regional warming would orient the Ice-Out signal such that earlier ice breakups are correlated with colder temperatures.

Since reliance on complex algorithms is a hallmark of sound experimental practices, I have oriented the Ice-Out proxy in the latter fashion.

The pattern that results is certainly interesting.

I hope this contribution of a novel data series proves helpful in interpreting the Lake Korttajarvi X-Ray Density varve record that was assembled by Tiljander and her colleagues.

What would be annoying even without the sloppy execution is that this was published in “Science.” “Science” is considered a top-notch journal which has its pick of papers. Supposedly papers are selected for their wide scientific interest and novelty. How novel is another Mann temperature-proxy retread? I suspect that publishing everything Mann tosses together is their way of saying the scientific establishment won’t be pushed around by a couple of uppity Canadian amateurs.

Regarding the “upside down man”, as Nick’s plot shows, when flipped, the Korttajarvi series
has little impact on the overall reconstructions. Also, the series was not included in the
calibration. Nonetheless, it’s unfortunate that I flipped the Korttajarvi data. We used the
density data as the temperature proxy, as recommended to me by Antii Ojala (co-author of
the original work). It’s weakly inversely related to organic matter content. I should have
used the inverse of density as the temperature proxy. I probably got confused by the fact
that the 20th century shows very high density values and I inadvertently equated that
directly with temperature.

This is new territory for me, but not acknowledging an error might come back to bite us. I
suggest that we nip it in the bud and write a brief update showing the corrected composite
(Nick’s graph) and post it to RealClimate. Do you all agree?

In reply, Jonathan Overpeck admonishes, “D et al – Please write all emails as though they will be made public.”

Looking at the leaked e-mails (search for “upside down”), apparently Darrell Kaufman (not Daniel, sorry) is a co-author of a paper that uses the inverted series, and is the particular author who inverted the series. Oddly, Mann is not on the distro list but Briffa is. The flipped series is called the Korttajarvi series and the subject of the e-mail is “Arctic2k update?” A plot with the inverted series was generated by Nick McKay. Antti Ojala is “co-author of the original work” and the data is based on Ojala and Tiljander’s work: “I’m also thinking that I should write to Ojala and Tiljander directly to apologize for inadvertently reversing their data.”

Nick McKay:

The Korttajarvi record was oriented in the reconstruction in the way that McIntyre said.
I took a look at the original reference – the temperature proxy we looked at is x-ray
density, which the author interprets to be inversely related to temperature. We had
higher values as warmer in the reconstruction, so it looks to me like we got it wrong,
unless we decided to reinterpret the record which I don’t remember. Darrell, does this
sound right to you?

Jonathan Overpeck:

Since the recon in Science has an error, I think you do need to publish a correction in Science. In that, you can very briefly not[e] it didn’t affect the calibration, nor the final result. I don’t think you have a choice here.

All pre-packaged for release ahead of Copenhagen. They never cared about truth before, why start now? But that Mann would repeat the same shameless action after he’d been caught before either speaks of desperation or a belief that he and the rest of the warmist cabal are unstoppable.

I have been waiting for you guys to get into the code. Steve and the rest, great work. I have now rescued my daughter’s brain by showing her Ed Begley Jr.’s interview on Fox News! Now if he is an environmentalist, then the AGW lot need to work on their P.R. skills. ( http://video.foxnews.com/12000460/climategate )

It is also interesting to notice that in Monbiot’s column in the UK Guardian newspaper, Ian Plimer’s comments get deleted, deleted, deleted! Monbiot calls for the head of Jones over deleting emails, but it would seem he is okay with deleting anything Plimer says! Democracy and AGW science at work!

Jones says the emails were “hacked” to wreck Copenhagen! Bloody excellent timing as far as I am concerned!

LOL, again. As I recall, Einstein said that insanity was doing the same thing over and over and expecting a different result. I guess he didn’t address the situation where someone does the same thing WRONGLY over and over again and expects a valid result?

Realclimate.org has just put up a data sources page “of data links to sources of temperature and other climate data, codes to process it, model outputs, model codes, reconstructions, paleo-records, the codes involved in reconstructions etc. We have made a start on this on a new Data Sources page, but if anyone has other links that we’ve missed, note them in the comments and we’ll update accordingly. ”

Just a quick note from one of the many lay people who have lurked about CA for quite a time. Thank you… thank you for your time and effort, for your courage and confidence. I always thought the contrast in tone, as well as YOUR straightforward focus on data, was evidence enough to suspect what we all now know to be true. I understand this isn’t really “cheering news” but it’s vindication anyway… you deserve it, Steve. Again, thank you.

% Please note that for the case when the intercept of the model equals zero, the
% definition of R-squared and the F-statistic change mathematically. For a linear
% model containing the y-intercept, R-squared refers to the amount of variance around
% the mean of the dependent variable (y) which is explained by the variance around the
% mean of the independent variables (x). For a linear model NOT containing the
% y-intercept, R-squared measures the amount of variance around ZERO of the dependent
% variable which is explained by the variance around ZERO of the independent variable.
% If the same equation for R-squared is used for both with and without a y-intercept
% (namely R-squared = [Sum of Squares of the Regression] / [ Total sum of the squares]),
% then R-squared may be a NEGATIVE value for some data. For this reason,
% this subroutine will calculate R-squared using the total un-corrected sum of the
% squares. In effect, this approach avoids negative R-squares but may lack any
% meaningful interpretation for the “goodness-of-fit” in the model. It has been
% suggested by some texts that a more useful approach is to always use the case
% where y-intercept is included in the model. However, as with all statistical
% analyses, it is prudent to simply be aware of what your data “looks” like and
% what your statistical tools are actually measuring in order to generate a useful
% analysis.
%
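The point that comment makes can be reproduced in a few lines (my own sketch, not the posted subroutine): fit a no-intercept model to data whose mean is far from zero, and the “centered” R-squared goes negative, while the version based on the uncorrected sum of squares stays between 0 and 1.

```python
# Sketch: centered vs. uncentered R-squared for a through-origin fit.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 50)
y = 5.0 + 0.01 * x + rng.standard_normal(50)    # the intercept dominates

b = (x @ y) / (x @ x)                           # least-squares fit through the origin
resid = y - b * x

ss_res = resid @ resid
ss_tot_mean = ((y - y.mean()) ** 2).sum()       # total SS about the mean
ss_tot_zero = (y ** 2).sum()                    # total SS about zero (uncorrected)

r2_centered = 1 - ss_res / ss_tot_mean          # can be negative for this model
r2_uncentered = 1 - ss_res / ss_tot_zero        # always in [0, 1] for the LS fit

print(r2_centered < 0)                          # True
print(0 <= r2_uncentered <= 1)                  # True
```

Which is exactly the trade-off the comment block describes: the uncorrected version avoids negative values but no longer measures variance explained about the mean.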

Nice that this approach avoids “negative R-squares but may lack any meaningful interpretation for the ‘goodness-of-fit’ in the model.” And “it is prudent to simply be aware of what your data ‘looks’ like and what your statistical tools are actually measuring in order to generate a useful analysis.” Well, we already know what our data looks like, don’t we.


Steve, in this context, can you please tell me what happened with the Darrell Kaufman graph from the sediment cores at Lake Korttajaervi in Finland, which Kaufman turned upside down to make it look like Mann’s hockey stick?

@ Dave: you are in a bit of a tangle with “You seem to use vector/scalar the opposite way to how I thought they were: a vector is a magnitude with direction, but a scalar solely has magnitude.”

First, you are right that a scalar does not have direction, but that does not preclude it having a sign. Consider temperature: you can have plus 80F, and minus 20F, can’t you? What you can’t have is 80F north-west.

Secondly, a vector is not defined merely by having magnitude and direction; a vector has magnitude and direction AND obeys the laws of vector algebra. So, for example, displacements are vectors but finite rotations aren’t. And neither temperatures nor densities are. On these matters there is no consensus, spurious or otherwise – there is genuine 100% agreement!

Anyone else suspicious about the timing of Google’s controversy over the Obama picture? Google is suddenly seen as a bastion of free speech despite criticism about the lack of links to climategate.

Wikileaks also released the 9/11 texts, which provided a great advertisement for their own uncritical view of AGW on their climategate page, just a link click away.

Both internet story bombshells provided a great distraction from the greatest internet scandal of all time. The media went with the Google and Wiki stories the next day, while climategate disappeared.

[Jean S: Please, try to keep this discussion even somehow related to the topic. From now on, I will snip everything I think is not even remotely connected to the topic. I wish other moderators do the same.]

Re: Rob (#58),
I don’t know; I haven’t gone through the code that carefully yet. I would expect so, as the screening procedure, judging from the code, seems to have changed now. On the other hand, there is again a typical Mannian thing: Table S1 is exactly the same as the corresponding one reported for Mann et al (2008), indicating that the screening has not changed (or, remarkably, not a single series was added/removed because of the change of the screening method)…

Re: Rob (#60),
The S1 table is in the SOM text available here: http://www.sciencemag.org/cgi/content/full/326/5957/1256/DC1
SOM data is also here: http://www.meteo.psu.edu/~mann/supplements/MultiproxySpatial09/
Reading the text again, it is the same table! Also some figures are recycled from Mann et al (2008). Amazing! They changed the screening (one-sided tests were disregarded) and the critical values were raised (Mann et al (2008) had 0.106 for one-sided and 0.136 for two-sided; the two-sided value in Mann et al (2009) is 0.162), yet they are implicitly claiming that exactly the same number of proxies get selected! Moreover, the SOM text says (S1 is, of course, Mann et al (2008)):

Separate experiments were performed using a “screened” subset of the full proxy data set in which proxy records were screened for a local temperature signal based on their correlations with co-located instrumental data. These and other details, including sources, of the proxy data are provided in ref. S1 [note: a recent correction was made to the details of the screening as described in ref. S1. Due to an “off-by-one” error in the degrees of freedom employed in the original screening that has been brought to our attention, the critical p values used for screening decadally-resolved proxy data are actually in the range p=0.11-0.12 rather than the nominal p=0.10 critical value cited. This brings the critical p value closer to the effective p value used for annually-resolved proxies (nominal value of p=0.10, but effective value actually closer to p=0.13 owing to the existence of significant serial correlation in many of the annual proxy data). It is worth noting that the precise thresholds used in the screening are subjective and therefore somewhat immaterial—our use of statistical validation exercises provides the best test of the reliability of any data screening exercises.

Immaterial?! Those values are BTW for one-sided tests, which, as said, are not used in this new paper!
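For what it’s worth, the quoted thresholds are consistent with the standard t-test critical correlation, r_crit = t_crit / sqrt(t_crit² + df). This is my own back-of-envelope check, not the paper’s code, and the choice df = 145 (rather than the textbook n − 2 = 144 for 146 annual values over 1850–1995) is my guess at the “off-by-one” mentioned in the SOM:

```python
# Back-of-envelope check of the screening thresholds quoted in the thread.
from scipy.stats import t

def r_crit(df, p, two_sided=True):
    """Critical correlation for a t-test screen at significance level p."""
    q = 1 - p / 2 if two_sided else 1 - p
    tc = t.ppf(q, df)
    return tc / (tc**2 + df) ** 0.5

df = 145
print(round(r_crit(df, 0.10, two_sided=False), 3))  # 0.106 (2008, one-sided)
print(round(r_crit(df, 0.10, two_sided=True), 3))   # 0.136 (2008, two-sided)
print(round(r_crit(df, 0.05, two_sided=True), 3))   # 0.162 (2009, two-sided)
```

So the new 0.162 threshold looks like a two-sided test at p = 0.05, not a raised bar on the same test.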

I take it English isn’t your first language? The proper term is “overlooked”. But it’d be easy to confuse for a non-native English speaker, as “to see” and “to look” are closely related words in terms of meaning. To keep Jean happy I’ll point out that the team tries to oversee journal output while overlooking flaws in the articles.

The use by Mann et al (2008) of the Tiljander varve proxies is opaque to most outsiders at first. “Mann published in a prestige journal, McIntyre complained, Mann responded. Consensus vs. Denialists, RealClimate vs. ClimateAudit, bla bla bla. Who knows and who cares.”

On closer inspection – meaning reading the PNAS paper and Supplement, and pulling the Tiljander et al (2003) PDF – it becomes a simple story. Mann’s algorithm inadvertently flipped two of the Tiljander proxies so that Cold was interpreted as Warm and vice versa.

Whoops. Mann made a mistake, McIntyre was correct.

In most other fields, this would have ended with Mann’s group promptly acknowledging the error, re-doing the affected calculations, and submitting a formal Correction and the revised figures to the journal.

Hasn’t happened.

Instead, a kaleidoscope of explanations, excuses, and Who’s-On-First? routines by AGW-Consensusists was set off by Mann’s Delphic response in PNAS: that McIntyre’s claim of an error was “bizarre.” The spectacle continues to this day.

I’ve read that:

Mann was right–The proxies can’t be upside-down–It’s too complex for you to understand–Mann interpreted the proxies correctly, as did Tiljander and Kaufman–There is no single “right” interpretation–The alleged mistake doesn’t matter anyway–Look over there, a pony!–You have to be super-knowledgable about varves to earn the right to have an opinion–The proxy data are interpreted by computer, so orientation doesn’t matter–Figure S8a proves that Mann was correct–Ah, grasshopper, what is this notion of “correct,” anyway?–Don’t you have anything better to do with your life?–Mann’s use of the proxies is consistent with Tiljander’s interpretation–Mann’s use need not be bound by Tiljander’s interpretation–Your simplistic attempts to negate the power of statistical algorithms don’t fool anyone.

I probably missed a few.

You will have noted that, each year, the date of the breakup of the ice on the Jarvykortta River shows an unusual correspondence with the X-Ray Density greyscale value that Tiljander recorded for the Lake Korttajarvi varves.

The two data series track each other in other ways as well.

Mann used the Lake Korttajarvi varve XRD proxy such that high XRD means high temperatures.

This is exactly equivalent to using the Jarvykortta River Ice-Out proxy such that river ice persisting later into the spring means high temperatures.

This is a position that should be untenable to anyone willing to exercise their common sense.

You’re right, I’m from Germany.
Although I’m no scientist at all, I’ve learned a lot from Steven’s and Alan’s blogs, together with private guidance from some other well-known sceptic scientists, like Dick Lindzen, Zbigniew Jaworowski, and many more.
And all this for more than 4 years now.
My school English was much worse at the beginning, but I also see this as a good exercise 🙂

But let’s not disturb this very interesting discussion here any longer now. Thanks anyway 🙂

Mann, in the release of this chart, states that the Medieval Warm Period is now to be called the Medieval Climate Anomaly because, for a completely inexplicable reason, the southern hemisphere was able to almost completely offset the Northern Hemisphere warming, or something like that. Apparently Climate Anomalies are able to remain active for 400 years running. Is there any way in which the energy imbalance would be prevented from being spread out through the climate?

You report: “Mann, in the release of this chart states that the Medieval Warm Period is now to be called the Medieval Climate Anomaly because for a completely inexplicable reason the southern hemisphere was able to almost completely offset the Northern Hemisphere warming or something like that.”

How are the early Southern Hemisphere temperatures calculated? (Remember, you have to turn them upside down because it’s “Down Under”, see 1060002347.txt in the CRU releases).

Once more, laughable science from Mann and his hockey team. The thing is, the media is either too stupid or too tied into the political agenda that is warming to question the hockey team. The thing that gets me is that the Armagh Observatory has the oldest actual recorded data and shows no significant warming. Yet that little tidbit never seems to be mentioned by the hockey team. “Immaterial” is such a horrible term for the hockey team to use, since politicians the world over are using the garbage they are producing to pass laws for no good reason.

The sheer arrogance of the hockey team is truly astounding. As I said in another thread on the emails, ego is a huge mover for these people. They believe they are correct, can’t be wrong and everyone just shut the hell up because they are smarter than you.

Here are my simplistic questions:
1. Did they publish the weights of the proxies?
2. How much work is it for folks like you to figure them out?
3. Do you have sufficient information to perform the prior calculation if you had the time and inclination?

Re: Ruhroh (#87),
1) They never do. In fact, I think they are not even aware that it is actually possible to calculate the weights when using RegEM.
2) Some work, but not too hard, as I now know how, in principle, to do it. The main point is to get the code to run. Although it seems much easier than the mess released last time, it is nowhere near “turnkey”.
3) Don’t know what you mean by “prior calculation”. No Bayesian priors there.
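On (1), the sense in which the weights are recoverable: once the infilling model is fixed, the reconstruction is a linear function of the proxy matrix, so each proxy’s implicit weight can be read off by probing the fixed map with basis vectors. A schematic sketch with a stand-in linear map (not Mann’s code; a real RegEM run would fix the map first, then probe it):

```python
# Sketch: recovering the implicit proxy weights of a fixed linear reconstruction.
import numpy as np

rng = np.random.default_rng(3)
n_proxies, n_years = 5, 100
B = rng.standard_normal(n_proxies)        # hidden weights of the fitted model

def recon(proxies):
    """Stand-in for a fitted linear CFR step: reconstruction = B @ proxies."""
    return B @ proxies                    # proxies has shape (n_proxies, n_years)

weights = np.empty(n_proxies)
for j in range(n_proxies):
    probe = np.zeros((n_proxies, n_years))
    probe[j, :] = 1.0                     # unit pulse in proxy j only
    weights[j] = recon(probe)[0]

print(np.allclose(weights, B))            # True: weights recovered exactly
```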

Re: David (#88),
if I understood your question correctly, then that needs some time. I guess there is no other way than to get the code to run.

Re: Mark (#90),
I noticed that too, and UC is also aware of it. “Fixed” refers to tests of “fixing” the PC1 such that the “effect of CO2 is removed”, as was done for the AD1000 network (MBH99). The code for doing this seems to be in the \TREE\COMPARE folder. This is part of the MBH9X code, which has never been published. Steve had a guess as to how this was done, but we have not been completely sure. Anyhow, there are at least three important points that one can draw immediately from the directory structure:
1) Mann tried the effect of bristlecones for all steps AD1000-AD1400.
2) What is the reason for not using “fixing” for AD1400 network although it was tested?
3) Why in MBH99 there is only AD1000 step when the NOAMER “PCs” are calculated for the steps AD1100, AD1200, and AD1300?

check this out: in mbh98.tar\TREE\ITRDB\NOAMER, which is in the documents\mbh98-osborn.zip file, there is the infamous BACKTO_1400-CENSORED file AND a bunch of others, e.g. BACKTO_1400-FIXED, BACKTO_1400, BACKTO_1300-CENSORED, etc. etc. I haven’t had a chance to analyze them yet, but I hope others will take a look too!

Re: Mark (#90),
there is other interesting stuff in that file for those of us (Steve, me, UC, …) who have spent a considerable amount of time trying to replicate MBH9X exactly. For example, mike sent Tim the AD1000, AD1400, and AD1600 residuals.

Tim,
Attached are the calibration residual series for experiments based on available networks
back to:
AD 1000
AD 1400
AD 1600
I can’t find the one for the network back to 1820!

Well, mike, you get the AD1820 residuals simply by subtracting the instrumental series from the final reconstruction…
Anyhow, we digitized the AD1000 residuals from a preprint of MBH99, and those helped us to figure out some things about the algorithm. Now these “new” AD1400 and AD1600 residuals may help at least in the following way: I have long suspected that the proxy network available nowadays for MBH98 is not exactly the same as the one originally used. Those residuals should give the final answer on that.

The signal-to-noise ratio on this issue seems low. The terms “upside down” and “data with the axes upside down” are misleading (as other commenters have said); Mann’s comment that changing the sign of a predictor has no influence on the regression is correct. AMac kindly provided me a link to the Tiljander paper

in which she summarizes her findings from the first millennium BC forward. The comments on the 20th century make no reference to climate influences – only human activities. If the temperature signal is buried for the 20th century, then there is no way to calibrate the proxies to the modern temperature record, no matter what the orientation of the series is. The “upside down” is meaningful only in that if someone did erroneously try to regress modern temperatures against this data, the historical predictions would actually be the reverse of the Tiljander conclusions.

I posted this question at realclimate

“Is it now the consensus opinion that the Tiljander series
cannot be meaningfully calibrated to modern temperature records?”

and received this response

[Response: I have no idea. But if it can’t then it clearly can’t be used in the approach Mann and colleagues use. Just as well then that they already tested what difference leaving it out makes (answer? very little). – gavin]

There is also extensive discussion at Stoat

With a pretty sensible post by Connolley.

At this point the sensible thing to do would be for no one to ever regress 20th century temperatures against this data again.

>The terms “upside down” and “data with the axes upside down” are misleading (as other commenters have said), Mann’s comment that changing the sign of a predictor has no influence on the regression is correct.

Yes, the sign of the predictor can be changed to give a positive rather than a negative value to a regression; the absolute value of the correlation will be the same in both cases.
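The algebra is easy to check numerically (toy data, nothing to do with the actual proxies): flipping a predictor’s sign flips the sign of its correlation and regression coefficient, but leaves the absolute correlation and the fitted values unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)

# Correlation flips sign, absolute value is identical.
r = np.corrcoef(x, y)[0, 1]
r_flipped = np.corrcoef(-x, y)[0, 1]

# Regression slope flips sign, intercept and fitted values do not change.
b, a = np.polyfit(x, y, 1)
b_f, a_f = np.polyfit(-x, y, 1)
same_fit = np.allclose(b * x + a, b_f * (-x) + a_f)
```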

But No, upside-down is upside down, with varve X-Ray Densities or with any other candidate thermometer-by-proxy. A proxy flipped upside-down by a computer instead of by the hand of Man [sic] is still upside-down!

A proxy isn’t just a string of numbers; it relates to some physical attribute that – presumably – itself relates to climate.

More here on an illustration tailored to the Lake Korttajarvi XRD series.

Question at RealClimate — “Is it now the consensus opinion that the Tiljander series cannot be meaningfully calibrated to modern temperature records?”

[Response: I have no idea. But if it can’t then it clearly can’t be used in the approach Mann and colleagues use. Just as well then that they already tested what difference leaving it out makes (answer? very little). – gavin]

This seems just in time to keep the story on UEA emails alive. As in: the person who says this in a private email, now says this in a peer-reviewed publication. It might be useful if people who have been reading the emails could provide a list of links relevant to this paper.

The red data and red trendline are the CENSORED version, which shows a downtrend, particularly in the 20th century (“hide the decline”?).
The green data and green trendline are the FIXED version, which shows a slight uptrend, particularly in the 20th century.
The blue data and blue trendline are the “BACKTO_1400” file – no other label – which also shows a slightly stronger uptrend over the whole period than the FIXED version, but in the 20th century shows a slight downtrend, less than the CENSORED version.

It is also clear that the 3 datasets vary markedly in either direction on a yearly basis throughout the entire 1400-1980 period!

Re: Mark (#100), I replotted the BACKTO_1400-CENSORED vs. BACKTO_1400-FIXED vs. BACKTO_1400 data files using the 20th century data only (1900-1980) and found the same trends noted, only amplified by a factor of 2.5!

Re: Mark (#107), I meant to mention the near-perfect inverse symmetry of the linear regression lines (the regression equation for the CENSORED data is f(x) = 0x – 2.39 and the regression for the FIXED data is f(x) = 0x + 2.41). I would wager that selecting a slightly different date range would produce perfectly inverse regressions – i.e. The Flip. What are the chances that the “adjustments” between the CENSORED and FIXED datasets are anything other than flipping the data?
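For what it’s worth, the pure-flip hypothesis is exactly what least squares predicts: if one series is the negation of the other, its fitted line is the exact negation (both slope and intercept change sign). A synthetic check, not the BACKTO_1400 files themselves:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1400, 1981, dtype=float)
# Toy "CENSORED" series: an offset plus a small random walk.
censored = -2.39 + rng.normal(size=t.size).cumsum() / 100.0
fixed = -censored                      # hypothetical pure flip

c_slope, c_int = np.polyfit(t, censored, 1)
f_slope, f_int = np.polyfit(t, fixed, 1)
# The fitted lines are exact negations of each other.
```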

Varved lake sediments are influenced by a number of factors in this Finnish region. A few important ones are farming and forestry, which influence the mineral-matter accumulation. “Primitive” farming was practiced from time to time as early as some 3000-4000 years ago. However, systematic farming there was introduced only around AD 500 (clayey and slightly hilly areas are good for farming – just the landscape that produces nice varved sediments). I doubt that these factors have been studied in detail even by Tiljander et al, much less by Mann et al. On top of this we have water-level fluctuations caused by post-glacial rebound, and potential man-made fluctuations. Whoever is interested can find the Korttajarvi site on Google Earth and then surf into the Finnish natural history museum’s register to compare how dense the various habitats etc. are in the area (coordinates given). This will give a clue about the potential disturbances in the area. The impact of such disturbances on the varve characteristics must be evaluated before any sound conclusions can be made.

Steve. I hope your move to wordpress is only temporary. WordPress is not available in China as it does not pass through the Great China Firewall. Am I missing anything? I don’t want your site to go alongside Wattsupwiththat which is also banned (for some reason) in China and has been for some time.

In the FOIA2009 collection, there’s a file called briffa_sep98_e.pro that “APPLIES A VERY ARTIFICIAL CORRECTION FOR DECLINE”. For a graph of the effect of these corrections upon a tree ring series, see here. For details of how we obtained the graph, see the Climategate section here (at the bottom). The “corrected” graph looks similar to the hockey stick. Were these corrected tree-ring graphs used to generate the hockey stick?

1. The known temperature data from c. 1850 onwards is presented with no uncertainty. Yet the graphs differ, and surely the fact that significant differences exist should attach some uncertainty to each “known” dataset?
2. A proxy temperature record is some other measurement that has a correlation to the temperature record. If the correlation is less than perfect, then should not the uncertainty range increase as one gets further removed from the known data range?
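Re question 2: for an ordinary linear calibration the answer is yes, and the growth is quantifiable. The prediction standard error of simple linear regression is s·sqrt(1 + 1/n + (x0 − x̄)²/Sxx), which grows with distance from the calibration mean. A toy check with made-up numbers (nothing here comes from the papers):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.arange(1900, 1981, dtype=float)        # toy calibration window
y = 0.005 * (x - 1900) + 0.1 * rng.normal(size=x.size)

slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
s = np.sqrt(np.sum(resid**2) / (x.size - 2))  # residual standard error

def pred_se(x0):
    """Standard error of a new prediction at x0 (simple linear regression)."""
    sxx = np.sum((x - x.mean())**2)
    return s * np.sqrt(1.0 + 1.0 / x.size + (x0 - x.mean())**2 / sxx)

inside = pred_se(1940.0)    # inside the calibration window
outside = pred_se(1400.0)   # far outside it -- much wider
```

And note that even inside the window the error never shrinks below the residual scatter s, which is itself large when the proxy-temperature correlation is weak.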

Well, it does seem there is agreement that the Tiljander series are not suitable for the Mann et al. type of study, whether upside down or right side up. Which makes me wonder why they were used again in this ’09 paper.

I have looked at the graph at Mann’s website comparing “original NH CPS” with “NH CPS minus 7” and other comparisons.

The two curves are very similar although not completely identical. As best I can tell from this very hard to read graph, the series without tree rings and the minus 7 without tree rings diverge more.

I have not reviewed Mann’s data and regression calculations and I’m not sure I want to, but the usual way to do this type of calculation is to remove the questioned variable from the model and refit. If you have many highly correlated variables in the model, the predictions are usually quite robust to the removal of a small number of variables, so it may not be the case that the weights for either the 7 questioned series or the tree rings are near zero – rather, when these are removed and the model is refit, the coefficients of the remaining correlated variables adjust to compensate. I have seen situations where you can remove any variable from a multivariate model with many correlated predictors and the predicted values hardly budge. If I read their SI correctly the model has over 1000 variables, so it’s not surprising that removing 7 wouldn’t change anything.

Teasing out the actual drivers of the predictions from a model with 1000 highly correlated variables is probably impossible.

It’s generally poor technique to build a model like this – pick the most suitable regressors (the proxies that you believe to be least contaminated, or most interpretable, for example) and leave out the rest, so that people can actually interpret the model. Or build separate models for different types of proxies so that the way in which they impact predictions can be evaluated. A bit more respect for the principle of parsimony in model construction would be helpful.
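The robustness-to-removal point is easy to reproduce in a toy setting: forty predictors that are all noisy copies of one signal, drop seven, refit, and the fitted values barely move even though individual coefficients shift. Purely illustrative, not Mann et al’s model:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 300, 40
signal = rng.normal(size=n)
X = signal[:, None] + 0.1 * rng.normal(size=(n, p))  # 40 noisy copies
y = signal + 0.1 * rng.normal(size=n)

def fitted(Xmat):
    """OLS fitted values for y on the given predictor matrix."""
    beta, *_ = np.linalg.lstsq(Xmat, y, rcond=None)
    return Xmat @ beta

full = fitted(X)
reduced = fitted(X[:, 7:])     # refit after removing 7 of 40 predictors

rms_change = np.sqrt(np.mean((full - reduced)**2))  # tiny vs sd(y) ~ 1
```

So "removing 7 series changed nothing" tells you almost nothing about the weights those 7 series carried in the original fit.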

I must say Mann and colleagues would benefit from a study of how to create legible graphs; I would recommend “The Visual Display of Quantitative Information” by Tufte. What happened to the dark blue “NH CPS w/o tree rings” series? It disappears a little before 1800 and I can’t find it again. Must be buried underneath the other thick colored curves.

A minimalist sort of model, or set of models, would be of great interest I think.

Steve: what makes you think that they want their graphs to be legible? There’s a long history on this – better to go to the relevant thread at the main blog climateaudit.org and pursue it there.

If you have many highly correlated variables in the model the predictions are usually quite robust to the removal of a small number of variables, so it may not be the case that the weights for either the 7 questioned series or the tree rings are near zero

And so we come full circle, back to the question of “are the Tiljander proxies used in an upside-down orientation?” (Answers: Lightsum Yes; Darksum No; XRD Yes; Thickness Ambiguous.)

Because these series aren’t flat, e.g. here is the 11-year rolling average of the raw Tiljander XRD data (green line, orientation inverted with respect to Tiljander’s interpretation).

If Fig. 8a shows that addition of two inverted Tiljander series (and five others) leaves the overall trend unaltered–and it does for the uncorrected and twice-corrected Fig. S8a–then the figure legend might as well read, “As a GI-GO test of our model, we added some completely corrupt proxies, and the calculated temperature anomalies were unaltered.”

Either (1) the corrupted proxies were effectively unweighted, (2) by a miracle, the corrupted proxies were in excellent registry with the other proxies, or (3)… I can’t think of a #3.

Similar argument for considering the implication of the corrupted 1850-1995 calibration of all four Tiljander proxies.

“As a GI-GO test of our model, we added some completely corrupt proxies, and the calculated temperature anomalies were unaltered.”

That would be an accurate caption from everything I’ve learned. The inclusion of the Tiljander series in any orientation makes no sense, and they must know this, so their inclusion in the ’09 paper is mystifying.

I’m not sure if you realize this, but this post (and many of your others) is completely incomprehensible to interested laypersons not steeped in the details of the papers. If you want to persuade people outside the field, you need to set a context for your terms, notation, and significance.

I am not saying you should do this, or that this is your desire. Rather, I am saying that if you want to persuade people generally of the significance of your statements, you have to define the context and terms more clearly.

Steve: this blog is what it is. There are other excellent blogs that appeal to a more general audience.

With all due respect (seriously), at this blog, if you want to understand the “significance, context, and terms” more clearly, you have to read and study the blog. I encourage you to do so- it’s a very interesting trip.

I’ve tried to make a start at addressing one piece of this, even learning to set up a blog.

One of the narrow questions discussed immediately above is “In their 2008 PNAS paper and the 2009 Science paper, did Mann et al use a certain ‘Lake Korttajarvi lakebed sediment X-Ray Density record’ in an Upside-Down fashion?”

Steve McIntyre, Ross McKitrick, and many posters here have argued, “Yes, they did (by accident).”

Prof. Mann and his colleagues and supporters respond, “Certainly not!”

The “Certainly not!” rebuttal can be confusing, because it takes many forms. “We didn’t use it Upside-Down!” “There’s no such thing as Upside-Down!” “Our sophisticated computer algorithms render the ‘Upside-Down’ question moot!” “Denialists will say anything, no matter how bizarre, to sow doubt about AGW!”

Read the pro and con arguments for yourself. I can’t give a pointer to one location where succinct arguments are collected; try a ClimateAudit.org search for “Tiljander” (pro) and the three Stoat threads listed here (con). Then look at my “Korttyjarvi River” thought-experiment and see if that helps clarify your thinking. (I’m in the “Yes, they did” camp, if you can’t tell.)

By GI-GO AMac means garbage-in/garbage out, i.e. “We (because we didn’t study our source data thoroughly enough) put in some data that can’t be meaningfully fit to the 20th century temp record into the model. Then we were challenged on it and had to take it out, and the model didn’t change much. (sigh of relief).”

Lots of good references at AMac’s blog. The original Tiljander paper is an interesting climate study with no obvious political bias.

It does seem like sloppy bookkeeping to create a model which is a linear combination of various virtual thermometers but have some of them upside down; it makes the model much more difficult to interpret. But the Tiljander data is of no use at all for Mann’s purposes regardless of its orientation, which becomes extremely clear to anyone who spends half an hour reading her paper.

There is plenty of stuff in this thread on “weighting”. What I would really like to see is a list of the sites with the relevant weighting factor that was used by the data analysts.

In Mann’s original (1998) HS data weights were given. In fact, Steve, you might remember sending them to me (thanks again). The numerical values seemed to me to be either idiosyncratic or bizarre, depending on how charitable one felt. What nevertheless surprised me was that when I examined the data ignoring the weights I found little difference in the outcomes. Maybe my analyses were faulty, but testing my weighted regression code against textbook examples indicated that the arithmetic was quite OK.

Thus my sense of surprise at the emphasis here on weights. Perhaps someone can explain.
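One possible explanation, sketched: when the weights don’t correlate with anything systematic in the data, weighted and unweighted least squares give nearly the same answer. A toy example with arbitrary random weights (not the MBH98 weights):

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 1.0, 100)
y = 1.5 * x + 0.1 * rng.normal(size=100)

w = rng.uniform(0.5, 2.0, size=100)   # "idiosyncratic" weights

# Ordinary least squares.
A = np.column_stack([x, np.ones_like(x)])
beta_ols, *_ = np.linalg.lstsq(A, y, rcond=None)

# Weighted least squares via the usual sqrt-weight transformation.
sw = np.sqrt(w)
beta_wls, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)

slope_gap = abs(beta_ols[0] - beta_wls[0])   # small relative to the slope
```

If the MBH98 weights behaved like this – effectively noise uncorrelated with the series – that would explain why ignoring them made little difference.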

I appreciate your attempts at clarification in this one instance, but the issue isn’t whether I particularly understand or do not understand the post. The point I am making is that this article and much of the blog generally is not readily comprehensible, even at a high level, to anyone not steeped in the literature. Whether you want this to be the case or not, I cannot say.

I am drawing this fact to your attention for two reasons. First, it is very common for researchers who have been working for a long time in one area not to realize how arcane their pronouncements in that area sound – you may simply be writing obscurely entirely inadvertently. Second, there will likely be more traffic to this blog because of climategate, and, if you had not realized how opaque your posts were to outsiders, you might have wanted to modify the presentation for the sake of these outsiders. (On the other hand, by keeping the posts only understandable to experts, you likely avoid much of the low-bandwidth discussion that occurs on more demotic blogs).

By contrast, say what you will about their underlying science, blogs like RealClimate do ensure that virtually all their posts are easy to read for laymen.

Re: fsfsfsfsfsfsfs (#119), That is because RC boils the topics down to oversimplified pronouncements that do not properly reflect the underlying issues. The topics are complex and scientific discussion of them must necessarily reflect that complexity. Unlike RC, this blog is dedicated to the science, not the policy.

I agree with Layman Lurker. Bishop Hill does an excellent job of distillation. (he is a Scotsman after all) 🙂

More importantly there are many other sites which are more accessible such as Wattsupwiththat, ICECAP, Jeff Id’s Air Vent, etc, with varying degrees of technical sophistication. But there is only one site that apparently a good number of the climate scientists themselves can’t grasp the details of.
I myself have a hard time following the more arcane statistical stuff but have benefitted greatly by trying.
So let’s let the other sites distill the info and let Steve keep fermenting it, or perhaps according to The Team, fomenting.

Re: Bishop Hill (#125),
Oh my goodness. Sorry for the mistake; I know it can be a bit of a sore point betwixt the occasional Brit and Scot.
As penance let me again thank you for your talent in making sense of an often convoluted story.

then you should take the high road & check the tree line 🙂
i’ll take the low road & make it up as i go (hard work, but you clarify things for me) thanks for your good work on this & continue please.
ps. i’m a Scot & only look for the facts/best science on this.

Steve McIntyre
The Corrector: Steve McIntyre is a mathematician who worked in ore exploration and drew up feasibility studies for the Canadian government. He analysed Michael E. Mann’s
“hockey stick curve” and discovered deficiencies in the mathematics that resulted in a dramatic upswing in the graphic. Mann refused requests to make his data available. No references to the hockey stick were made in the 2007 IPCC report.

Thanks for the pointer to the Bishop Hill article, “Caspar and the Jesus paper”. It’s certainly clear, persuasive and well-written. At some point it might be nice if it were footnoted or provided readily verifiable citations or links to each of its claims.

To Jean S and Steve Mc:
What you can do is write up a small paper in which you detail all the incorrect handling and other unscientific manipulation of the data reported in Mann et al’s papers. Submit it as a comment to Science. If that does not work, go for another journal like the open-access Atmospheric Chemistry and Physics Discussions, where people can openly discuss the pros and cons of any paper submitted. If you have a valid point, this may be the best thing to do. Simply writing blogs like this does help, but it is not loud enough.

It is not only an upside down series; to me, they disregard a rich dataset for many parts of the world. Mann and others’ reconstruction of a colder Central Europe and South America during the MWP is in contradiction to these studies:

Given past discussion and now that Climategate is settling down I wonder if this is worth a revisit.

Despite prior disagreement I personally am still of the opinion that on matters relating specifically to Tiljander we are dealing with an ignorant man.

Probably we can all agree that at the outset he chose Tiljander because he liked the look of the muck without realising it was perverse.

I still take the view that the man did not watch Saturday Night Live and nobody who watched it told him about it. The man admits in Climategate that he rarely looks here.

For a long time until September 2009 Saturday Night Live was the only broadcast where the perverse nature of the muck was highlighted.

In the paper that Steve and Ross wrote they had limited space so they only explained Tiljander was upside down. They did not describe that Tiljander was perverse.

The man in his reply thought he had a gotcha – of course you can have upside down muck!

Privately, the man did have a problem with upside down muck. According to his own methods, the upside down muck should have been dumped.

In his new paper the man was still determined to use the upside down muck because it still looks really nice. Still had no knowledge that the muck is perverse.

In order to validly use the muck he invented a two sided bucket so Steve and Ross have no further grounds for complaint.

In September 2009, following Steve’s intervention (and according to Climategate), Kaufman explained to the man’s friends that there is an issue with upside down muck. He is encouraged to contact the man (who we know has an awful temper and reacts badly to criticism) and appears to do so, with some trepidation, in a third email.

Exam question – where in the literature is it described that Tiljander is perverse and how it impacts the three studies to date that use it?

[…] highly critical reviews of his work 5. Need to prove that it is ok for Mann to continue to flip temperature proxies upside down even in his latest papers, even though this egregious error has already been pointed out to him in the past and which he […]