Hanno is the pseudonym of a Wikipedia contributor. The graphic itself compares CO2 levels from Mauna Loa and the Law Dome ice core to a splice of the HadCRUT temperature index and the Jones and Mann 2004 reconstruction.

[Update: Moving right along – the Jones and Mann 2004 reconstruction (= Mann and Jones 2003) uses both Yamal and Mann’s PC1.]

The latter splice is, of course, the splice that Mann has informed us is never done by responsible climate scientists, further informing us that the allegation that such splices are done is disinformation by fossil fuel companies.

No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.

For the temperature data, see Global temperature 1ka.png. CO2 levels are based on historical carbon dioxide records from ice cores drilled at the Law Dome in Antarctica, published on the web by D.M. Etheridge, L.P. Steele, R.L. Langenfelds & R.J. Francey (1998) as “Historical CO2 records from the Law Dome DE08, DE08-2, and DSS ice cores”. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A. [http://cdiac.esd.ornl.gov/trends/co2/lawdome.html]

For the years 1000–1880, temperature estimates were taken from P.D. Jones & M.E. Mann (2004): “Climate over past millennia”. Reviews of Geophysics, 42, article number RG2002. For the remainder, temperatures are based on instrumental records published on the web by P.D. Jones, D.E. Parker, T.J. Osborn & K.R. Briffa (2005) as “Global and hemispheric temperature anomalies – land and marine instrumental records”. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center, Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, Tenn., U.S.A. [http://cdiac.esd.ornl.gov/trends/temp/jonescru/jones.html]

However, the point of my comment is that no further or full reference was provided by the UN in the report which used it. There is therefore no trail to follow in the UN report.

As I said, I had to Google for “Hanno 2009” and “temperature” to find the source, and I was expecting to find a scientific paper. On a thread about peer review, I hope you can see how this is an issue. If an undergrad used a Wikipedia graph, they would be in trouble.

I’d have thought the IPCC could have done better than that, especially since the graph is being shown to provide some kind of visual proof of correlation. The implication is that the flat shaft of both hockey sticks represents the halcyon days of yore (i.e. pre-industrial).

If the undergrad used a Wikipedia graph without providing the proper URL credit to the Wikipedia source, as UNEP did here, he or she would be in trouble. Just “Hanno 2009” is not an adequate citation. Still, http://commons.wikimedia.org/wiki/File:CO2-Temp.png is a pretty lame source even for an undergraduate paper, even if it does contain a link to another page with a link to a CDIAC data file.

This is worth noting but is getting far from the original topic. Perhaps Steve might want to make it a new thread, with a comment about the Jones Parker and Briffa reconstruction?

To show that I’ve not “got the wrong end of the telescope”: I completely agree. I go one further and state that it is misleading, because the impression it gives is that it came from the primary published literature. But the full reference is missing, and it turns out that it did not come from the primary published literature (although the source data did). The question remaining is whether there is, in the primary published literature, any graphic that overlays CO2 onto temperature that could have been used in place of the Hanno graphic? i.e. Was Hanno just convenient, or does this graphic combine data in a way that no credible authority ever would?
Contrast this story with Joe Bastardi using Ryan Maue’s graphic. Should Bastardi credit Maue for the graphic, or the people who actually counted storms over the years and generated the data that Maue used? The UN did exactly what Maue asks: credit the graphic artist.
The cases are not identical, of course. But this only illustrates how trivial the issue really is. Skywalker had to spend 2 minutes doing a Google search to track down the true data source. Big deal.

It wasn’t missed the first time. I simply ignored your point because it is, as I pointed out, trivial. Rather, I chose to emphasize a different point regarding your editorial discretion.
Any further “startling insights” of your own to report, or is this it?
This is how you treat people who decide to agree with you?

Bender, you did add “big deal” to your supposed agreement, and now “trivial”. If you hadn’t gone off on a hostile tangent with 24, 25, 26, we’d have seen eye to eye. My post wasn’t sensationalised by omission or editorial discretion – you were simply attacking a point I wasn’t even trying to make.

Do we agree:

a) It’s generally bad practice not to cite a reference, in any field. In this case it is supposition as to why the reference happens to be a Wikipedia one, which would generally have carried less credibility.

b) The graph is used to insinuate a point which CA has (as I know you’re aware) been looking at for ages. Hu’s splice comment adds even more to this issue. Is this a scientifically valid graph – back on topic – peer-reviewed etc.? Can the two graphs simply be superimposed?

Finally, whatever the source, the fact that a 2009 report uses a figure whose trend ends in 2001/02 could also be called convenient, since we do have more recent data.

In academic literature, a reference is a previously published work which has been used as a source for the theories or claims referred to in the text. References contain complete bibliographic information so that the interested reader can find them in a library.

No researchers in this field have ever, to our knowledge, “grafted the thermometer record onto” any reconstruction. It is somewhat disappointing to find this specious claim (which we usually find originating from industry-funded climate disinformation websites) appearing in this forum.

Now his quote probably pre-dates his own work, but RegEM, which is used in the EIV method of Mann08, does graft the spaghetti noodles together onto the obviously perfect temperature record.

Re: Jeff Id (#15),
Note how he proves the medieval warm period on page 5 (http://www.meteo.psu.edu/~mann/shared/articles/MannetalPNAS08.pdf) and also grafts a five-year moving average of HadCRUT temperature onto a 40-year moving average temperature reconstruction. Of course, the HadCRUT 40-year average temperature graph would look utterly unimpressive on this graph for the purposes of “unprecedented”!

I’ve now downloaded the UNEP Science Compendium. Since reading just a few pages of the prefatory material made me woozy, I’m wondering if there are particular parts of the compendium which may be worth reading, either for edification or for yucks?

Note that the reconstructed portion is heavily smoothed — perhaps with a 50 or even 100 year effective period, while the instrumental portion is much less smoothed, giving the impression of much greater volatility in the past century.

If the instrumental data were comparably smoothed, the smoother would have to end half the filter width before the end of the series to be legitimate, and so would not show the 1990s peak except as it averaged into the smoothed data.

Even these CIs are probably somehow understated, but the current warming would have looked a lot less impressive if it had been compared to this series after smoothing and truncation at half the filter width.
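Hu’s end-point argument is easy to see numerically. A minimal sketch (toy data, not the actual HadCRUT series): a centered moving average of width w has no legitimate value in the last w//2 years of the record, so a ~40-year smoother applied to data ending in 2000 stops around 1980 and never sees the 1990s peak.

```python
import numpy as np

# Sketch of the end-effect described above: a centered moving average of
# width w is only defined where the full window fits, so the last valid
# smoothed point sits w//2 steps before the end of the series.
def centered_ma(x, w):
    """Centered moving average, 'valid' points only."""
    kernel = np.ones(w) / w
    return np.convolve(x, kernel, mode="valid")

years = np.arange(1900, 2001)          # 101 annual values (toy series)
temps = 0.01 * (years - 1900)          # toy linear warming, deg C
w = 41                                  # ~40-year filter, as in the post

smoothed = centered_ma(temps, w)
last_smoothed_year = years[w // 2 : len(years) - w // 2][-1]
print(len(temps), len(smoothed), last_smoothed_year)  # 101 61 1980
```

With a 41-year window on 101 annual values, only 61 smoothed points exist and the last one is centered on 1980, twenty years short of the end of the record.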

Hanno’s two trendlines are discontinuous at 1900, allowing the trend since 1900 to be higher than it would be if the two trendlines were piecewise linear but continuous (a “1st-degree spline”).
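A toy sketch of the two fitting choices (synthetic numbers of my own, not Hanno’s data): independent per-segment OLS lines are free to jump at the 1900 knot, while the continuous piecewise-linear fit (the “1st-degree spline”) uses the truncated-power basis {1, x, (x−1900)+} and must pass through a single value there. Because the two free lines nest the spline, their fit error is never larger, and any jump in the data is absorbed by the intercepts rather than forced into the post-1900 slope.

```python
import numpy as np

# Toy illustration (made-up numbers) of the discontinuity point: two
# independently fitted trendlines can jump at the 1900 breakpoint,
# while a 1st-degree spline forces the segments to meet there.
rng = np.random.default_rng(0)
x = np.arange(1000.0, 2001.0)
k = 1900.0
truth = np.where(x < k, 0.0, -0.2 + 0.008 * (x - k))  # jump down at 1900
y = truth + rng.normal(0.0, 0.1, x.size)

# Two independent OLS lines -- free to be discontinuous at the knot:
c1 = np.polyfit(x[x < k], y[x < k], 1)
c2 = np.polyfit(x[x >= k], y[x >= k], 1)
jump = np.polyval(c2, k) - np.polyval(c1, k)
sse_indep = (np.sum((np.polyval(c1, x[x < k]) - y[x < k]) ** 2)
             + np.sum((np.polyval(c2, x[x >= k]) - y[x >= k]) ** 2))

# Continuous 1st-degree spline via a truncated-power basis:
B = np.column_stack([np.ones_like(x), x, np.maximum(x - k, 0.0)])
beta, *_ = np.linalg.lstsq(B, y, rcond=None)
sse_spline = np.sum((B @ beta - y) ** 2)

print(round(jump, 2), sse_indep < sse_spline)
```

The independent fits recover the built-in jump of about −0.2 at the knot; the spline cannot, so its residual error is larger and its second-segment slope is distorted instead.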

It’s intriguing that the CI shrinks dramatically in the 1800s. Is this because of the smoothing which is looking forward to the zero-CI (relatively speaking) of the instrumental portion?

It seems Hanno’s full name is Hanno Sandvik, a researcher at the Norwegian University of Science and Technology with a PhD in biology/ecology. So for some reason UNEP used his first name in the graph caption; I guess they’re all buddies :)

So, is he really the source for this graph? At least he claims so himself. I’m afraid the text is in Norwegian, published in a Norwegian popular science magazine. I don’t think they do much peer review though.

He has listed his publications in Norwegian and English, the English list is shorter.

What would you expect from a UNEP report whose first sentence (by UNSG Moon) says: “The science has become more irrevocable than ever: Climate change is happening.”

1. “Irrevocable” is an absolute adjective and does not admit of more or less.
2. If it’s science, it’s never irrevocable.
3. When was climate change not happening?
4. When did science ever indicate that climate change was not happening?

(a) Right, the greenhouse effect is not finally proven. But there is more to it: the greenhouse effect is unprovable. As are all other scientific theories. There is some evidence for a human-made global warming. Of course there are other hypotheses explaining global warming, but the greenhouse effect has so far not been falsified, that is why we have to take it seriously. Obviously, the geochemist hasn’t heard of Popper.

(b) Politicians who believe in the greenhouse effect are not following a prophecy; they are acting according to the precautionary principle. The probability that the greenhouse effect is a “true” hypothesis is not 100% – but it never will be, no matter how long you wait for more evidence. And we have no time to wait longer, because the danger of the greenhouse effect is too big. In other words: these politicians believe that the risk is too great to be taken. (By the way: where are those great politicians the geochemist is complaining about?) Obviously, the geochemist hasn’t heard of risk, nor of the precautionary principle.

(c) Yes, it has been warmer in earlier times, for example when the Vikings lived. But other things were different at that time, too: the world population, for example. A higher sea level has much more serious consequences today, with not only coastal regions densely populated, but also the regions behind them. The Vikings could simply avoid drowning by retreating ahead of the rising ocean, moving up the landscape. But what was possible for them will provoke ethnic conflicts today. Obviously, the geochemist committed a simple logical mistake, deriving one statement, “X is ok under circumstance Y”, from another, “X is ok under circumstance Z”.

Good point! Since the hockey stick, “science” says climate was stable for 1000 years. This shows how much you can do with noisy data. And the worse the method, the higher the noise to signal ratio, the flatter the line, and the more “stable” the past climate.

Funny thing is, it’s almost the opposite with the future. There, the lousier the method, the worse the climate model, and the faster things go haywire. The more things you get wrong, the more unstable the result, and the more “catastrophic” the future climate.

If we look a little more closely, the “stable” climate had a slightly falling trend. Now if the proxies were good, we could not predict what they might do outside the “correlation period” in which they have been selected for shooting up. But if the proxies were practically 100 per cent noise, then the “shoot up” phase would have been fortuitous and the proxies would gradually regress to the mean outside that phase. If you went back, say, 1000 years, they would be back up at the midpoint of the range they covered in the “shoot up” phase at the end. What do you know, that’s just where they are – at 13.75 degrees (omitting the last uptick which does not come from the proxies but from the spliced-in “instrumental record”).

Well well well – so lousy science plus rubbish data would produce a hockey stick. Would never have guessed.
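That regression-to-the-mean argument can be checked with a toy simulation (entirely my own sketch, not a reconstruction of anyone’s published method): screen pure red-noise “proxies” for correlation with a rising calibration-period target, orient the survivors, and average them.

```python
import numpy as np

# Toy demonstration: select AR(1) noise "proxies" by correlation with a
# rising "instrumental" calibration period, flip signs to match, and
# average. The composite shoots up in the calibration window and
# regresses to a flat shaft everywhere else -- a hockey stick from noise.
rng = np.random.default_rng(42)
n_years, n_proxies, cal = 1000, 500, 100   # last 100 yrs = calibration
target = np.linspace(-1, 1, cal)           # rising instrumental target

def ar1(n, phi, rng):
    x = np.zeros(n)
    e = rng.normal(size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x / x.std()

proxies = np.array([ar1(n_years, 0.9, rng) for _ in range(n_proxies)])
corr = np.array([np.corrcoef(p[-cal:], target)[0, 1] for p in proxies])
keep = np.abs(corr) > 0.3                  # the screening step
composite = (proxies[keep] * np.sign(corr[keep])[:, None]).mean(axis=0)

print(int(keep.sum()), float(composite[:-cal].std()))
```

The composite rises through the screening window and flattens toward the mean everywhere else – the shaft-plus-blade shape – even though every input series is noise; averaging the independent pre-calibration segments crushes their variance toward zero.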

Interestingly (?), Hanno is also the author of a non-hockey stick graph that can be found in the Norwegian Wikipedia article about the Migration Period (Norwegian: Folkevandringstiden) that shows a more Lamb-like relationship between the MWP and the current warm period:

Since I can read Norwegian I can see that Hanno Sandvik is a very, very, very PC person. I am somewhat less impressed by his scientific acumen. Among other things he claims that for each degree of warming, mortality will increase by 4% among auks in the Barents Sea. Living as I do on the Baltic, which is much warmer than the Barents Sea, and where the same species of auks are thriving as never before in historical times, I tend to think that overfishing may be rather more important than climate change in this particular case.

It’s quite amazing how many people pretend that Mann’s hockey stick curve has not been proven wrong. It’s also amazing how many people seem unaware of the fact that stomata data give a completely different picture of past CO2 levels than ice core data.

Hu is right: this is a splice, and to properly smooth the end of the graph, just as the earlier record is smoothed, requires waiting at least 20 to 25 years. The only other problem is that it concentrates only on CO2. Even if CO2 provided some warming, it does not provide all of it and the other factors should be accounted for, as well. So, even if this spliced graph could be smoothed, it would still be inaccurate.

The only other problem is that it concentrates only on CO2. Even if CO2 provided some warming, it does not provide all of it and the other factors should be accounted for, as well.

CO2 actually enters three ways: First, it may be causing some GHG warming. Secondly, warming (caused by solar activity or something else) may be releasing CO2 from the oceans. And third, CO2 stimulates tree growth (and biosilica growth in Arctic lakes), and therefore may account for some of the correlation between tree rings and temperature in the calibration period. But since Mann and Jones didn’t take this into account, their reconstruction as used by Hanno in the UNEP report may be erroneous, even apart from the even more glaring stripbark issue. (MBH 99 did do a bogus calculation they said compensated for CO2 fertilization, but in fact they were just hand-shaping the HS shaft.)

But Ron is right that other factors, like solar activity, may be causing warming. Since atmospheric C14 is apparently negatively related to solar activity, and we have a good record of it even back before anyone was counting sunspots from the C14-dating dendrocalibration literature, it might make a good proxy for paleo temperature to be added to the others. C14 has been used occasionally as a temperature proxy — Lonnie Thompson actually used the Wolf Minimum to date his Kilimanjaro ice cores, but it has generally been neglected. Since 1950, however, the C14 solar signal has been entirely masked by nuclear testing, so we can’t use it to account for the recent warming.

Looking at the (presumably) raw CO2 data at http://cdiac.esd.ornl.gov/ftp/trends/co2/lawdome.combined.dat it seems that, in the first two sets of data, the age of the air is always 30 years less than the age of the ice. So ice crystals dated to 1948 are measured in 1993 and the air in them is found to have an average age of 1978. Likewise, ice crystals from 1802 are measured in 1993 and their air has an average age of 1842.

This doesn’t quite make sense. I assume there is diffusion of air in both directions, from below new ice and from the atmosphere above the ice. So the ice that accumulated in 1802 had older air diffusing from below and newer air from above. This diffusion is presumably a rapid process. I am not talking about molecular diffusion through an ice lattice, which is slow but continuous, but rapid diffusion through air pockets in unpacked ice. If the 1802 ice has 30-years-younger air in it, then it had diffusion from below for decades as well. Thus the distribution of air ages in the 1802 sample came from a range of perhaps 1782 to 1902 with an average age of 1842. I realize that the distribution is heavily skewed to the right, since diffusion from above will be more rapid than from below, so change that assumption to 1782 to 1872 with a weighted average of 1842.

What that would mean is that the 1948 ice would have an air age distributed from 1928 to 2018. Obviously this is impossible when the measurement was taken in 1993. The larger point I want to make is that the measurements are a right-of-center-weighted average of perhaps 100 years total range of air ages (and double that for centuries-old ice). This measurement is completely inappropriately spliced to an exact instrument measurement. At least with the temperature proxy hockey sticks, the proxy measurements (e.g. tree ring size) are assigned to single years although they may be a poor representation of temperature for some or many of those years. The air from ice measurements are not specific to a year but a weighted average of many years. So it is a much poorer practice to splice them to recent instrument measurements.
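The weighted-average arithmetic above can be made concrete with made-up weights (the exponential skew and the 30-year scale are illustrative assumptions of mine, not measured firn physics):

```python
import numpy as np

# Illustration with made-up weights: a right-skewed distribution of air
# ages in ice nominally dated 1802, weighted so that younger (later) air
# dominates, gives a mean air age a few decades after the ice date.
ages = np.arange(1782, 1873)                    # candidate air ages
weights = np.exp((ages - 1782) / 30.0)          # skew toward younger air
weights /= weights.sum()
mean_age = float((ages * weights).sum())
print(round(mean_age))   # a few decades younger than the nominal 1802
```

With this particular skew the weighted mean lands in the mid-1840s, in the spirit of the 1842 figure; the point is only that a single reported “air age” stands for a broad, lopsided distribution of years.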

Eric, once the bubbles in the ice cores are closed, there is simply no migration anymore. At that point the ice is 40 years old (for the Law Dome ice cores). What matters is that there is gas migration top-down for the upper 70-odd meters until full closing, which means that the average gas age at that depth is only 10 years. After full closing, the 30-year difference between ice age and gas age doesn’t change anymore, no matter when you do the measurements, even for the 800,000-year-old Dome C ice core (which has a gas-ice age difference of several hundred years).
See: http://www.ferdinand-engelbeen.be/klimaat/klim_img/law_dome_overlap.jpg

BTW, did you notice the very fuzzy, poor quality of the (badly scanned?) artwork, like the drawings and graphs?
And please, read carefully the caption of fig. 1.1 explaining the greenhouse effect. It seems the visible part of solar radiation does not cause any heating anymore; only UV does! So let’s cover the globe with UV blockers, and all problems will be solved…

I thought all this guff had been consigned to the dustbin of history.
Wikipedia? It need not be said. I use a PC but it doesn’t use me; computers are a useful tool, and tools can be used for falsification and downright skewed purposes.

I volunteer teach a computer class, and this is one of the principles I stress to my class from day one. I feel the need to stress this because from my experience, a significant number of people out there do not understand what the relationship between computer and human being is supposed to be.

In 2008 the World Climate Research Programme and the World Weather Research Programme convened a group of experts to review the current state of modelling, and to suggest a strategy for seamless prediction of weather and climate, from days to centuries. A major conclusion of the group was that projections from the current generation of climate models were insufficient in providing accurate and reliable predictions of regional climate change, including the statistics of extreme events and high impact weather, which are required for regional and local adaptation strategies. A new modelling system has been designed that predicts both internal variability and externally forced changes, and forecasts surface temperature with substantially improved skill throughout a decade, globally and in many regions (WMO 2009).

Two points. It seems first of all that this shows that modeling can’t be used to verify calibration of climate data or proxy data. Secondly, note the phrase “has been designed”. Since this review was performed sometime in 2008 and its report presumably issued at least months later, just how much “design” and at what level of detail can have been produced since then?

Anyone have any info about this “programme”? And when they admit regional climate change prediction leaves something to be desired, are they also saying that global climate change is, contrariwise, easier and more accurate?

In defense of Wikipedia, 90% of the time I have found that its articles are very informative, and often even correct.

But the nature of Wikipedia is that anyone can edit it, so it’s easy to amend say the Antarctica article to state that Antarctica is made of Green Cheese, and that is what it will say until (and if) someone takes the trouble to change it.

So, look to Wikipedia for basic information, and credit it if it is helpful, but check its references for the real story.

If UNEP finds that a Wikimedia graph is exactly what is appropriate, there is no reason it should not use it, with art credit to the (usually pseudonymous) Wiki contributor. In the present instance, UNEP did not make it clear that its art source was Wikimedia, however.

If UNEP finds that a Wikimedia graph is exactly what is appropriate, there is no reason it should not use it, with art credit to the (usually pseudonymous) Wiki contributor.

What I find hard to believe is that an organization the size of the UN, which routinely publishes documents for internal and public use, would not have a team of graphic artists on staff, or at least on contract.

Can someone please explain how such a “science” report passed any serious peer review process? Are the pro-AGW believers getting so desperate now that they will use falsified information to push their case? I might believe this was an honest mistake, but I don’t, for the simple reason that too many people must have been involved in producing the report. Surely at least one of them would have understood it was in error. Or could it be they knew of the error but deliberately ignored it? Why?

They don’t obey their own rules – or at least, their subsets don’t obey the rules defined by other subsets, despite all the subsets’ shared claims of a consensus. It’s bad that this discussion is not really about science because so many players in it are so manifestly dishonest.

Moreover, this episode shows that climate science – and maybe not just “alarmist” climate science – doesn’t require too much expertise. The “splicing” is bad but the overall quality of this work is not too different from the stuff that is being routinely published.

RE #30,
My bad — the reconstruction is Jones and Mann (R Geoph 2004), not Jones, Parker and Briffa 2005. The latter is the instrumental series.

In any event, Hanno’s graph of temperatures without CO2 splices the two series together as if they were one, something that the Team supposedly never indulges in. In fact, the reconstructed portion is heavily smoothed — perhaps with a 50 or even 100 year MA, while the instrumental portion is much less smoothed, giving the impression of much greater volatility in the past century.

Ferdinand, thanks for that explanation. I assume the age of the ice comes from counting annual layers, which is precise. But is there any way to independently date the CO2 from the bubbles, other than working backwards from the mixing ratio? It seems quite sound to do that for the period of instrument readings and thus determine the sealing time. But do we have a long enough period of instrument readings to perform that calibration?

Because the average gas age at closing depth is 10 years (and not zero), there is indeed a distribution of gas ages in the newly closed bubbles. Also, the bubbles obviously don’t close instantaneously. Do we have any estimate of what that distribution looks like? This is important because we are still comparing smoothed values to the instrument readings in the chart.

But I must add the caveat that just because the CO2 measurement is smoothed doesn’t mean there were natural fluctuations in the past, only that if there were fluctuations we would not see them. That alone means the splicing is inappropriate.

Eric, much work was done in the early days of ice core analyses to determine the gas age and its spread at closing depth. Gas diffusion is mainly a matter of pore diameter and diffusion speed; this can be calculated, and the theoretical calculation was confirmed by measuring CO2 levels at different depths in the still-open pores of the firn of the Law Dome ice cores. Two of the three Law Dome ice cores have a mean gas-age averaging of about 8 years over the full transition of bubble closing. The third (smaller layers, a longer closing period) has one of 40 years. Vostok has several hundred years of smoothing…

Further, there is some (small) enrichment of the heavier molecules/atoms with depth (caused by gravity), for which a correction is made. For the coldest and highest places inland, like Vostok, the layers of snow are very tiny, which means that one needs more pressure (= more depth) and thus more years/ice layers before the bubbles start to close, and more years before all bubbles are closed, thus more smoothing. To make it even more complicated, during the cold glacial periods the difference between ice age and gas age increases further, due to less precipitation and colder temperatures… Thus determining the gas age – ice age difference and the spread, especially for the longest time spans, is not so easy.

Despite that, the CO2 ranges of completely different ice cores overlap each other, but with increasing smoothing towards the past, as longer time spans come from thinner layers.

The Law Dome ice cores with 8 years of averaging, and the one with 40 years of averaging, overlap with the direct measurements at the South Pole for the same average gas age, but indeed any relatively small variability in CO2 will not be noticed in the ice cores… The current year-by-year natural variability of +/- 1.5 ppmv is in any case completely smoothed out, but the trend is still highly visible.
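The smoothing-versus-trend behaviour Ferdinand describes can be illustrated with a synthetic series (illustrative numbers only: a linear trend plus a ±1.5 ppmv interannual wiggle, and a crude 8-year boxcar standing in for the real gas-age distribution):

```python
import numpy as np

# Sketch: convolve a synthetic atmospheric CO2 history (trend plus a
# +/-1.5 ppmv short-period wiggle) with a crude 8-year gas-age window.
# The wiggle is largely averaged out; the trend survives.
years = np.arange(1900, 2001)
co2 = 295 + 0.8 * (years - 1900) + 1.5 * np.sin(2 * np.pi * years / 3.0)

window = np.ones(8) / 8.0                 # stand-in 8-year age distribution
ice = np.convolve(co2, window, mode="valid")
ice_years = years[: ice.size]

trend_atm = np.polyfit(years, co2, 1)[0]
trend_ice = np.polyfit(ice_years, ice, 1)[0]
wiggle_atm = np.std(co2 - np.polyval(np.polyfit(years, co2, 1), years))
wiggle_ice = np.std(ice - np.polyval(np.polyfit(ice_years, ice, 1), ice_years))
print(round(trend_atm, 2), round(trend_ice, 2),
      round(wiggle_atm, 2), round(wiggle_ice, 2))
```

The short-period wiggle is heavily attenuated while the fitted trend comes through essentially unchanged, which is exactly the pattern Ferdinand describes in the ice-core record.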

“…much work was done in the early days of ice core analyses to determine the gas age and its spread at closing depth. Gas diffusion is mainly a matter of pore diameter and diffusion speed; this can be calculated, and the theoretical calculation was confirmed by measuring CO2 levels at different depths in the still-open pores of the firn of the Law Dome ice cores.”

How can you confirm anything when you have no way of verifying the actual atmospheric conditions when the snow/ice was originally laid down? You are assuming that the conditions thousands, tens of thousands, and hundreds of thousands of years ago are close enough to current conditions that your current experiments are applicable.

I also doubt you have enough control over the ice core to prevent cracks, diffusion, and possibly even chemical changes as it is drilled, retrieved, stored, shipped, and analyzed. I believe early cores also had a problem with contamination from drilling fluid.

“Deep ice is under great pressure. When brought to the surface, there is a drastic change in pressure. Due to the internal pressure and varying composition, particularly bubbles, sometimes cores are very brittle and can break or shatter during handling. At Dome C, the first 1000 m were brittle ice. Siple dome encountered it from 400 to 1000 m. It has been found that allowing ice cores to rest for some time (sometimes for a year) makes them become much less brittle.

Decompression causes significant volume expansion (called relaxation) due to microcracking and the exsolving of enclathratized gases. Relaxation may last for months. During this time, ice cores are stored below -10 °C to prevent cracking due to expansion at higher temperatures. At drilling sites, a relaxation area is often built within existing ice at a depth which allows ice core storage at temperatures below -20 °C.

It has been observed that the internal structure of ice undergoes distinct changes during relaxation. Changes include much more pronounced cloudy bands and much higher density of “white patches” and bubbles.

Several techniques have been examined. Cores obtained by hot water drilling at Siple Dome in 1997-1998 underwent appreciably more relaxation than cores obtained with the PICO electro-mechanical drill. In addition, the fact that cores were allowed to remain at the surface at elevated temperature for several days likely promoted the onset of rapid relaxation.”

The sun may have increased in brightness over the last couple of decades as summarized in the following press release:

Researcher Finds Solar Trend That Can Warm Climate

Ends debate over whether sun can play a role in climate change

Since the late 1970s, the amount of solar radiation the sun emits during times of quiet sunspot activity has increased by nearly .05 percent per decade, according to the study. “This trend is important because, if sustained over many decades, it could cause significant climate change,” said Willson, a researcher affiliated with NASA Goddard Institute for Space Studies and the Earth Institute at Columbia University, and lead author of the study recently published in Geophysical Research Letters.

“Historical records of solar activity indicate that solar radiation has been increasing since the late 19th century,” says Willson. “If a trend comparable to the one found in this study persisted during the 20th century, it would have provided a significant component of the global warming that the Intergovernmental Panel on Climate Change report claims to have occurred over the last 100 years.”

Willson found errors in previous satellite data that had obscured the trend. The new analysis, Willson says, should put an end to a debate in the field over whether solar irradiance variability can play a significant role in climate change.

The solar cycle occurs approximately every 11 years when the sun undergoes a period of increased magnetic and sunspot activity called the “solar maximum,” followed by a quiet period called the “solar minimum.” A trend in the average solar radiation level over many solar magnetic cycles would contribute to climate change in a major way. Satellite observations of total solar irradiance have now obtained a long enough record (over 24 years) to begin looking for this effect.

Total Solar Irradiance (TSI) is the radiant energy received by the Earth from the sun over all wavelengths outside the Earth’s atmosphere. Its interaction with the Earth’s atmosphere, oceans and land masses is the biggest factor determining the Earth’s climate. To put it into perspective, decreases in TSI of 0.2 percent occur during the week-long passage of large sunspot groups across our side of the sun. These changes are relatively insignificant compared to the sun’s total output of energy, but are equivalent to all the energy that mankind uses in a year. According to Willson, small variations like the one found in this study, if sustained over many decades, could have significant climate effects.

In order to investigate the possibility of a solar trend, Willson needed to put together a long-term dataset of the Sun’s total output. Six overlapping satellite experiments have monitored TSI since late 1978. The first record came from the National Oceanic and Atmospheric Administration’s (NOAA) Nimbus7 Earth Radiation Budget (ERB) experiment (1978-1993). Other records came from NASA’s Active Cavity Radiometer Irradiance Monitors: ACRIM1 on the Solar Maximum Mission (1980-1989), ACRIM2 on the Upper Atmosphere Research Satellite (1991-2001) and ACRIM3 on the ACRIMSAT satellite (2000 to present). Also, NASA launched its own Earth Radiation Budget Experiment on its Earth Radiation Budget Satellite (ERBS) in 1984. And, the European Space Agency’s (ESA) SOHO/VIRGO experiment also provided an independent data set during 1996-1998.

In this study, Willson, who is also Principal Investigator of the ACRIM experiments, compiled a TSI record of over 24 years by carefully piecing together the overlapping records. In order to construct a long-term dataset, Willson needed to bridge a two-year gap (1989-1991) between ACRIM1 and ACRIM2. Both the Nimbus7/ERB and ERBS measurements overlapped the ACRIM ‘gap.’ Using Nimbus7/ERB results produced a 0.05 percent per decade upward trend between solar minima, while ERBS results produced no trend. Until this study, the cause of this difference, and hence the validity of the TSI trend, was uncertain. Now, Willson has identified specific errors in the ERBS data responsible for the difference. The accurate long-term dataset therefore shows a significant positive trend (.05 percent per decade) in TSI between the solar minima of solar cycles 21 to 23 (1978 to present).

The ACRIMSAT/ACRIM3 experiment began in 2000 and will carry out long-term solar observations for at least five more years. The instrumentation for the ACRIMSAT/ACRIM3 experiment was the latest in a series of ACRIMs developed for satellite experiments by Willson and the Jet Propulsion Laboratory (JPL) of the California Institute of Technology. JPL operates the ACRIMSAT/ACRIM3 experiment for Willson using its tracking station at the Table Mountain Observatory in California. One of the missions of NASA’s Earth Science Enterprise, which funded this research, is to study the primary causes of climate variability, including trends in solar radiation that may be a factor in global climate change.

The key quote is “The accurate long-term dataset therefore shows a significant positive trend (0.05 percent per decade) in TSI between the solar minima of solar cycles 21 to 23 (1978 to present).” Such an increase is sufficient to explain most of the observed warming.

Even if this result is not correct, there are reasons to believe the sun varies in brightness over decades and centuries. The evidence includes:

1. Variations in sunspot structure and the Earth’s temperature closely followed each other from 1874 to 1976. Sunspot structure provides a measure of the strength of small scale turbulence in the sun and hence indicates long-term changes in solar luminosity.
2. Changes in solar cycle length closely follow changes in sunspot structure and the Earth’s temperature. The changes in cycle length are probably caused by changes in the large scale turbulence of the sun as reflected in meridional flows and hence provide more evidence for solar luminosity changes.
3. Sunspots were particularly long-lived during the Maunder Minimum (1645-1715). This indicates reduced turbulence in the sun and hence reduced luminosity, which is reflected in a cooler Earth.
4. Numerous solar and Earth climate proxies are correlated, indicating the sun is a major driver of climate change.

Please forgive me if it has already been noted; I don’t have time to sift through all 80 comments.
I started to read the UNEP Compendium and stumbled into another factual error. When mapping “Unusual Weather Events” they claimed that the September 2009 value was the “second lowest”. This must be wrong; at least according to JAXA and Nansen Arctic ROOS it is the third!

2. All the graphs must be corrected. There are only two multiproxy reconstruction studies that do not depend on the flawed California pines and Yamal larch chronologies. Steve mentioned them by name about a week ago. And there is the Loehle & McCulloch version – which many refuse to accept.

Did you read the paper or just look at the figure on the website? It is pretty interesting what the recent (last 100 years) trend is in the west Pacific warm pool. For instance, did you see Cravatte et al., 2009? And I wouldn’t say that the Oppo data is a hockey stick… but anyone could see that if they read the paper.

Re: bent-out-of-shape (#94),
It’s an interesting paper that merits attention. That’s why I linked to it. If you would prefer to see the paper linked to a different thread, you are most welcome to do so.

bender – yes, it is interesting. It shows a cooling in the warm pool during the Little Ice Age, and there is also a hydrologic response in the warm pool. It adds to the growing evidence that the MWP and LIA were a reorganization of heat between the northern and southern hemispheres. As a result, the monsoon weakened during the LIA, and the ITCZ is thought to have moved southward; the opposite holds for the MWP. So while the effects of the LIA and MWP were global, the two hemispheres didn’t do the same thing. It was more like heat being transferred from one to the other.

I saved the Earth Systems chapter with the hockey stick Fig. 1.3 from the original September 24 posting. Contact me at GILLESTED@aol.com if you want a copy of my file and also if you want my assessment sent to selected media outlets, congressmen and senators.

The latter splice is, of course, the splice that Mann has informed us is never done by responsible climate scientists, further informing us that the allegation that such splices are done is disinformation by fossil fuel companies.

Mann said he didn’t graft on the “thermometer” record. That is not the same as the CO2 record, or a thermometer record adjusted with CO2 data.

[…] Spot the stick Commenters who threaten anyone while here because they are not smart enough to come up with a better answer will have some due diligence done on them. Foul mouthed lefty posters beware. […]

[…] has turned into a minor embarrassment for the United Nations in the climate debates. As first reported on ClimateAudit.org, the origin of a graph used in last week’s UN climate report, published to coincide with the […]

[…] Don’t Forget to Check Your Work: Last week, a reader wrote in for more information about which 2007 IPCC predictions have proven to be too modest. One of the sources I suggested as reference was the UNEP’s Climate Change Science Compendium 2009, a review of the professional literature for policymakers in advance of the Copenhagen meeting. The next day, thinking about the recently exposed IPCC error, I started checking through the footnotes. In the opening few pages of my hard copy, there’s a reproduction of the famous “hockey stick” graph, showing proxy evidence for temperature and CO2 over the last 1,000 years or so. I saw the reference, to “Hanno 2009,” and looked it up in the bibliography, but it wasn’t there. Another boneheaded fact-checking mistake, I thought. It’s actually worse than that. “Hanno 2009″ isn’t a peer-reviewed journal article at all but a Wikipedia entry (!). Steve McIntyre of ClimateAudit.org had found it last September and written about it here. […]

[…] Suspect Graphs On Wikipedia Posted on April 30, 2011 by simonjmeath This Wikipedia contributor goes by the pseudonym of ‘Hanno’, and has been linked to other dubious graphs on Wikipedia – such as a graph which spliced Mann and Jones’ infamous ‘Hockey Stick’ Graph with the Hadley/CRU temperature index and a graph of CO2 concentration according to ice cores – which even Mann said is something never done by responsible climate scientists (see Steve McIntyre’s analysis: http://climateaudit.org/2009/09/25/spot-the-hockey-stick-n-2/). […]