Paper demonstrates solar activity was at a grand maximum in the late 20th century

Solar activity, as measured by isotope proxies, was higher at the end of the 20th century than at any time in the past 1200 years

A 2010 paper (that I somehow missed) was recently highlighted by the blog The Hockey Schtick and I thought it worth mentioning here even if a bit past the publish date.

The work by Ilya G. Usoskin of the Sodankylä Geophysical Observatory at the University of Oulu, Finland was published in Living Reviews in Solar Physics. The paper examines records from two isotope proxies (10Be and 14C) and finds that solar activity at the end of the 20th century was at the highest levels of the past 1200 years. Excerpts follow along with a link to the full paper.

A History of Solar Activity over Millennia
Ilya G. Usoskin, Sodankylä Geophysical Observatory (Oulu unit), University of Oulu, Finland

Presented here is a review of present knowledge of the long-term behavior of solar activity on a multi-millennial timescale, as reconstructed using the indirect proxy method. The concept of solar activity is discussed along with an overview of the special indices used to quantify different aspects of variable solar activity, with special emphasis upon sunspot number.

Over long timescales, quantitative information about past solar activity can only be obtained using a method based upon indirect proxy, such as the cosmogenic isotopes 14C and 10Be in natural stratified archives (e.g., tree rings or ice cores). We give an historical overview of the development of the proxy-based method for past solar-activity reconstruction over millennia, as well as a description of the modern state. Special attention is paid to the verification and cross-calibration of reconstructions. It is argued that this method of cosmogenic isotopes makes a solid basis for studies of solar variability in the past on a long timescale (centuries to millennia) during the Holocene.

A separate section is devoted to reconstructions of strong solar–energetic-particle (SEP) events in the past, that suggest that the present-day average SEP flux is broadly consistent with estimates on longer timescales, and that the occurrence of extra-strong events is unlikely. Finally, the main features of the long-term evolution of solar magnetic activity, including the statistics of grand minima and maxima occurrence, are summarized and their possible implications, especially for solar/stellar dynamo theory, are discussed.

4.4 Grand maxima of solar activity
4.4.1 The modern episode of active sun
We are presently living in a period of very high sun activity, with a level of activity that is unprecedentedly high for the last few centuries covered by direct solar observation. The sunspot number grew rapidly between 1900 and 1940, with the average group sunspot number more than doubling, and has remained at that high level until recently (see Figure 1). Note that this growth comes entirely from a rising cycle maximum amplitude, while sunspot activity always returns to a very low level around solar cycle minima. While the average group sunspot number for the period 1750 – 1900 was 35 ± 9 (39 ± 6, if the Dalton minimum in 1797 – 1828 is not counted), it stands high at the level of 75 ± 3 since 1950. Therefore the modern active-sun episode, which started in the 1940s, can be regarded as the modern grand maximum of solar activity, as opposed to a grand minimum (Wilson, 1988b).
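The jump in the period averages quoted above (35 ± 9 for 1750 – 1900 versus 75 ± 3 since 1950) is straightforward to reproduce mechanically. Here is a minimal Python sketch of that kind of period-average comparison; the yearly values are hypothetical stand-ins, not the real group sunspot number series.

```python
# Illustrative sketch of comparing period-average group sunspot numbers,
# as in Section 4.4.1 of the paper. The yearly values below are
# hypothetical stand-ins, not the real GSN record.
import statistics

def period_mean(series, start, end):
    """Mean yearly value over [start, end] for the years present in the data."""
    values = [v for year, v in series.items() if start <= year <= end]
    return statistics.mean(values)

# Hypothetical yearly group sunspot numbers (year -> GSN)
gsn = {1760: 40, 1800: 20, 1850: 45, 1900: 30, 1955: 80, 1980: 75, 2000: 70}

print(period_mean(gsn, 1750, 1900))  # pre-industrial average
print(period_mean(gsn, 1950, 2005))  # modern average
```

On the real GSN series the same comparison yields the roughly factor-of-two jump the paper reports.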

Figure 1: Sunspot numbers since 1610. a) Monthly (since 1749) and yearly (1700 – 1749) Wolf sunspot number series. b) Monthly group sunspot number series. The grey line presents the 11-year running mean after the Maunder minimum. Standard (Zürich) cycle numbering as well as the Maunder minimum (MM) and Dalton minimum (DM) are shown in the lower panel.

Is such high solar activity typical or is it something extraordinary? While it is broadly agreed that the present active sun episode is a special phenomenon, the question of how (a)typical such upward bumps are from “normal” activity is a topic of hot debate.

…

6 Conclusions

In this review the present knowledge of long-term solar activity on a multi-millennial timescale, as reconstructed using the indirect proxy method, is discussed.
Although the concept of solar activity is intuitively understandable as a deviation from the “quiet” sun concept, there is no clear definition for it, and different indices have been proposed to quantify different aspects of variable solar activity. One of the most common and practical indices is sunspot number, which forms the longest available series of direct scientific observations. While all other indices have a high correlation with sunspot numbers, dominated by the 11-year cycle, the relationship between them at other timescales (short and long-term trends) may vary to a great extent.

On longer timescales, quantitative information about past solar activity can only be obtained using the method based upon indirect proxy, i.e., quantitative parameters, which can be measured nowadays but represent the signatures, stored in natural archives, of the different effects of solar magnetic activity in the past. Such traceable signatures can be related to nuclear or chemical effects caused by cosmic rays in the Earth’s atmosphere, lunar rocks or meteorites. The most common proxy of solar activity is formed by data from the cosmogenic radionuclides, 10Be and 14C, produced by cosmic rays in the Earth’s atmosphere and stored in independently-dated stratified natural archives, such as tree rings or ice cores. Using a recently-developed physics-based model it is now possible to reconstruct the temporal behavior of solar activity in the past, over many millennia. The most robust results can be obtained for the Holocene epoch, which started more than 11,000 years ago, whose stable climate minimizes possible uncertainties in the reconstruction.

An indirect verification of long-term solar-activity reconstructions supports their veracity and confirms that variations of cosmogenic nuclides on the long-term scale (centuries to millennia) during the Holocene make a solid basis for studies of solar variability in the past. However, such reconstructions may still contain systematic uncertainties related to unknown changes in the geomagnetic field or climate of the past, especially in the early part of the Holocene.

Measurements of nitrates in polar ice allow the reconstruction of strong solar energetic particle (SEP) events over the past five centuries. Together with independent measurements of the concentration of different cosmogenic isotopes in lunar and meteoritic rocks, this leads to estimates of the SEP flux on different timescales. The directly space-borne-measured SEP flux for recent decades is broadly consistent with estimates on longer timescales – up to millions of years – and the occurrence of extra-strong events is unlikely.

In general, the following main features are observed in the long-term evolution of solar magnetic activity.

• Solar activity is dominated by the 11-year Schwabe cycle on an interannual timescale. Some additional longer characteristic times can be found, including the Gleissberg secular cycle, de Vries/Suess cycle, and a quasi-cycle of 2000 – 2400 years. However, all these longer cycles are intermittent and cannot be regarded as strict phase-locked periodicities.

• One of the main features of long-term solar activity is that it contains an essential chaotic/stochastic component, which leads to irregular variations and makes solar-activity predictions impossible for a scale exceeding one solar cycle.

• The sun spends about 70% of its time at moderate magnetic activity levels, about 15 – 20% of its time in a grand minimum and about 10 – 15% in a grand maximum. Modern solar activity corresponds to a grand maximum.

• Grand minima are a typical but rare phenomenon in solar behavior. They occur not periodically, but rather as the result of a chaotic process, within clusters separated by 2000 – 2500 years. Grand minima tend to be of two distinct types: short (Maunder-like) and longer (Spörer-like).

• The modern level of solar activity (after the 1940s) is very high, corresponding to a grand maximum. Grand maxima are also rare and irregularly occurring events, though the exact rate of their occurrence is still a subject of debate.

These observational features of the long-term behavior of solar activity have important implications, especially for the development of theoretical solar-dynamo models and for solar-terrestrial studies.
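The dominant ~11-year Schwabe cycle in the first bullet is the kind of periodicity even a toy spectral analysis can pull out of an activity series. A minimal sketch, assuming a synthetic series (a noisy sine with an 11-year period sampled yearly), not real sunspot data:

```python
# Toy spectral analysis: recover the ~11-year Schwabe period from a
# synthetic activity series (a sine with an 11-year period plus noise).
# This is an illustration, not the reconstruction method used in the paper.
import cmath, math, random

random.seed(0)
n = 128  # years of synthetic yearly data
series = [math.sin(2 * math.pi * t / 11.0) + 0.3 * random.gauss(0, 1)
          for t in range(n)]

def dft_power(x):
    """Naive O(n^2) power spectrum; fine for a sketch of this size."""
    m = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / m)
                    for t in range(m))) ** 2 for k in range(m // 2)]

power = dft_power(series)
k_peak = max(range(1, len(power)), key=lambda k: power[k])  # skip the DC bin
period = n / k_peak
print(f"dominant period: about {period:.1f} years")
```

With only 128 samples the frequency resolution is coarse, so the recovered period lands near, not exactly on, 11 years; longer records sharpen the peak.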

Why should The Team or The Match Committee bother to pay any attention to such information? We are now in the grand-minimum-certainty period of “post normal science”, according to a cluster of geniuses, and there is no place for anything but uncertainty, because without uncertainty there could be no post normal science. Don’t you get that? (/sarc)

It’s not that I agree or disagree with the paper, but SSN is very subjective, and using proxies is also subjective, since the solar community cannot agree on what is or is not a good proxy for solar activity. Also, the graph looks a little too similar to the “Al Gore” graph in that spoof film. Proxies also fall apart after the atom bomb testing. If L and P are correct, then the SSN is not a good proxy, and flux is a better way of looking at past solar activity. Seems a bit too convenient.

Figure 15 looks quite like a hockey stick to me. Where did the MWP go?
Of course they clearly did not use Leif’s reconstruction (the GSN curve).
And the other thing that comes to mind is how reliable the 10Be proxy might be after the era of atomic bomb tests.

My view is that we should be cautious about whether 10Be and 14C are accurate indicators of solar activity amplitudes over the longer term (500 years or more). However, the periodicity does seem to be a reliable indicator according to the observational record. It fits well with the hindcast created using our solar-planetary theory too.

Proponents of the dynamo only theory of solar activity changes don’t seem to have come up with any model which can be compared to these proxy records.

Most interesting – I shall read this one over the weekend, as 88 pages is a little much for reading over breakfast. It’s a pity we’ve only had radio for a century or so; as a radio enthusiast I’d love to know more about what our ionosphere does under different solar conditions. If the last century has been untypically active, the next few should be interestingly different (and, one suspects, quieter, apart from all the human pollution of the EM spectrum).

I was under the impression that C14 has been heavily affected by the nuclear age, because it is difficult to know how much of the C14 is from cosmic rays and how much from man-made sources of radiation. So, before even reading, I searched for the section on nuclear testing and how this error was accommodated – e.g., it may have said “man-made levels are negligible, particularly now that nuclear testing has stopped”.

I’d like to see what the graph looks like when extended backwards through the Younger Dryas ie back further than the Holocene. That might shed light on the hockey-stickish look to the graph.

IPCC use three hockey stick curves on page 3 of their Summary for Policymakers, all taken from ice core records – CO2, methane and nitrous oxide. But I strongly suspect these have not been adequately corrected for effects of slow compression of the firn over recent centuries and rapid decompression of the ice core on extraction – just as Jaworowsky and Segalstad claim. So my immediate reaction is to suspect the HS-look – as by the same term, I suspect the ice hockey sticks thrust in my face in the IPCC SfP.

It would be nice to believe we’ve had exceptional solar activity. But recent warmth appears to have been less than MWP warmth, which in turn appears to have been less than Roman WP warmth, which was less than the one before that… according to these ice core records. Unfortunately I did not note the proxy or proxies used for that study – but at least there is no significant hockey stick when put in context.

I’m sorry, but this article is highly misleading and is leading people to make ill-informed comments. Without reading the following section of the paper about the quality of the reconstruction (not the quality of the work, I hasten to add, which looks good), it is impossible to make an informed comment about the importance of this maximum.

It is quite literally a hockey stick, one the author has clearly signalled, but one which, for reasons I do not understand, has not been flagged here.

3.2.4 The Suess effect and nuclear bomb tests
Unfortunately, cosmogenic 14C data cannot be easily used for the last century, primarily because of the extensive burning of fossil fuels. Since fossil fuels do not contain 14C, the produced CO2 dilutes the atmospheric 14CO2 concentration with respect to the pre-industrial epoch. Therefore, the measured Δ14C cannot be straightforwardly translated into the production rate 𝑄 after the late 19th century, and a special correction for fossil fuel burning is needed. This effect, known as the Suess effect (e.g., Suess, 1955), can be up to −25‰ in Δ14C in 1950 (Tans et al., 1979), which is an order of magnitude larger than the amplitude of the 11-year cycle of a few per mil. Moreover, while the cosmogenic production of 14C is roughly homogeneous over the globe and time, the use of fossil fuels is highly nonuniform (e.g., de Jong and Mook, 1982) both spatially (developed countries, in the northern hemisphere) and temporally (World Wars, Great Depression, industrialization, etc.). This makes it very difficult to perform an absolute normalization of the radiocarbon production to the direct measurements. Sophisticated numerical models (e.g., Sabine et al., 2004; Mikaloff Fletcher et al., 2006) aim to account for the Suess effect and make good progress. However, the results obtained indicate that the determination of the Suess effect does not yet reach the accuracy required for the precise modelling and reconstruction of the 14C production for the industrial epoch. As noted by Matsumoto et al. (2004), “. . . Not all is well with the current generation of ocean carbon cycle models. At the same time, this highlights the danger in simply using the available models to represent state-of-the-art modeling without considering the credibility of each model.” Note that the atmospheric concentration of another carbon isotope, 13C, is partly affected by land use, which has also been modified during the last century.
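The size of the Suess effect quoted in that paragraph (up to −25‰ in Δ14C by 1950) follows from simple mass balance: fossil CO2 carries no 14C, so a small fossil fraction in the carbon pool depresses Δ14C by roughly that fraction. A back-of-envelope sketch; the 2.5% fossil fraction is an illustrative assumption, not a calibration:

```python
# Back-of-envelope sketch of the Suess effect: fossil-fuel CO2 contains
# no 14C, so mixing a fraction f of fossil carbon into the atmospheric
# pool dilutes the 14C/12C ratio and lowers Delta-14C.
def suess_dilution(delta14c_permil, fossil_fraction):
    """Delta-14C (per mil) after mixing in 14C-free carbon.

    fossil_fraction is the fossil share of the resulting carbon pool.
    """
    ratio = 1.0 + delta14c_permil / 1000.0   # normalized 14C/12C ratio
    diluted = ratio * (1.0 - fossil_fraction)
    return (diluted - 1.0) * 1000.0

# An assumed ~2.5% fossil share of the pool gives a depletion of about
# -25 per mil, the order of magnitude Tans et al. (1979) report for 1950
print(round(suess_dilution(0.0, 0.025), 1))
```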

Another anthropogenic activity greatly disturbing the natural variability of 14C is related to the atmospheric nuclear bomb tests actively performed in the 1960s. For example, the radiocarbon concentration nearly doubled in the early 1960s in the northern hemisphere after nuclear tests performed by the USSR and the USA in 1961 (Damon et al., 1978). On one hand, such sources of momentary spot injections of radioactive tracers (including 14C) provide a good opportunity to verify and calibrate the exchange parameters for different carbon-cycle reservoirs and circulation models (e.g., Bard et al., 1987; Sweeney et al., 2007). Thus, the present-day carbon cycle is more or less known. On the other hand, the extensive additional production of isotopes during nuclear tests makes it hardly possible to use the 14C as a proxy for solar activity after the 1950s (Joos, 1994).
These anthropogenic effects do not allow one to make a straightforward link between preindustrial data and direct experiments performed during more recent decades. Therefore, the question of the absolute normalization of the 14C model is still open (see, e.g., the discussion in Solanki et al., 2004, 2005; Muscheler et al., 2005).
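The bomb-test spike the excerpt describes (atmospheric 14C nearly doubling in the early 1960s) relaxed back toward equilibrium as the excess was taken up by the ocean and biosphere, which is why it is useful for calibrating carbon-cycle exchange. A minimal one-box sketch; the ~16-year e-folding time and 1000‰ initial excess are assumed, illustrative values:

```python
# Minimal one-box sketch of the bomb-14C "spike" relaxing back toward
# equilibrium through exchange with the ocean and biosphere -- the kind
# of tracer experiment the excerpt says is used to calibrate carbon-cycle
# models. The parameters are illustrative assumptions.
import math

def bomb_spike(t_years, initial_excess=1000.0, tau=16.0):
    """Excess Delta-14C (per mil) t years after the bomb-test peak."""
    return initial_excess * math.exp(-t_years / tau)

# The excess falls to 1/e of its peak after one e-folding time (tau),
# and halves in about tau * ln(2) years
print(round(bomb_spike(16.0)))
```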

Sorry, in my rush to get this key information to you I said: “It is quite literally a hockey stick”. What I meant is that it’s gluing two very different sets of data together – I didn’t mean that it’s a false upswing.

Skeptic’s view:
The cosmic ray count is impacted by the Earth’s magnetic field oscillations to an extent which could be equal to or greater (by an order of magnitude) than that of the changes in the heliospheric magnetic field. Scientists (NASA-JPL) are only now trying to understand the impact of Earth’s magnetic field changes.
At this point in time it is difficult to resolve differences which can be attributed to either of the two. Quoting Dr. Jean Dickey of NASA’s Jet Propulsion Laboratory, Pasadena: “One possibility is the movements of Earth’s core (where Earth’s magnetic field originates) might disturb Earth’s magnetic shielding of charged-particle (i.e., cosmic ray) fluxes.”
My small but pioneering effort in that direction is shown here: http://www.vukcevic.talktalk.net/TMC.htm

Sorry my comments are in pieces. I clearly need to expand on what I’m saying:

If we are trying to assess cosmic ray flux by measuring a proxy – the amount of various isotopes – then, given the potential effect of atomic testing, we have to rule out the possibility that some of the isotopes stem from man-made sources.

For carbon, two such sources are suggested in the paper: the effect of nuclear testing, and the burning of coal, which has not been exposed to recent cosmic rays and so dilutes carbon-14 in the atmosphere.

However, potentially similar problems exist with all isotopes. I do not personally know how much beryllium or other elements could have been affected by nuclear testing, whether it may contaminate the results, or whether mining and extraction may also have significantly affected the result. But even if the effect is negligible, in a paper dealing with a current maximum that coincides with problematic human activity, that effect should have been discussed and then, and only then, discounted.

Reading the paper I find: “The global production rate of 10Be is about 0.02 – 0.03 atoms cm–2 s–1 (Masarik and Beer, 1999; Webber et al., 2007; Kovaltsov and Usoskin, 2010), which is lower than that for 14C by two orders of magnitude (about 2 atoms cm–2 s–1; see Section 3.2.2).”
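The “two orders of magnitude” in that quote is easy to check from the quoted numbers themselves:

```python
# Quick arithmetic check of the production rates quoted from the paper:
# 10Be at ~0.02-0.03 atoms/cm^2/s versus 14C at ~2 atoms/cm^2/s is
# indeed roughly two orders of magnitude lower.
import math

q_be10 = 0.025   # midpoint of the quoted 10Be range, atoms cm^-2 s^-1
q_c14 = 2.0      # quoted 14C rate, atoms cm^-2 s^-1

orders = math.log10(q_c14 / q_be10)
print(f"14C/10Be production ratio: {q_c14 / q_be10:.0f}x "
      f"(~{orders:.1f} orders of magnitude)")
```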

So, we are talking about very low levels which may be more easily contaminated. Searching for nuclear testing and Beryllium quickly revealed that Beryllium was used in weaponry:

“Once the critical mass is assembled, at maximum density, a burst of neutrons is supplied to start as many chain reactions as possible. Early weapons used an “urchin” inside the pit containing polonium-210 and beryllium separated by a thin barrier.”

Without reading the lot, there is no way I can say that, in the many papers cited about beryllium, no author has dealt with this question thoroughly, but it seems to me that if you are suggesting we are at a solar maximum, you have to demonstrate that feasible alternative explanations can be discounted. This author has not done this in this paper. It may be “common knowledge” in his subject that beryllium and any other proxies you care to use are not affected by modern human influence, but we have seen where “common knowledge” about the “problems” of warming has got us.

Judging by the first graph, it looks like we’re repeating 9000 BC. It also looks like we’re about to drop down to a low number again, just as happened after that peak.

Aside from the obvious Svensmark weather connection, there are also some pretty good connections to human civilization. Organized agriculture developed after that 9000 BC peak went away, and in recent times the highest peaks correspond to times of widespread war and revolution. Clearly humans are more stable when we get more rain. Might be a more direct connection too… Major spots mess with electrical activity, and our nervous systems have a lot of electrical activity.

Re: Skeptic’s view
Speculative?
As already posted: the cosmic ray count is impacted by the Earth’s magnetic field oscillations to an extent which could be equal to or greater (by an order of magnitude) than that of the changes in the heliospheric magnetic field.
Here I show that the 400-year bi-decadal changes in the Earth’s magnetic field (GMF), an order of magnitude larger, closely mirror the solar magnetic field’s trend: http://www.vukcevic.talktalk.net/GSNvsGMF.htm
There is no database of sufficiently high resolution for the period before 1600.
The conclusion must be that the cosmic ray count is a reflection of change in the GMF rather than of the GSN variability.
Dr. Jean Dickey of NASA’s Jet Propulsion Laboratory, Pasadena: “One possibility is the movements of Earth’s core (where Earth’s magnetic field originates) might disturb Earth’s magnetic shielding of charged-particle (i.e., cosmic ray) fluxes.”

The graphs depicted do tend to line up with historic records and many other proxy studies of hot and cold periods from prehistory. Thus it is not hard to believe that the sun does warm the earth and that its bad behaviour can have some influence on our climate.

Re-thinking the magic qualities of CO2 may be appropriate for erstwhile climatologists at this time in our history. Failure to do so will make them seem less than scientific in their chosen field.

Did anyone ever assess the sunspot count? Can we be sure enough that the number of sunspots counted in, say, 1700 was accurate, or is it possible that, due to lower resolution, they were counting lower numbers?

The authors of this paper are dills. They have arbitrarily used a constant sunspot number (based upon Be10 and C14 measurements) to define Grand Minima and Maxima. This assumes that it is not possible to have a Grand Minimum during a period of above average solar activity or a Grand Maximum during a period of below average solar activity.

In other words, they assume that short-term variations in solar activity (~centuries) have to be linked to long-term variations in solar activity (~millennial). The two phenomena may not be directly related.

I think Leif S. has made an exception when it comes to the method of measuring Be10, and didn’t sign off on Shapiro’s reconstruction on a similar basis/method. Comparing uncertainties and error bars on the different reconstruction methods is essential, and just counting sunspots is very imprecise and “buckshot”-like. The human impact on C14 is known but hard to quantify. Be10 seems to be much less influenced, contrary to what “warmists” try to argue. But this paper is very interesting and of course not popular in mainstream CAGW.

DirkH says:
September 14, 2012 at 4:14 am
Re fig 15: Probably Leif will tell us that modern sunspot counts have to be reduced by about 20%. That would, IMHO, make the graph more plausible, as we would be closer now to MWP levels in that case.
Not only that, but the group sunspot number is just plainly wrong. Progress has happened since 2010. Here is the current status [btw Usoskin is a member of our team too]: http://www.leif.org/research/Reconstruction%20of%20Sunspot%20Number.pdf

14 Sept: UK Telegraph: Nick Collins: Met Office better placed to predict ‘big freezes’
Predicting a “mild winter” before the harshest conditions in 30 years was hardly the Met Office’s proudest moment.
While the snow, ice and temperatures of -22C were impossible to anticipate at the time, an upgraded forecasting model could have predicted what was coming, experts said.
In 2009-10, Britain was hit by two bursts of Arctic conditions which resulted in an average winter temperature of 1.5C (35F), well below the 30-year average of 3.7C (39F).
The failure of forecasters to foresee the icy conditions, months after their “barbecue summer” proved to be a washout, was widely interpreted as the main reason for the agency’s decision to stop releasing long-range forecasts…
A new study by the agency found that the unexpected nature of the “deep freeze” lay in the inability of its seasonal forecasting equipment to simulate phenomena known as sudden stratospheric warmings (SSWs)…
However, they added that the technology would not lead to perfect predictions. While cold winters caused by the wind changes were now more predictable, other combinations of conditions could still cause freak cold spells.
In the new study, published in the Environmental Research Letters journal, the team compares forecasts made ahead of the “deep freeze” against a new, retrospective prediction based on the same data but using the new technology… http://www.telegraph.co.uk/science/science-news/9541307/Met-Office-better-placed-to-predict-big-freezes.html

I appreciate all the implications of your longer explanation. However,

“Without reading the lot, there is no way I can say that, in the many papers cited about beryllium, no author has dealt with this question thoroughly, but it seems to me that if you are suggesting we are at a solar maximum, you have to demonstrate that feasible alternative explanations can be discounted. This author has not done this in this paper.”

The contamination possibility is real; the numbers are indeed very low. My reading of the paper suggests that the claim to be in a solar grand maximum is based on the sunspot count, not the 10Be level. If the 10Be supports the sunspot count, then this is one indication that the contamination level is low. It may be that from 1945 onwards there is an issue with using 10Be, but we have excellent sunspot counts instead, which is in any case a direct measure. I admit I have not read of any use of 10Be collected in the past 100 years or so. If it has been done, any contamination would easily be detectable, because the dates of nuclear explosions are well known, and they would have to correlate exactly with other isotopes from the tests.

10Be as an indirect proxy has certainly been looked at for ages and seems to give a pretty reliable correlation (inverse relationship) to temperature, not so? One of the interesting graphs that convinced me the ‘problem’ was solar and not CO2 was a 520m-yr plot of temperature, 10Be and CO2. There is an obvious inverse correlation between 10Be and temperature, with CO2 wandering all over the place, showing no obvious correlation to either of the other two.

I don’t say that 1/10Be : Temp is a no-brainer, but it is a heck of a lot better than CO2 : Temp, in spite of what is claimed for ice cores (delay or no delay). The shorter-term charts (10k yr) seem to indicate a good correlation of Temp : CO2, with a delay for the CO2, but looking closely at the ice core data there are clear, repeated inverse-relationship moments which require an explanation. Have a look at the ice core data at the temperature inflection points. Definitely not in keeping with the standard GHG explanation.
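The “correlation with a delay” reasoning in this comment can be made concrete with a small lagged-correlation check. A sketch with synthetic series; the data and the 3-sample lag are made up for illustration, not real proxy records:

```python
# Sketch: find the lag that maximizes the Pearson correlation between
# two series -- the kind of informal check this comment applies to
# 10Be, temperature, and CO2. Data here are synthetic, not real proxies.
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def best_lag(x, y, max_lag):
    """Lag of y relative to x (in samples) that maximizes |r|."""
    scores = {lag: pearson(x[:len(x) - lag], y[lag:])
              for lag in range(max_lag + 1)}
    return max(scores, key=lambda lag: abs(scores[lag])), scores

# Synthetic example: y is x delayed by 3 samples
x = [float(i % 10) for i in range(50)]
y = [0.0] * 3 + x[:-3]
lag, scores = best_lag(x, y, 6)
print(lag)  # the 3-sample delay is recovered
```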

Reading the comments (I have not read the ‘meat’ of the paper – just what is posted here), many people have reservations about the values given for recent years. This seems to be a problem stemming from the title of the post – which refers to the last few years as being a grand maximum – which is focussing attention on the last millennium.

Given that Mike Haseler has noted that the paper needed to use a different calculation for recent years (thus his reference to a hockey stick – which is a nice one), it does seem that the author is aware of the problems, and his own title doesn’t refer to a grand maximum in the last century.

Since the paper was actually published in 2010, I suspect that it has been widely reviewed and critiqued by the solar activity people and – almost certainly – Leif. Has anyone had a chance to dig around for these reviews?

Mike Haseler says:
September 14, 2012 at 1:45 am
I was under the impression that C14 has been heavily affected by the nuclear age
============
Funny, when I was a kid we were told that the climate was being heavily affected by nuclear testing.

After the nuclear testing stopped, scientists had to find another excuse to explain their failed predictions. The one explanation they never seem to consider is that just maybe the fault doesn’t lie with other people, that it is the scientists themselves that do not understand climate.

I don’t see how the discussion can so quickly turn to Earth’s temperature. Its own variability, short-term and long-term, produces such fluctuating noise that any solar effects will be buried in it. I prefer to stick to the paper at hand in our discussion. Confounding it with Earth’s temperature just messes up the flow.

I am surprised to see this old paper as a new topic on WUWT, but it has some useful data. The authors use the INTCAL98 data that I also used in my paper as a reasonably good record of the repeating nature of grand minima, but they make two rather large mistakes. The record after 1950 is very suspect and their blue line depicting grand minima is way too low.

REPLY: I missed the paper the first time around, so it is “news” to me. – Anthony

Thanks for the link. The paper addresses the issue you raise before about wind-blown 10Be and puts a tentative figure on it (30%).

“…they do not indicate unusually high…” Well, yeah, but they also indicate ‘pretty high’ values, you could even say, ‘equal to the highest’ values present in the record. Looking at Figure 1 it is pretty clear your definition of ‘unusually high’ must mean ‘pretty goldarned high’ because the 10Be concentration drops a lot.

With reference to the paper above, there is certainly support in doi:10.1029/2009GL038004, 2009 for the idea that solar activity has been ‘higher lately’ than before (say 120 years before), and I concluded that the warming observed from 1975 – 1998 has been almost entirely caused by this. Given the quite separate demonstrations that there are significant negative temperature feedback mechanisms in the atmosphere, the warming during that recent period is unlikely to have been caused principally by fossil-fuel-sourced anthropogenic emissions of CO2 and land-use changes (p < 0.05). Until the temperatures are corrected for solar activity, the AG component will remain undetectable. As the IPCC denies any meaningful solar component, don’t expect much from AR5.

PS Where did I get that p value? I made it up! It is my opinion. After all, this is climate! Dr Bill Mollison told me that 87% of statistics were made up and I still believe him. :~)

The 2008/2010 paper is written as a survey paper, and it provides a lot of background on the isotope proxies, the process of deposition, and the factors that have to be accounted for in order to use them accurately; but, at least as far as Usoskin’s own work goes, this background does not affect his results – it just gives a fuller picture of how they were arrived at.

In particular, both papers list the modern grand maximum as 80 years in duration, centered on 1960, and neither paper makes any ridiculous claims that late 20th century warming couldn’t have been caused by the extraordinarily high level of solar activity because solar activity did not KEEP going up (something Usoskin and his co-author Solanki have both done elsewhere, like at the end of the abstract of this 2005 paper).

Solanki et al., 2004. “Although the rarity of the current episode of high average sunspot numbers may indicate that the Sun has contributed to the unusual climate change during the twentieth century, we point out that solar variability is unlikely to have been the dominant cause of the strong warming during the past three decades.”

Thanks to Sparks for citing yet another paper where Usoskin and Solanki claim that it is the rate of change in solar activity that causes warming, not the level of solar activity. You know, like if you want to heat a pot of water on a stove you have to turn the flame up sloooooowly. If you just set the flame on maximum and leave it there, the pot won’t heat. Everybody knows that.

To be specific, the 2004 paper that Sparks cites for the Solanki–Usoskin conclusion that the sun can’t have caused recent warming cites in turn, for this conclusion, a 2004 paper by Solanki and Krivova. Here is an excerpt from the Max Planck Institute’s summary of the Solanki–Krivova findings:

However, it is also clear that since about 1980, while the total solar radiation, its ultraviolet component, and the cosmic ray intensity all exhibit the 11-year solar periodicity, there has otherwise been no significant increase in their values. In contrast, the Earth has warmed up considerably within this time period. This means that the Sun is not the cause of the present global warming.

The sun was at maximum levels, but those levels were not continuing to rise, hence they could not have caused warming. “Post-normal science” at work.

“In contrast, the Earth has warmed up considerably within this time period. This means that the Sun is not the cause of the present global warming.”

That was of course before they took cloud cover into consideration, and it still approaches solar influence as a matter of varying wattage, not varying insulation or shade. The NOAA conclusion is Fred Flintstone science.

Jim G says:
September 14, 2012 at 9:45 am
So, Leif, what is your overall evaluation of this paper, conclusions, etc., if you have developed such at this point?
The high values of the past hundred years are caused by the use of the Group Sunspot Number. Just about everybody working on long-term solar activity is participating in two workshops I am leading [with a number of co-conveners]: the sunspot workshops http://ssnworkshop.wikia.com/wiki/Home and the solar activity workshop http://www.leif.org/research/Svalgaard_ISSI_Proposal_Base.pdf
The workshops are ongoing. A preliminary summary/discussion is supplied by Hugh Hudson: http://www.leif.org/research/SSN/Hudson.pdf
One of the conclusions we are coming to is to “Reject the Group Sunspot Number approach”

RE: One of the conclusions we are coming to is to “Reject the Group Sunspot Number approach”
Will the sunspot number be calculated by individual sunspots by satellite?

The reason for using that formula in the past was to deal with observing conditions that are less than ideal, where small spots are hard to see from ground-based observatories.
Will small spots that were hard to see from ground-based observatories be rejected from satellite data, and/or be added to the past via a new formula?
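For readers unfamiliar with the formula being discussed: the classic Wolf (relative) sunspot number is R = k(10g + s), where g is the number of sunspot groups, s the number of individual spots, and k an observer-dependent correction factor. The 10× weight on groups is precisely the concession to poor observing conditions mentioned above: a group still registers even when its small spots cannot all be resolved. A minimal illustrative sketch (mine, not from the workshops):

```python
def wolf_number(groups: int, spots: int, k: float = 1.0) -> float:
    """Wolf relative sunspot number R = k(10g + s).

    groups: number of sunspot groups visible
    spots:  total number of individual spots counted
    k:      observer/instrument correction factor (1.0 for the
            reference observer)

    The 10x weight on groups makes the index robust when small spots
    are hard to see: a group still contributes 10 even if its
    individual spots cannot all be counted.
    """
    return k * (10 * groups + spots)

# e.g. 3 groups containing 14 spots in total, reference observer:
# R = 1.0 * (10*3 + 14) = 44
```

An observer with a correction factor of, say, k = 0.6 reporting the same 3 groups and 14 spots would be scaled to R ≈ 26.4, which is the kind of observer-to-observer harmonization the thread goes on to discuss.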

Thanks. Your links are, in themselves, a very interesting read. Apparently a “work in progress” at this time. I see that, as per our last discussion, we go back to the adjustment of the older “raw” SSNs using the “group” number.

I would take this paper with a large grain of salt. Note that the graphs show quite low solar activity corresponding to the Medieval Warm Period; so how do we explain that? Similarly for the Roman Warm Period.

Be-10 is generally produced by cosmic-ray bombardment of N-14 or O-16. It is also the result of bombardment of C-13 by lower-energy neutrons during H-bomb tests. The important point about the tests is that open-air tests took place from about 1950 to 1980, an extremely limited temporal window. Unlike (presumably) C-14, Be-10 is rapidly scavenged from the atmosphere by rainfall. If you have followed the climate free-for-all, there is some question about the residence time of CO2 in the atmosphere.
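One quantitative point worth adding as general background (not from the paper under discussion): 10Be has a half-life of roughly 1.39 million years, so over the Holocene timescales discussed here radioactive decay of the archived isotope is negligible; what varies in the ice-core record is production and deposition, not decay. A quick check of that claim:

```python
import math

# Commonly cited 10Be half-life, ~1.39 million years
BE10_HALF_LIFE_YR = 1.39e6

def decay_fraction(age_yr: float,
                   half_life_yr: float = BE10_HALF_LIFE_YR) -> float:
    """Fraction of an initial 10Be deposit still present after age_yr
    years, from the standard decay law N/N0 = exp(-ln2 * t / t_half)."""
    return math.exp(-math.log(2) * age_yr / half_life_yr)

# Over the whole Holocene (~11,000 yr) well under 1% of the 10Be
# has decayed away:
# decay_fraction(11_000) ≈ 0.9945
```

This is why the ice-core 10Be signal can be read directly as a production (i.e., solar-modulation) record over centuries to millennia, with no significant decay correction needed.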

Alec Rawls says:
September 14, 2012 at 9:23 am
If you just set the flame on maximum and leave it there, the pot won’t heat. Everybody knows that.
I don’t know that. When I start the pot in the morning on maximum in order to get hot water for my tea and to boil my eggs, it works great for me. I get hot tea and boiled eggs in minimum time. If I turn down the heat, it takes longer…

Sparks says:
September 14, 2012 at 1:01 pm
Will small spots that were hard to see from ground based observatories be rejected from satellite data and/or be added to the past as a new formula?
The issue is not the difference between ground-based and satellite data [the solar-observing satellites have rather small telescopes, a weight issue and a heating issue]. There are strong indications that we are losing the small spots, so that the sunspot number may soon become a poor measure of solar activity. One of our goals is to decide how [and what] to count in the future.

Jim G says:
September 14, 2012 at 1:42 pm
I see that, as per our last discussion, we go back to the adjustment of the older “raw” SSN’s using the “group” number.
Actually, not. We instead abandon the older group numbers. The calibration [‘adjustment’ as you call it, which is bad terminology] is against a well-observed, objective, and well-understood independent measurement: the daily variation of the geomagnetic field [discovered in 1722].

Thanks Leif, that cleared up one or two thoughts I had. You’re constructively thinking ahead; I like that! When this weak cycle comes to an end, every possible observation technique and experiment should be made available and implemented for this event.

Jim G says:
September 14, 2012 at 1:42 pm
I see that, as per our last discussion, we go back to the adjustment of the older “raw” SSN’s using the “group” number.
Actually, not. We instead abandon the older group numbers. The calibration [‘adjustment’ as you call it, which is bad terminology] is against a well-observed, objective, and well-understood independent measurement: the daily variation of the geomagnetic field [discovered in 1722].

This comment reveals the core of how post-modern politicised science works in the 21st century.

All data, of any kind in any field, is reassessed through a political filter. In politicised subjects such as climate, radiation carcinogenesis, etc., an opinion is established (we have the Soviet Union to thank for establishing this mind-set and practice, now universal and dominant among the European and American scientific elites).

Then data is assessed by the filter of the established opinion and result. If it is found to be at variance, the first question is how to direct personal attacks against the scientist in question. In parallel with this, strategies and narratives are created to discredit the data.

This is why every single time data is found that identifies climate drivers other than the inquisition-orthodox CO2, huge creative energy is expended – not to mention taxpayers’ money – to discredit (or give the appearance of discrediting) the heretical data.

It is imperative that climate skeptics and any who are concerned with scientific truth make as many copies as possible of the important data in every scientific field, especially climate, and keep these copies secure for the future, since the nights of public book-burning are not very far away.

I believe in carbon dioxide, maker of heaven and earth
and in greenhouse radiative warming, his only begotten son
who was conceived of the Holy Arrhenius, and born of Charles David Keeling
suffered under the axis of evil oil producing companies,
he descended to the madlands of skeptic hell.
on the third day he rose victorious therefrom
he ascended into heaven and sitteth at the right hand of Michael Mann
From thence he shall come to judge the heretic and the climate-deviant,
I believe in climate computer models,
the holy catholic team, the communion of software fudgers and decline-hiders,
the forgiveness of deceit, concealment, victimisation, murder, whitewashing, obscuring and deleting factual data in the name of CO2,
the resurrection of the dark ages,
and in uniform unchanging climate and CO2 concentration in all times past,
280 ppm without end
Amen.

phlogiston says:
September 15, 2012 at 3:12 pm
This comment reveals the core of how post-modern politicised science works in the 21st century. …
Then data is assessed by the filter of the established opinion and result.
You could not be more wrong. The re-assessment of the solar activity record is done against the established opinion and overthrows old dogma and is not politicized in any way, quite the contrary.

“The re-assessment of the solar activity record is done against the established opinion and overthrows old dogma and is not politicized in any way, quite the contrary.”

I am involved in a long, drawn-out re-establishment, I could say, of a certain field of science in which there are literally thousands of lab tests performed according to a certain protocol, which give a certain set of metrics and claimed results. That is, the method of analysis yields particular metrics. When I pointed out that there were numerous mathematical errors in the calculations and errors of concept in the whole experiment, people first ignored, then debated, then accepted that the issues were real, then drafted a new protocol. I thought, “Great. We will have a lab test that predicts field performance.”

The new protocol looks at pretty much the same data in pretty much the same way but makes a few fewer errors. When I challenged this result as pretty pointless, because the tests still give nearly no useful information, I was informed that the new data has to be reported in a way that the old data can still be ‘converted’ to the new methods. This was the basis of their new science.

I pointed out that there was no point (at all) in translating old data that was based on, or contained, as many as 33 errors, so the ‘conversion’ from garbage to part-garbage is meaningless. Well, the group pretty much ignored the observation. They have to have something to be able to make use of all that old data, apparently. And people are used to the old test method.

It seems to me you are in the same position. Just because it is being done openly there is still the obvious need to have all the old sunspot data ‘converted’ to some or other new system. Perhaps your result will be much better than the junk I have to deal with, but there is a lesson here: Make a no-holds-barred assessment of what constitutes meaningful metrics for solar activity and magnetohydrodynamics or whatever it is you need to quantify, and make no nod to the past. After working out how and what has to be done, then start to see if anything can be salvaged from old geezers looking through 4x telescopes at the sun centuries ago. If it is all but useless, tough. At some point the old junk has to go and if it turns out someone clever can make sense of it later by some as-yet-undetermined proxy, fine. But the risks posed by yet another set of partly effective methods are great. The politics is then reduced to managing the self-promoters or trouble makers.

Crispin in Waterloo says:
September 16, 2012 at 6:43 pm
The politics is then reduced to managing the self-promoters or trouble makers.
Good luck with your quest.
Luckily, everybody in the sunspot community is on board for this re-assessment and we are making good progress. There are only a few trouble makers. The biggest problem is our ‘users’, the people who use the sunspot numbers and solar activity indices. They do not want any improvements if these upset their pet theories and correlations. They talk about ‘ironing boards’ and other assorted nonsense. Our solution to that is for all sunspot-counters and index-producers to stand together; then after a while the recalcitrant users will begin to look silly [some are already in that boat, to wit some of the hand-wringers on this very blog] and quietly be converted. We can also use the ‘name and shame’ mechanism, which can be quite effective.

I just read the PPT and it is still pretty bare without the discussion. I will try to keep up.

I have found that the idea of embarrassing people into sensible positions is pretty ineffective, and it has great dangers attached: if you get one thing wrong and they get that one thing right, you will never live down the hue and cry that you are covering up or have another agenda. Look at all the pigeons coming home to roost for the CAGW crowd. They have been trying to use personal attacks as a consensus-management tool for years. The science moves ahead faster than the evolution of what are ostensibly unity-building maneuvers.

It is not all that different from the situation I face, actually. The raw data is mostly agreed (how to get it), but after that, there is chaos. If someone manages to notice something based on a strange interpretation of the data, fine; maybe it leads to something, maybe not. I am open-minded. But 6 x 8 must still = 48, if you get my drift.

I appreciate the call for using less processed data and encouraging people to use the raw numbers. There is so much lost in the repeated conversions and smoothing. I use a data analysis method that requires all the raw data to be included on the page and all editing to be indicated with a colour change (we use large spreadsheets). I have no qualms about smoothing, but the raw data and the smoothed data must both be available to anyone looking at them, so they can try their own (prejudiced) hand if they wish. As is so clear in ‘climate science’, hiding how the result was obtained is a huge problem for people trying to think, instead of ‘accept’. Plus, they keep getting caught cheating.

I hope this warming catastrophist nonsense will soon be over; it is more wasteful of resources than a major war. But I fear they will just lapse into the global cooling scare that was all the rage when I was young. Some things never change, I guess. Well, sunspots do…

Crispin in Waterloo says:
September 16, 2012 at 7:56 pm
if you get one thing wrong and they get that one right
We shall work so carefully that there will not be one thing wrong. This is not rocket science.

u.k.(us) says:
September 16, 2012 at 8:08 pm
“We can also use the ‘name and shame’ mechanism which can be quite effective.”
Sure you want to go here ?
Yes. It is true that you catch more flies with honey than with vinegar, but when you have caught one, you squash it. As Hugh Hudson puts it [slide 10 of http://www.leif.org/research/SSN/Hudson.pdf ]:
This group needs to take charge of the perception of SSN:
– Consensus
– Public databases and ample publications
– Propaganda that discredits any research not using the consensus SSN

Leif Svalgaard says:
September 14, 2012 at 11:58 am
“One of the conclusions we are coming to is to “Reject the Group Sunspot Number approach”

Good for you guys! I know little of this area, but as an auditor I had to shake my head at how those were counted and have thought I sure would like to hear the explanation for that approach! One has to appreciate consistency, but it would be nice to be able to unravel that code sometime in the future.

Counting spots, especially with the group number, seems a rather crude measure, and I would assume that in modern times measurements of the geomagnetic field are much more objective. I see your statement that the geomagnetic field variation was discovered in 1722. How confident are you in proper measuring of it through time?

Bill Hunter says:
September 16, 2012 at 10:07 pm
I see your statement the geomagnetic field variation was discovered in 1722. How confident are you in proper measuring of it through time?
From the 1740s it was precise enough for this purpose. An angle of about 10 arc minutes is measured and that was well within the capabilities of that time.

Leif Svalgaard says:
September 16, 2012 at 8:44 pm
u.k.(us) says:
September 16, 2012 at 8:08 pm
“We can also use the ‘name and shame’ mechanism which can be quite effective.”
Sure you want to go here ?
Yes. It is true that you catch more flies with honey than with vinegar, but when you have caught one, you squash it. As Hugh Hudson puts it [slide 10 of http://www.leif.org/research/SSN/Hudson.pdf ]:
This group needs to take charge of the perception of SSN:
– Consensus
– Public databases and ample publications
– Propaganda that discredits any research not using the consensus SSN
Severe [but necessary] gate keeping if you will.

Hey Leif, these lines, especially those at the bottom, make my BS-meter beep.
What confuses me is: if there is no politicisation, why is there need of such consensus and push and propaganda? Severe gate-keeping?
I can understand you have a new version of interpreting the old sunspot counting. OK, this is the new “consensus” sunspot graphic. We know what pressure to agree in a group is like. Group decisions are not always best science.
We can no longer count the way people 100, 200 years ago did? If the new method is better it will be validated with time anyhow; why then the propaganda, the push, and the gate-keeping?
What are these for? For the pure love of science? What are you afraid of?

Lars P. says:
September 17, 2012 at 12:03 pm
Hey Leif, these lines, especially those at the bottom, make my BS-meter beep.
Perhaps that BS-meter is badly in need of adjustment too? :-)

What confuses me is, if there is no politicisation why is there need of such consensus and push and propaganda?
Because there are several data series out there which are different enough that it makes a difference which one you use. People then cherry-pick the one that fits their pet theory the best. This is poor science [or not science at all]. It behooves the folks responsible for producing the sunspot series to get their house in order and agree [if possible] on what is the ‘best’ according to the data we have. This is simply what we are trying to do.

Group decisions are not always best science.
In this case it will be as all data, methods, calibrations, etc will be thoroughly vetted by all members of the group.

We can no longer count the way people 100, 200 years ago did?
Of course we can. We even have the original instruments used by Wolf and successors, and they are being used to this day.

What are you afraid of?
See above. A correct measure of the long-term variation of solar activity has many practical uses, e.g. in the prediction of the next cycle, as a benchmark for theories to match, etc.

Leif, thanks for the answer, but no adjustments to my BS-meter (thanks for the good laugh!). It is tested and works OK 97% of the time (75 out of 77 cases). But it has not been wrong in a long while, so maybe …

“Because there are several data series out there which are different enough that it makes a difference which one you use.”
Well, it is either an imprecision of measurements, and so within the error range, or a wrong measurement, or it is interpretation of the data. If it is interpretation of the data we are talking about, then it is a different story.

In this case it will be as all data, methods, calibrations, etc will be thoroughly vetted by all members of the group.
How does it work? If one person says 1% more here (because of …) and another says 2% less (due to …), and they do not agree, is the result the average, the weighted average, or do they discuss until one gives up? If one person does not agree, is the error range increased, or is he simply ignored?
But I have the impression that the results are already known, from what was already shown in the thread. So what is the group actually discussing?

Of course we can. We even have the original instruments used by Wolf and successors, and they are being used to this day.
If we can, then why do we need any adjustments? Why don’t we keep the same methods and measurements to keep the validity of the data? Pardon my ignorance – is this what you are trying to do?

See above. A correct measure of the long-term variation of solar activity has many practical uses, e.g. in the prediction of the next cycle, as a benchmark for theories to match, etc.
Yes.
This does not justify propaganda to discredit any other interpretation. If it’s a matter of interpretation and not precise measurement, it is a grey area where the other side may be right. I do not think in science there is any need of propaganda to discredit other research based on a different data interpretation.
To my understanding this is not helping the progress of science. Quite the contrary: propaganda was used to support the theory in power, delaying the time until it could be challenged. Fringe theories will either be discredited with time or, rarely, slowly gain traction and be adopted.

Lars P. says:
September 18, 2012 at 1:46 pm
If it is interpretation of the data we are talking about, then it is a different story.
The data are what they are; they cannot be changed and do not have errors [if person P says he saw S spots, that is what he saw]. The problem is that no single person has observed for the past 400 years, so we have to ‘harmonize’ the different observers by referring each one’s data to a chosen standard observer.

How does it work?
Basically, the data from one observer are plotted against the standard observer. Experience shows that such plots are linear, and it comes down to determining the slope of the line. There are standard ways of doing that, so no discussion or disagreement is needed or possible. We then have everybody sign off on the result. If there is dissent, the dissenter must show where everybody else is wrong.
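The slope determination Leif describes can be illustrated with a toy sketch (my own, with purely made-up counts): regress one observer’s daily counts against the standard observer’s on the days both observed, forcing the fit through the origin, and the slope is the calibration factor.

```python
def calibration_slope(standard, observer):
    """Least-squares slope through the origin: observer ≈ k * standard.

    A hypothetical sketch of the cross-calibration step: for paired
    daily counts, the origin-constrained least-squares slope is
    k = sum(x*y) / sum(x*x). The line is forced through zero because
    no spots for one observer should mean no spots for the other.
    """
    num = sum(x * y for x, y in zip(standard, observer))
    den = sum(x * x for x in standard)
    if den == 0:
        raise ValueError("standard observer reported no activity")
    return num / den

# Made-up example: observer B systematically sees ~80% of the
# standard observer's count on their common days.
standard = [10, 40, 25, 60, 0, 15]
observer = [8, 32, 20, 48, 0, 12]
k = calibration_slope(standard, observer)  # k = 0.8 for these counts
```

Dividing observer B’s counts by k (or multiplying by 1/k) then places them on the standard observer’s scale, which is the “harmonizing” step described above; chaining such overlaps backward through time is what extends the calibrated series.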

But I have the impression that the results are already known from what was shown already in the thread. So what is the group discussing actually?
Not everybody [in fact only a few] has gone through ALL the details for themselves. The workshop is the means to FORCE everybody to work through ALL the data.

If we can then why do we need any adjustments? Why don’t we keep the same methods and measurements to keep the validity of the data? Pardon my ignorance – is this what you are trying to do?
The difficulty is that there is no overlap between modern observers and the early ones. What we can do is to compare the modern data to the objective measure of the UV that is afforded by the geomagnetic measurements, and to teach everybody about the geomagnetic method [which is mostly voodoo to solar observers].

If it’s a matter of interpretation and not precise measurement, it is a grey area where the other side may be right. I do not think in science there is any need of propaganda to discredit other research based on a different data interpretation.
Interpretation is too weak a word for what we are doing. ‘Calibration’ is the better word. Calibration is an objective procedure and if people do not take the trouble to understand [and follow] the whole process, then their opinion should be discredited.

To my understanding this is not helping the progress of science. Quite the contrary: propaganda was used to support the theory in power, delaying the time until it could be challenged. Fringe theories will either be discredited with time or, rarely, slowly gain traction and be adopted.
I hope that your understanding is now improved. What we are doing will greatly help the progress of science. To wit, everyone in this business agrees that it will. The problem is not fringe theories or suppression of data, but educating our ‘users’, who, as long as there are several ‘semi-official’ versions, will cherry-pick the one [right or wrong, doesn’t matter to them] that supports their own pet theories. This is what we are working to avoid.

However, according to the IPCC, none of this has anything to do with the 0.7C of global warming since the end of the Little Ice Age in 1850.

I notice that the MWP period of sunspots is quite a bit lower than the modern period according to the chart there. So, if the IPCC is wrong, and the sun is the primary influence on climate, should we agree that the MWP was cooler than today? Or do we invoke some other forcing, stronger than solar, that made the MWP warmer?

According to the chart, 11,000 years ago solar activity was as high as in the latter part of the 20th century. But if the proxies for this come (mainly?) from the Northern Hemisphere, isn’t that more a record of orbital dynamics, which saw increased insolation over the Northern Hemisphere at that time?