The logic(?) of the IPCC’s attribution statement

How can the IPCC increase their confidence in anthropogenic global warming at the same time their model projections are diverging farther and farther from reality? – John Nielsen-Gammon

This question is asked in a blog post by John Nielsen-Gammon entitled Your logic escapes me. Interestingly, the logic that escapes him is the argument used in my recent Senate testimony. Excerpts:

Yet, we’re in the middle (or perhaps the end, or perhaps the beginning) of a hiatus in the rise of global temperatures. The evidence seems to be mounting that natural variability is more important than the IPCC reports had previously contemplated, yet the IPCC’s confidence in anthropogenic global warming grows stronger.

Now, I don’t know the IPCC’s reasoning process, to the extent that a complex organization can even be said to have a reasoning process. However, those who think that the IPCC is acting unscientifically in this particular matter are failing to recognize something particularly important: despite the growing divergence, the evidence supports increased confidence in the IPCC’s statement.

Nielsen-Gammon says he has no disagreement with these individual points from my testimony:

Lack of warming since 1998 and the growing discrepancies between observations and climate model projections

Evidence that sea level rise during 1920-1950 is of the same magnitude as in 1993-2012

Increasing Antarctic sea ice extent

Nielsen-Gammon further agrees with the following statements from my testimony:

“Multiple lines of evidence presented in the IPCC AR5 WG1 report suggest that the case for anthropogenic warming is weaker than the previous assessment AR4 in 2007. Anthropogenic global warming is a proposed theory whose basic mechanism is well understood, but whose magnitude is highly uncertain. The growing evidence that climate models are too sensitive to CO2 has implications for the attribution of late 20th century warming and projections of 21st century climate.”

“If the recent warming hiatus is caused by natural variability, then this raises the question as to what extent the warming between 1975 and 2000 can also be explained by natural climate variability.”

Nielsen-Gammon then goes into a relatively lengthy argument regarding the possible importance of natural internal variability, which I summarize briefly here.

If natural cycles are regular and repeatable, the net temperature change over one complete natural cycle will be approximately zero. The warming during part of the cycle is cancelled by cooling during the other part of the cycle. What’s left is the long-term rise caused by man.

Why does the IPCC conclude that the long-term rise is caused by man? The primary logic is simple, really. Of all the things driving long-term changes in the climate system, the biggest by far over the past 60 years is greenhouse gases. Second on the list is particle pollution, or aerosols, which partly counteract the greenhouse gases. Over the past 60 years, natural forcings (sun, volcanoes) have also had a cooling effect. So arguments over the relative importance of different kinds of forcing don’t really matter for explaining the past 60 years of temperature rise: the only large one on the positive side of the ledger is greenhouse gases.

Of course, it’s not enough to say that greenhouse gases point temperature in the right direction. The magnitudes have to match, also. Here, too, the hiatus increases confidence that there’s not some unknown but significant positive forcing agent other than greenhouse gases that’s driving temperature. The smaller the rate of warming, the smaller the possibility that a separate, additional cause of warming is being missed, and that, therefore, greenhouse gases account for most or all of the total amount of warming.

Curry is correct that the hiatus and increasing model-observation divergence is evidence that the models are too sensitive to forcing agents, including carbon dioxide. She is also correct that, the stronger and longer-lasting the divergence, the greater the evidence that much of the warming pre-2000 was natural. Likewise, the stronger the evidence that models might be overestimating future warming. Exploring those issues requires evaluation of all the available evidence, not just the last 60 years of temperatures, and will have to wait for another blog entry.

But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part. Thus the IPCC can and should consider it to be extremely likely that human influence dominates the net rise in temperature over the past 60 years.

If you’re a believer in strong natural variability, and you’re looking to criticize the IPCC, you might complain that they don’t reduce their estimates of climate sensitivity enough, or that they don’t adequately discuss the increased evidence of the importance of natural variability in affecting temperatures from decade to decade. But none of this, and none of the evidence presented by Curry or by anyone else I’ve seen, reduces the likelihood that the temperature change over the past 60 years or so was mostly anthropogenic.

JC comment: A quick comment on the above argument relative to the stadium wave. First, the stadium wave is one element of natural internal variability, focused specifically on multi-decadal time scales (it says nothing about natural internal variability on longer timescales). Even on decadal to multi-decadal timescales, there is additional natural variability not captured by the stadium wave, notably the NAO/AO (which may not be a purely internal mode).

Reasoning about climate uncertainty

Nielsen-Gammon and I agree on the first-order evidence that I presented in my testimony. How is it that we then disagree regarding the IPCC’s attribution statement for warming since 1950? This statement by N-G sums it up:

Now, I don’t know the IPCC’s reasoning process, to the extent that a complex organization can even be said to have a reasoning process.

The IAC review of the IPCC argued for transparency in the reasoning that went into the confidence assessment. I agree with N-G that the IPCC’s reasoning is not transparent.

So, back to the original question: how can different individuals or organizations look at the same primary evidence and draw different conclusions? Insight into this is given in my 2011 paper Reasoning about Climate Uncertainty, published in a special issue of Climatic Change, which can be summed up in one sentence:

How to reason about uncertainties in the complex climate system and its computer simulations is not simple or obvious.

JC’s reasoning about attribution

The logic behind my reasoning is this.

The way the IPCC’s attribution argument is laid out, there are two possible contributing causes to climate change since 1950: anthropogenic forcing, and natural variability. The sum of these two contributing causes is 100%; the chief issue of interest is the relative percentage contributions of anthropogenic forcing and natural variability.

The IPCC AR5 makes an extremely confident statement that ‘most’ of the warming is attributed to anthropogenic forcing, and I understand ‘most’ to cover a range of 51-95%. The IPCC implicitly recognizes that the attribution issue is uncertain by not giving a distribution of values or a ‘best estimate.’ Rather, they provide a bounded region that covers ~44% of the possible territory.

In the absence of a compelling theoretical reason for the lower bound of 51%, I infer that the IPCC does not regard values of the anthropogenic contribution below 60% as likely; otherwise it would be difficult to argue that the lower bound would not be breached. At the time of the AR4, in talking with IPCC lead authors I had the sense that most of the scientists thought that the anthropogenic contribution was >90% (this is my subjective assessment; I would be interested in other documentation on this).

So for the sake of starting somewhere in my argument, let’s start with this breakdown for the IPCC AR5 statement on attribution since 1950:

anthropogenic: 75%

natural: 25%

The elements of greatest certainty in this argument are the components of anthropogenic forcing. The elements of greatest uncertainty are the sensitivity to CO2 (and the fast thermodynamic feedbacks), natural internal variability, and then there are also the unknowns such as solar indirect effects.

The uncertainty in sensitivity to CO2 forcing is acknowledged by the IPCC in the lowering of the bottom bound of their ‘likely’ range; this points to lowering the anthropogenic proportion of the attribution. The ‘pause’ is raising awareness of the importance of natural internal variability. Other data also reflect the importance of natural internal variability, which N-G didn’t pick up on as being relevant: the role of natural variability in 20th century sea level rise, as indicated by the mid-century bump, and the role of natural internal variability in determining both Arctic and Antarctic sea ice extent. With regards to Arctic sea ice decline, I spotted this statement in AR5 Chapter 10:

Comparing trends from the CCSM4 ensemble to observed trends suggests that internal variability could account for approximately half of the observed 1979–2005 September Arctic sea ice extent loss.

So . . . this evidence all acts to lower the contribution of anthropogenic forcing and increase the contribution of natural variability, and I would argue that this lowering should not be regarded as trivial. Yes, the 44% provides lots of wiggle room, but it is not unreasonable to infer, based on the evidence provided by the IPCC, that the anthropogenic component has dropped below 70% and even below 60%. In this light, putting an extremely confident lower bound of 51% seems insupportable, particularly in light of the unknowns such as solar indirect effects.

N-G’s reasoning about how to think about the relative contributions of natural vs anthropogenic causes is something like the way I have approached this, but not quite. I look at it the following way. Consider two periods: 1975-1998 (warming) and 1998-2013 (hiatus). Play with the percentages of natural variability (assuming warming in the first period and cooling in the second period) and anthropogenic forcing, accounting for the relative lengths of the two periods, and see what percentage breakdown works for both periods, and what the implications are of the hiatus extending another 10 years. You will not get numbers that exceed 75% for anthropogenic.
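The bookkeeping behind this two-period exercise can be sketched in a few lines. This is a minimal illustration with invented trend numbers, not values from the testimony; it assumes a constant anthropogenic trend r plus a natural contribution of +n during the warming period and -n during the hiatus, and deliberately ignores the differing lengths of the two periods.

```python
# A minimal sketch of the two-period bookkeeping (illustrative numbers,
# not from the testimony). Assume a constant anthropogenic trend r and a
# natural contribution of +n during 1975-1998 and -n during 1998-2013.
def attribution_split(warming_trend, hiatus_trend):
    """Solve r + n = warming_trend and r - n = hiatus_trend (degC/decade)."""
    r = (warming_trend + hiatus_trend) / 2.0  # anthropogenic trend
    n = (warming_trend - hiatus_trend) / 2.0  # natural trend
    return r, n, r / warming_trend            # anthropogenic share of warming

# A completely flat hiatus implies a 50/50 split of the earlier warming:
print(attribution_split(0.17, 0.0)[2])  # 0.5

# Some residual warming during the hiatus raises the anthropogenic share,
# but it stays well below 100% while the hiatus trend stays small:
print(round(attribution_split(0.17, 0.05)[2], 2))
```

This is arithmetic, not physics, but it shows why a near-flat hiatus caps the anthropogenic share in this kind of simple decomposition.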

If I were given a 44% range to work with, I would put the range at 28-72% anthropogenic. My range overlaps with the IPCC in domain 51-70%, but I also allow for numbers below 50%. The main uncertainties seem all in the direction of increasing the contribution from natural variability.

Bottom line: with the growing recognition of the importance of natural variability, it is increasingly difficult to defend the bottom bound of 51% for anthropogenic forcing, hence I find the increase in confidence to ‘extremely likely’ to be unjustified.

UPDATE: N-G has responded to this essay at the end of his post; it definitely clarifies the differences in our reasoning. Without detailing his new responses, here are some quick responses to some of his statements.

The question of relevance to policy makers is how much of the recent warming is natural vs anthropogenic. Arguing that components of anthropogenic forcing are both positive and negative, and natural variability at various times can be positive and negative, is not useful in assessing the relative contribution of humans vs nature to the warming since 1950 (and it didn’t start warming until after 1975).

There is a quibble about the meaning of ‘most’. Wisely, the AR5 uses ‘more than half’ rather than ‘most’. N-G argues that this includes up to 100%, and that the AR5’s best estimate is 100% for the anthropogenic contribution. If this were true, why did the AR5 not say ‘virtually all’ rather than ‘more than half’?

N-G states: If you’re going to impute a range for them, at least have their most likely value somewhere near the midpoint of the range. For the sake of argument, a reasonable range would be 51-135%.

This kind of argument, where the anthropogenic contribution is argued to exceed 100%, seems senseless to me, even if you are just looking at the period from 1975-2000 where there is a clear warming trend. It is something that arises from the oversensitivity of models to CO2 forcing combined with oversensitivity to aerosol forcing.

“We really do not know why this stagnation is taking place at the moment. I hardly know a colleague who would deny that it has not got warmer in recent years.” ~Jochem Marotzke, director of the Max Planck Institute for Meteorology

It’s not a silly notion, is it? It’s what you get from “cycles”; they come back on themselves.

You can no more assume natural variation over a 60 year period has added warming than you can assume it has caused cooling.

It’s just a kind of wakeup call to skeptics who assume natural variation can only have caused warming since 1951. It’s just as plausible that natural variation has had a net COOLING effect since 1951, in which case the anthropogenic contribution to the warming since 1951 must have been > 100%.

Unfortunately Dr Curry seems to overlook the plausibility of that scenario too.

Assuming there is only ONE natural cycle that occurs in 60 year patterns, this is a reasonable assertion. However, complex dynamic systems often (typically) have many cycles of varying orders of magnitude in duration.

As a simple example, there might well be a 60 year “variability = zero” cycle AND a 300 year cycle, and a 900 year cycle, and a 9000 year cycle. Combine this with uni-directional events (a fresh-water natural dam bursts, solar variability, and 100 other things) and 90% of the variability might be “natural” EVEN IF the 60 year cycle in question is variability = zero.

Now, if you had models that closely predicted climate results over extended periods of time, you could (with increasing confidence) net out these unknown natural cycles from your equations and attribute the remaining variability to the factors in your model.

However, if your model is preliminary, when you get data that challenges the model, it is good practice to question the model, EVEN if you have good reasons for the factors you programmed into it.

YES, CO2 is a logical factor to attribute warming to. Given the lack of other data, it is a premier factor to consider. AND, this is not Sherlock Holmes – just because you rule out the other factors you are aware of DOES NOT mean that whatever is left is the culprit. The guilty party might be someone who never makes an appearance in the story.

As a simple example, there might well be a 60 year “variability = zero” cycle AND a 300 year cycle, and a 900 year cycle, and a 9000 year cycle.

Or there might be a 60 year cycle, and a semi-independent 42 year cycle, and a semi-independent 88 year cycle, and a host of non-linear responses to local extremes created by interference among those cycles, with various lag times, and a host of non-linear responses to local extremes created by interference among those responses, with their own suite of various lag times, and so on.

Based on how complex systems made up of random collections of interacting factors typically behave, I’d argue the default assumption should be that such things are present.
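The interference picture in these comments is easy to demonstrate numerically. A hypothetical sketch, with all choices invented for illustration: unit-amplitude, zero-phase sinusoids with the 60-, 42-, and 88-year periods mentioned above. The 60-year component averages to zero over any 60-year window; the other two do not.

```python
import math

# Hypothetical natural variability: unit-amplitude sinusoids with the
# 60-, 42-, and 88-year periods mentioned in the comment (zero phase).
def natural(t):
    return sum(math.sin(2 * math.pi * t / p) for p in (60.0, 42.0, 88.0))

def window_mean(start, years=60, steps=720):
    """Average of natural(t) over a window of the given length."""
    return sum(natural(start + years * k / steps) for k in range(steps)) / steps

# The 60-year component integrates to zero over any 60-year window, but
# the 42- and 88-year components leave a residual whose size and sign
# depend on where the window happens to fall:
print(window_mean(0))
print(window_mean(30))
```

With more cycles, random phases, and the lagged non-linear responses the follow-up comment describes, the residual generically fails to vanish on any particular 60-year window.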

Who doesn’t have bias? It would be a mistake to believe scientists don’t hold any bias in their area of study; otherwise all scientists would agree on all theories. We know throughout all science in all disciplines this is not the case. Science would be static without scientists holding biased (directional) ideas about theories. Observational evidence supporting one scientist’s biased idea trumps another scientist’s biased ideas that the observational evidence does not support. Our climate experiment is not over yet…. too early to say whose bias is more correct at this point based on observation.

You can have a cycle superimposed on a warming (or cooling) trend from other sources. The other sources could be GHGs or could be some other unknown(s) that cause the planet to increase and decrease in temperature by several degrees. Like the various warm periods about 1,000 years apart, and the LIA hundreds of years ago. If the planet is warming from a change in whatever caused the LIA, then you can have a cycle superimposed on that. Or there can be several cycles superimposed on each other. The problem is trying to force an understanding on a system with too little data (or that is just too complicated).

P.S. Even if the LIA was more pronounced in the NH, so is today’s warming.

I think what Judith is saying is that if there are longer-term variations, cyclical or not, in the temperature time series (never mind uncertainty in measurements, say, greater than 60 years ago), then this statement does not hold for the full series (obviously it holds for the 60-year component).

See Paul_K’s posts on Lucia’s site about unresolved residuals after removing 60-year and shorter series. He finds curvature in the data, which could be a sign of longer cycles (or not).

The blue line does hint at a 100+ year ‘cycle’. The proxy data confirms it is a cycle rather than a ‘linear trend’. Problem is the high quality data is too short currently, and the proxy data is so imprecise that I can see almost anything in it.

We may want to know what the future holds, Rich, but we should not let Western school teachers confuse the ancient science of astrology with real science simply to give gravitas to humanity’s superstition and ignorance.

“If you argue natural variability isn’t cyclic then you lose the argument that natural variability must have contributed warming from 1979-2000 if it caused cooling since 2000.”

If you reckon that two samples of an apparent 60 year periodicity is something you can call a sine wave, then I think we need to talk about how many samples you need to have ‘certainty’.

The most likely case is that this is a quasi-periodic function with a period of approximately 60 years. Could this be as low as 54 years, i.e. the 3rd harmonic of the Saros cycle? Sure. Could this be a natural oscillator formed by the thermohaline cycle that happens to be ~60 years long? Sure. There are other choices also.

Many, many possible explanations, and not enough accurate data to choose between them.

The data says what the data says though. Those cycles in the record are there and will not go away. This is full kernel stuff, not something that will change with later data.

It isn’t only the repetitive nature denoted by the analogy of a sine wave that helps explain climate change. As brought to us by Nicola Scafetta, it is also the phenomenon of collective synchronization of coupled oscillators, connoting the role of chance as part of the holistic process – involving even the effect of the big planets, Saturn and Jupiter – on the weather of the Earth.

“Low summer insolation occurs when the tilt of the axis of rotation of the earth is small; the poles are pointing less directly at the sun; the Northern Hemisphere summer solstice is farthest from the sun; and the earth’s orbit is highly eccentric.” The big planets certainly could play a more direct role than you wish to give credit when it comes to changes in the Earth’s orbit over time.

The climate sensitivity of the model was 4.2C per doubling of CO2. So halve the scenario B projection to fit the observed temperature trend and you get a sensitivity of 2.1C per doubling of CO2. < 1C would undershoot the observed warming.

Not necessarily. It could also mean that the transient climate response is much lower relative to the ECS than normally assumed/calculated. I lean toward this hypothesis, but it is only a hypothesis. It’s a way to reconcile with the paleo data, though. It also happens to coincide with “less need for drastic measures in the near term” policy options, which of course makes my assessment, shall we say, motivated.
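The sensitivity arithmetic in this exchange rests on the assumption that projected warming scales linearly with the model’s equilibrium sensitivity, which is itself only a rough approximation. Under that assumption the rescaling is one line:

```python
# Rough rescaling assumed in the comment above: warming is taken to scale
# linearly with equilibrium sensitivity (an approximation, not a model result).
def implied_sensitivity(model_ecs, projected_warming, observed_warming):
    return model_ecs * observed_warming / projected_warming

# If the scenario B warming must be halved to match observations, a model
# sensitivity of 4.2C per doubling implies roughly:
print(implied_sensitivity(4.2, 1.0, 0.5))  # 2.1
```

As the reply notes, a low transient response paired with a higher equilibrium sensitivity would break this linear-scaling assumption, which is exactly why the back-of-the-envelope number is contested.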

John Nielsen-Gammon makes a fundamental error with his statement
“If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part.”

Our hostess starts to point this out with her statement
“Even on decadal to multi-decadal timescales, there is additional natural variability not captured by the stadium wave, notably the NAO/AO (which may not be a purely internal mode).”

I have tried to express my thoughts on this before. Noise in any system has two characteristics: magnitude and length of time. Before we can conclude that there is no residual value from the noise, it is essential that the integration time over which a signal is being detected is long compared with the period of all the various noises present.

At the current time, no-one, and I mean no-one, has any idea of the magnitude and time periods of all natural noises. But we do know that some noise factors operate over a comparatively short time period of 60 years, e.g. the PDO, and other factors operate over time periods of centuries and millennia.

If it is true that some noise operates over much longer time periods than that over which we are trying to detect a CO2 signal, then it is inevitable that there will be residuals in the noise that will be confused with a supposed CO2 signal.

Until we know with far more certainty than we do now, what the magnitudes of all the natural causes of noise are, and the time periods over which they operate, then it is impossible to conclude that their effects are negligible. It is just as logical to assume that CO2 has a negligible effect, and all the variation we see is caused by natural factors, as it is to assume any other split between CO2 signal and natural noise.
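The point about long-period noise can be made concrete. A sketch, with all numbers invented for illustration: a 240-year sinusoidal cycle, observed only through a 60-year window on its rising limb, fits almost perfectly as a linear trend.

```python
import math

# Invented long-period 'noise': a 240-year cycle with 0.3 degC amplitude.
def long_cycle(t, period=240.0, amplitude=0.3):
    return amplitude * math.sin(2 * math.pi * t / period)

def fitted_trend(years=60, steps=600):
    """Least-squares slope of the cycle over the window, in degC/decade."""
    ts = [years * k / steps for k in range(steps + 1)]
    ys = [long_cycle(t) for t in ts]
    tbar = sum(ts) / len(ts)
    ybar = sum(ys) / len(ys)
    num = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
    den = sum((t - tbar) ** 2 for t in ts)
    return 10.0 * num / den  # convert degC/year to degC/decade

# Within the window the rising quarter-cycle is indistinguishable from a
# steady forced trend of a few hundredths of a degree per decade:
print(fitted_trend())
```

Only a record several times longer than the cycle, or independent physical knowledge of the cycle, distinguishes this residual from a genuine forced signal, which is the commenter’s point.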

I agree with your first 5 paragraphs. Also remember it might not be stationary. Exogenous non-cyclical forcings exist (e.g. big volcanoes), as do coinciding amplifying feedbacks (volcanoes + sun + ice, let’s say). All that says, though, is that curve-fitting and global-scale signal processing are only one of several ways to crack the data. Thus GCMs….

“Until we know with far more certainty than we do now, what the magnitudes of all the natural causes of noise are, and the time periods over which they operate, then it is impossible to conclude that their effects are negligible. It is just as logical to assume that CO2 has a negligible effect, and all the variation we see is caused by natural factors, as it is to assume any other split between CO2 signal and natural noise.”

John Nielsen-Gammon makes a fundamental error with his statement
“If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part.”

Agree with you completely.

First of all, who knows whether or not “over 60 years, natural variability averages out to zero”?

It could be “over 240 years” (just to pick a figure) – or even longer. And we have no good record that goes back this far.

But we do have pretty good evidence that there were extended time periods of warmer (and colder) climate than today, long before there was any “man-made climate change”.

Then there is the distinct possibility that natural changes in cloud cover have caused changes in our climate – and we do not know what is causing (or has caused) these changes.

So simply concluding that everything that is not cyclical over 60 years is, by definition, “man-made climate change” is an unsubstantiated leap of faith.

I remember not so long ago being told by the experts that GHG emissions dominated our climate. TSI was constant. Other alleged solar effects had no effect on climate. Natural variability was insignificant compared with GHG effects and increased CO2 emissions would cause significant warming that nothing else could stop.

“This question is asked in a blog post by John Nielsen-Gammon entitled Your logic escapes me. Interestingly, the logic that escapes him is the argument used in my recent Senate testimony. “

A public debate between you two would be just the thing, though I won’t hold dinner waiting. Funny how the people who denigrate debate as a means of demonstrating who has the stronger arguments are all warmists. I’m sure it’s just coincidence.

The ice cores show that there are both upper and lower bounds to temperature changes. If one takes the measurements over a long enough period of time, the natural variability averages out to zero. The only flaw I see in his logic is the one pointed out by Dr. Curry: he does not take a long enough period of time. However, the natural variability should be decreasing over time. The problem the AR5 has is that the earlier reports started out by attributing 0% of the change to natural variability, having assigned negative values to other factors. When you start at zero there is no room to decrease the value, and they needed to increase it because of the pause. They did so, and now the natural component of warming is reported by them to be 0-50%. Whether that is high enough or not is still open for debate, but the longer we measure temperature the smaller the natural variability should be.

Tony, you write: “Whether that is high enough or not is still open for debate, but the longer we measure temperature the smaller the natural variability should be.”

I don’t often disagree with you, but I think this goes a little too far. You would be correct if you mean time periods measured in millennia. But surely we are talking decades. Whether the residual of the noise is small, large, increasing, or decreasing, depends on the magnitude and time period of the noise. Until we know these values, it is just as likely that noise residuals could increase as they are to decrease, for such short term time periods.

Natural variability happens on scales even longer than ice ages. There’s a huge difference in what dominates land vs. ocean heat budgets. Continental drift changes the climate too, on very long time scales and possibly shorter ones for very critical choke points like Drake Passage (ask CaptDallas). Presumably the cyclical ice ages the earth has been experiencing for only the past few million years are caused by continental drift finally crossing some threshold in land/ocean configuration with respect to latitudinal concentrations and steerage of ocean currents. Glaciers need land to anchor them. Twice as much land being in the northern hemisphere as the southern sets up a borderline state where high-albedo land ice is balanced against low-albedo ocean surface. It doesn’t take much to flip the switch between the two states. Since we are already in the warm state we can’t force a climate flip by making it artificially warmer. The most we might hope to do is prevent it from flipping back to the glacial state. That would probably be a good thing, if the Holocene interglacial doesn’t end. Although I must admit the idea of a mile of ice over the top of Washington, D.C. has a certain appeal to it. My property values in sub-tropical southern Texas would presumably rise too, as the supply/demand situation changes for real estate not underneath a glacier.

You could be correct on short time scales, especially considering the pause, but even in the pause there are shorter time scales with more variability. I know that this does not prove my assertion (it is up to me to do so with data not the other way around), but can you show me data that disagrees with it?

No I cannot. That is the point. You could be right, but we don’t know if you are right. Until we understand all the details of natural noise, we cannot say for certain what is happening.

My issue with the IPCC is not so much whether CAGW is correct or not. It is a very viable hypothesis. What I object to is the certainty with which it has come to its conclusions. I have the same objection to your statement. Not whether it is correct or not, but that you should not be certain that you know which way the noise is going on a short time basis.

More like ask J. R. Toggweiler, PhD, with the GFDL. The Drake Passage/Antarctic sea ice can shift the average Antarctic Circumpolar Current flow rate and distribution, causing Southern Annular Mode-like changes which can last a century or longer. WHOI has also modeled “pseudo-cyclic” oscillations on scales up to a century. The potential zonal (east-west) impact is on the order of 0.6C and the meridional 3.2C, per Brierley et al.

The range of impact and time scales are perfectly consistent with the rate of OHC increase and sea level rise since ~1700 AD. I think Steven Mosher refers to this as the LTP unicorn. I think it is more like a thermodynamic pit bull ready to chew some arse.

Gammon’s thinking is all jumbled up. Natural variation equalling zero over 60 years doesn’t rule out longer natural variation. Fercrisakes, climate cycles extend at least up to ~100,000 years for glacial/interglacial cycles. From daily cycling of 1000W/m2 on the tropical ocean, to monthly tides and seasons, to water taking years to slosh back and forth across the Pacific (ENSO), and then harmonics generated by all these things beating out of sync with each other… and he thinks that if the decadal wiggles happen to mostly cancel out over a period of 60 years, nothing natural can be left and the remainder must be anthropogenic? Amazing.

Natural variation integrating to 0 over 60 years doesn’t rule out you being a moron.

yes, LTP is a possibility. But one does get to make assumptions.

Assuming no LTP, we can draw the conclusion John NG does

If you want to posit an LTP, then you need to identify it and explain it.

Merely pointing out that LTP is not excluded is logically equivalent to pointing out that unicorns are not excluded. You have to offer an explanation that fits the facts better, not merely point out that there might be a better explanation that relies on unicorns or some other vaguely defined and non-specified process.

That’s the difference between skepticism as a tool for science, and science.

Steven, you want to rule out what you don’t know about. That’s called an argument from ignorance. The logical fallacy is all yours, I’m afraid. Unicorns are a poor comparison. We know for a fact cyclical natural variation happens at intervals at least up to 100,000 years, thus cycles longer than 60 years are a proven entity. Unicorns are not. You don’t have the chops to argue logic with me. Don’t try.

A climate oscillation or climate cycle is any recurring cyclical oscillation within global or regional climate, and is a type of climate pattern. These fluctuations in atmospheric temperature, sea surface temperature, precipitation or other parameters can be quasi-periodic, often occurring on inter-annual, multi-annual, decadal, multidecadal, century-wide, millennial or longer timescales. They are not perfectly periodic and a Fourier analysis of the data does not give a sharp spectrum.

I bolded the parts Steven and Max need to learn. Yes, I know I bolded it all.

Rapid temperature change during the Holocene happens on 1,400 year cycles. Resolution is limited to changes that take place no faster than 300 years. Fluctuations are seen to be 2C at that resolution. There is little basis for belief that current warming is out of the ordinary with these magnitude and duration events happening every thousand years. They may be more frequent and of even greater magnitude than what can be resolved from paleo-data with 300-year averaging.

Steve, imagine that both the Pacific and Atlantic each have a periodic heating/cooling cycle, and that in both cases the length of the periodicity is set by the position of the land and ocean, so that each ocean has a unique resonance frequency. If, say, the Atlantic has a periodicity of 60 years and the Pacific 54 years, and the max-min global temperature change in each case is 0.5 degrees, then we get a quite complex wave form. When both are in phase the temperature rise is one degree, and when they are out of phase the rises cancel and we have zero.
With just two simple sine waves of slightly different multidecadal period, 60 and 54 years, you get stretches of warm and cool that appear complex but are in fact quite simple.
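A quick numerical sketch of this superposition; the 60- and 54-year periods and the 0.5 degC max-min amplitudes are the hypothetical values from the comment above, not measured ocean periodicities:

```python
import math

def combined(t):
    """Sum of two hypothetical ocean cycles at year t."""
    atlantic = 0.25 * math.sin(2 * math.pi * t / 60)  # 0.5 degC max-min swing
    pacific = 0.25 * math.sin(2 * math.pi * t / 54)   # 0.5 degC max-min swing
    return atlantic + pacific

# The envelope repeats with a beat period of 1 / (1/54 - 1/60) = 540 years,
# so in-phase warming and out-of-phase cancellation alternate over centuries.
values = [combined(t) for t in range(600)]
swing = max(values) - min(values)  # approaches 1 degC when the cycles align
```

Plotted, the sum looks irregular on decadal scales even though it is just two clean sinusoids.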

Roman Warm Period to cold period to Medieval Warm Period to Little Ice Age to Now is a natural cycle that averages close to zero.

That same cycle has been in place for ten thousand years.

Somehow, they have declared that cycle ended or does not or never did exist and they have replaced that natural cycle with a ten thousand year long hockey stick. The Consensus Theory and Models follow Michael Mann’s Hockey Stick and they ignore actual data for the past ten thousand years.

The actual data is the Earth Made Stuff.

Climate Model Output is manmade stuff.

CO2 Alarmist Theory is Manmade.

Climate Data is Real. It is real that temperature is inside the same bounds it has been in for ten thousand years.

We do not adequately understand the natural cycles and the extent to which they interact and whether the interactions are just additive or more complex.

We have a poor understanding of clouds, interactions between the atmosphere and high energy particles, the influence of the ever changing solar magnetic field. We do not fully understand the dynamics of the heat pump mechanism that is driven by the energy imbalance between the tropics and the poles.

There is no base line or normality against which we can calibrate and measure human impacts. The climate is forever changing.

Despite all of these uncertainties (and many others) the IPCC selected carbon dioxide as the only significant variable and chose to ignore the rest, apart from treating aerosols as a convenient brake to apply to fine-tune the rate of warming in their models.

The models have failed. It is not just their inability to match the temperature record, they fail to simulate clouds, precipitation and other features of our climate. Cherry picking may find a model that gets something right but that is a meaningless pastime. I believe that their credibility is already lost, so clinging to the message that their creators have every confidence in them is rapidly bringing the science into disrepute.

It is now time to get back to basics and start the process of understanding how the climate works in spite of greenhouse gases.

What I consider most logical seems to really differ from Judith’s thinking. To me the primary question is, how strong is AGW, the relative strengths of AGW and natural variability is secondary. I interpret the statement of IPCC strictly and only as a statement on the ratio of the human contribution calculated from the estimated TCR and the total observed warming. That ratio might equally well be over 100% than under 100%. I think that the most likely value is close to 100%, and that this ratio is extremely likely over 50%. I would also say that its very likely less than 150%.

As far as I can see both IPCC and John N-G follow the same logic. For me any alternative is strange and against my way of logical thinking.

I get your way of thinking Pekka. If doubling CO2 adds 5C to the global average temp eventually, then we have a problem unless we get lucky and natural variability goes significantly in the other direction. So we are back where we started. What is the climate’s sensitivity to increasing CO2? Do the models have it right?

That’s really the issue. The estimate of TCR that agrees with warming since 1951 seems to be somewhat lower than what most models would predict. Thus I see some evidence that most models run too hot, some just a little, some a bit more.

At present the evidence is not strong. My upper limit for TCR (or 150% for the ratio) is high enough to include largely also the model predictions. It’s only more likely than not that most models are running high.

As we know that the models have other weaknesses this is not serious for the models, it just tells that we should prefer empirical data over models that are known to have weaknesses, when the amount of empirical data has reached the present level.

To say that the anthro contribution to observed warming cannot be >100% suggests that not enough effort is being put into understanding the IPCC position. Clearly, Fig. 10.5 in AR5 shows mean GHG+OA exceeding mean observed warming, with natural as well as internal variability (stadium waves etc.) averaging to zero over this time period. So why not try to understand why the anthro contribution can exceed 100%?

This kind of argument, where anthropogenic contribution is argued to exceed 100%, seems senseless to me, even if you are just looking at the period from 1975-2000 where there is a clear warming trend.

It seems John N-G has made this point multiple times, so one has to look at the 1951-2010 period, not a cherry-picked interval such as 1975-2000.

When we interpret what the IPCC is writing we must remember its task. The IPCC is not set up to promote climate science in general, but to assess AGW and its risks.

For climate science without any special emphasis on AGW, the question of attribution is at the core of interpreting observations. For the task of the IPCC, the strength of AGW is the issue. The IPCC must consider natural variability in order to assess AGW properly, but the issue is looked at from a different direction. Therefore the natural way of formulating the conclusion is different, but it must be based on the same scientific knowledge.

Pekka, I agree. My question was to Prof. Curry – if we hypothetically had only GHG and OA contributing over a period where we observed warming, how would we attribute it to the various contributors? Curry claims this is an issue of forcing vs attribution. I don’t understand the difference. In a hypothetical case of GHG and aerosols only, one might for example say 120% from warming sources and -20% from cooling sources. Curry somehow seems to be of the impression that you will always have a bunch of positive contributors, each less than 100%, which all add up to 100%, the mechanics of which are beyond me.

Judith’s range implies more than 100% confidence that natural variation has been positive since 1950. More realistic ranges would have the anthropogenic contribution surrounding 100% to allow for a negative natural variation since 1950. Realistically the non-anthropogenic part is close to zero when averaged over 60 years and the sign is far from certain. In fact with the sun declining from a mid-century peak, a case can be made for a net negative natural variation over this period.

As long as temperature stays inside the same bounds that it has been inside of for ten thousand years, you cannot rule out that natural variability is still most likely taking care of all or most of the climate change.

We are where we should be if you look at data for the past ten thousand years and project the same cycles forward.

Whether or not natural cycle variability like the PDO averages to zero, the GCMs were provably (AR5 SPM figure 1-4) tuned using only the upward-trending half of the cycle, so they must run hot. That is the essence of Akasofu’s curve-fitting argument. From which it follows that Nielsen-Gammon’s critique fails by its own logic, because it ignores the GCMs’ failure to incorporate at least this one full cycle in their parameterization tuning.

Judy, black carbon perhaps should be part of the discussion. A couple of years ago, there was a spate of articles saying that BC was more important than previously thought (ever heard that before?), and was more important than methane. In your view, was this increase in the warming potential of BC overdone? Or should it be part of this conversation?

I reproduced a graph and a few quotes from a paper Hansen wrote. I reproduced graphs and quotes from a number of other sources too. Then I showed where the IPCC ignored Hansen’s conclusions about black carbon, and speculated it was because their mission was to make CO2, and hence the United States (then the single largest carbon emitter), into the bad guys. I pointed out that Europe, Africa, and Asia (especially poorer nations that practice slash-and-burn agriculture, heat homes with biomass, and use antique smoke-belching diesels in transportation) were the primary emitters of black carbon, and near enough the Arctic to accelerate snowmelt by lowering its albedo, an effect Hansen had also discovered at the time. Hansen never followed up; I suspect he was taken aside and told what sorts of articles would best further his career at NASA.

One thing we do know about our climate is that it is remarkably stable. It has maintained a fairly constant temperature within a narrow range for thousands of years allowing life to flourish. This suggests negative feedbacks. It suggests thermostatic control. I suspect much of this is due to the versatility of water with two possible phase transitions and its ability to transport energy around the planet in the oceans and in the atmosphere.

We should be identifying the thermostat mechanisms. That is the key to understanding our climate.

The strongest and most important negative feedback that creates the ‘thermostat’ for the climate is the fact that the emission of energy is proportional to the absolute temperature raised to the power of 4.

Along with the relative stability of the energy source.

It requires a strong forcing factor to shift this stable interaction, in which temperature adjusts until the energy emitted matches the energy coming in.
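A back-of-envelope version of that temperature-to-the-fourth-power restoring effect, using the Stefan-Boltzmann constant and the textbook effective emission temperature of roughly 255 K (standard values, though the single-temperature treatment here is a deliberate simplification):

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EFF = 255.0     # Earth's effective emission temperature, K

# Emitted flux E = sigma * T^4. Differentiating, a small warming dT raises
# emission by dE = 4 * sigma * T^3 * dT: the warmer the planet, the harder
# it radiates, which is the negative feedback described above.
outgoing_flux = SIGMA * T_EFF ** 4          # roughly 240 W m^-2
planck_response = 4 * SIGMA * T_EFF ** 3    # roughly 3.8 W m^-2 per K of warming
```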

“One thing we do know about our climate is that it is remarkably stable. It has maintained a fairly constant temperature within a narrow range for thousands of years allowing life to flourish. This suggests negative feedbacks. It suggests thermostatic control.”

Even a 10C sensitivity climate with massive positive feedbacks would maintain “a fairly constant temperature within a narrow range for thousands of years” if changes in forcing were small. All the evidence shows the forcing HAS been small.

Lolwot Wrote:
“One thing we do know about our climate is that it is remarkably stable. It has maintained a fairly constant temperature within a narrow range for thousands of years allowing life to flourish. This suggests negative feedbacks. It suggests thermostatic control.”

YES!!!!!! The Temperature Regulation of Earth got better as the Polar Ice Cycles Developed.
It snows more when it is warm and that cools Earth.
It snows less when Earth is cold and that lets the sun warm the Earth.

The troposphere? Hot at the bottom, cold at the top. What does hot air do? Rises! What does cold air do? Yup, spot on. A simple mechanical process that’s kept humans in the Goldilocks range for millennia.

The problem with invoking very long timescale natural variation (cycles longer than 100 years) is that there is very little evidence that such cycles could cause this sort of rapid, decadal rate of temperature change or speed of sea level rise.
The paleoclimate evidence shows no such rapid, large-magnitude (>1 degC or >3 mm/yr) fast change as a natural cycle; such changes are associated with significant forcings, solar and/or CO2.

It is also possible to argue that natural variation does NOT have to be neutral over a specified cycle length: cycles could ratchet the climate and generate a sea level or temperature trend without a forcing factor.

But here’s the thing. If, over 60 years, natural variability averages out to zero…

But it clearly doesn’t, does it?

What about the previous 60 years warming, and the 60 years previous to that?

What about the recovery from the Little Ice Age that occurred previous to alleged anthropogenic influence? That clearly didn’t average out to zero, or else we’d still be having ice fairs on the Thames.

The more desperate they get, the more ridiculous their explanations become.

@- catweazle666
“What about the recovery from the Little Ice Age that occurred previous to alleged anthropogenic influence? That clearly didn’t average out to zero, or else we’d still be having ice fairs on the Thames.”

The LIA is usually attributed to a change in climate forcings, a reduction in solar output and an increased frequency of big volcanic eruptions.
Not a natural cycle.

“The LIA is usually attributed to a change in climate forcings, a reduction in solar output and an increased frequency of big volcanic eruptions.
Not a natural cycle.”

But wait, Max, you’ve substituted the word cycle for “variability.” Surely, the things you mention are examples of natural variability, which it is now claimed averages out to near zero in 60 years. That assertion seems deeply suspect to me…

I can’t speak for Izen, but there are certainly natural factors which can affect climate in the short term and which are essentially random in nature rather than cyclic – ENSO, volcanoes and the sun are examples. In the longer term we would expect the overall effect of these to be pretty flat but we can’t say they follow any particular cycle (apart from the 11 year solar cycle).

So I do think it’s fair to say that even if there is a 60 year “natural” cycle one can’t simply assume that over any given 60 year period all “natural” factors will net to zero. But in making that argument John N-G is only following Judith’s own logic – Judith is arguing that if current warming is being masked by cyclical natural variation, and that the start of the “pause” coincided with a phase change in this cycle, then it’s reasonable to assume that there was a natural component to the warming prior to that point. But what if the “pause” is partly due to the kind of non-cyclic factors I mentioned above? Surely Judith’s argument is weakened somewhat.

Nobody is saying that the net variation over 60 years cannot deviate from zero; the point is that it could equally well deviate in either direction. Thus this possibility does not affect the most likely value of the anthropogenic contribution, it affects only the uncertainty. If we were to assume that there’s no net natural variability, we would have a very precise estimate for TCR; the uncertainty would not be 50% from the average, as it is according to the IPCC and as is required to make John N-G’s argument valid.

A 50% uncertainty is enough to make the likelihood of the anthropogenic share falling below 50% less than 5%. That’s the point, not any stronger assumption of no net natural variability.
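Pekka’s arithmetic can be illustrated with an assumed normal distribution (my assumption for illustration, not necessarily his): a best estimate of 100% anthropogenic, with the ±50% band treated as a 95% interval.

```python
import math

def normal_cdf(x, mu, sigma):
    """Cumulative probability of a normal distribution at x."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

mu = 1.0             # best estimate: anthropogenic share = 100% of the warming
sigma = 0.5 / 1.96   # +/-50% interpreted as a 95% (roughly two-sigma) band

# Probability that the anthropogenic share is actually below 50%:
p_below_half = normal_cdf(0.5, mu, sigma)  # about 2.5%, i.e. under 5%
```

Under these assumptions the “extremely likely more than half” wording survives even though the uncertainty band is wide.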

There’s some evidence that a particularly strong component of internal variability has a 60 year cycle. That particular component is essentially zero over the period. That makes the best estimate of remaining variability lower than it would be over some other period like 30 years.

Thus there’s some reason to expect that natural variability has less effect over 60 years than over other intervals. A related point is that some arguments could be presented for a particular sign and size of the natural component over some other intervals, but such arguments seem to be missing or weak over this interval.

JCH, “What you won’t see is the AMO, the PDO, the AO, etc. They are cycles.”

They are assumed to be cyclic and are based on a less than optimal metric, “surface” air temperature which has a margin of error of +/- 0.25 C or greater prior to 1950. You can confidently say you may or may not have an ~60 year “cycle” that is approaching zero. If you start at 1920 and use regions with more complete data you can detect long term responses that are likely not cyclic which may or may not be related to AGW.

If you separate land surface into Tmax and Tmin, you can find different “cycle” periods. Tmin for example shows less of the 1910 to 1940 AMO “cycle” than Tmax. Tmax shows less of the 1976 to 2000 AGW amplified AMO “cycle” than Tmin. You can “find” about anything your pity pat heart desires.

Ocean basin SST versus GISS and the Indo-Pacific Warm Pool. The “cycle” in the IPWP time series has that weakly damped response character kind of like the oceans need better shock absorbers. Note how GISS starts diverging at the end.

I’m going to repeat a comment I made on the previous thread just because it fits in so well with this topic. External forcing or the distribution of heat? What matters more? According to this model and as far as the North Atlantic goes, it is the distribution that matters.

“The skill of the DP is thus tied to correct initialization of ocean circulation anomalies, while external forcing is found to contribute negligibly (and for incorrect reasons) to predictive skill in this region over this time period.”

Sure, just remove the previous 50’s-70’s aerosol fudge (for which there was no data), insert the ENSO multi-decadal phase cycle but minimize its effects for any hindcast period but use it as a fudge for totally whiffing on your forecasts, and add a new deep ocean heat content fudge (for which there is not compelling corroborating data) and voila: climate science!

The problem is that most of the thinking is locked into an obsession about radiative forcing and radiative imbalance. Everything is seen through that prism resulting in a paralysis of conceptual understanding and healthy curiosity.

Furthermore, the science deliberately shuts down any thinking outside of the GHG box. That is why there has been little progress in understanding our climate for decades in spite of the taxpayer throwing billions at it.

Yes, it’s a particularly misplaced obsession to focus on radiative physics at the surface/atmosphere interface because latent heat is the main cargo hauler at the surface by a huge margin. I think part of the reason for radiative focus is that radiative is easy to numerically model from first principles and the water cycle is so difficult it must be guesstimated (parameterized) instead.

Izen

I’ve said essentially the same thing: in politics follow the money, in physics follow the Joules. Joules sounds like jewels which makes for a nice double entendre.

The only way to get a good number would be if 2020 was an outstanding El Nino year, otherwise your number would be biased low. You would also be excluding a good number of years with temperature data and low CO2.

“But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part. Thus the IPCC can and should consider it to be extremely likely that human influence dominates the net rise in temperature over the past 60 years.”
What he is totally missing is the possibility that the left-over man-made portion can become so small that it doesn’t matter much (and can at some point be too small to worry about). He also ignores the possibility of longer-term natural trends, and the possibility of fluctuating frequencies and amplitudes of the known multidecadal natural cycle, the differences of which can be mistaken for left-over man-made portions.

It seems what some AGW skeptics are missing is that the anthropogenic influence on the character, nature, timing, etc. of “natural” cycles could be changing (growing) over time. What this potentially means for surface tropospheric temperatures is that we could see a very strong period of warming in the next few decades, with the linear trend returning to a clear upward slope and balancing out the flatline of the “pause” since 1998.

It seems what some warmists are missing is that the portion of anthropogenic influence on the character, nature, timing, etc. of “natural” cycles could be contributing less and less to the change over time. That is what a doubling means. For each doubling of CO2 you get an amount of change in the temperature. As the concentration of CO2 rises and the temperature remains relatively flat, the chance that this doubling factor is very large gets smaller and smaller.

It really depends where we actually are in terms of sensitivity. But the other thing is an actual change of state in the system forced by the continual increase in GH gases. This is the basis of the notion behind the Anthropocene. Thus, without humans the current interglacial would have a different character and pulse of natural variability.

@ TonyB, “However, the natural variability should be decreasing over time. “

Of course, this is the same sort of thing one sees in poker, in which skill (the signal) emerges over time against a background of short term variance (the “noise” of luck). That much is obvious. However, as any experienced, winning poker player can attest, the “short term” is often much longer than most people think.

Like all analogies this one is not perfect, most notably because climate cycles are not random. However, there likely are random, or in a practical forecasting sense near-random, components. I’m betting, on nothing more than an intuitive basis, that “natural variability” does not average out to near zero in 60 years.

The reasoning found in N-G’s statements quoted here takes us back to kindergarten. It is truly difficult to pack one paragraph with all the errors found in N-G’s paragraph that follows:

“But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part. Thus the IPCC can and should consider it to be extremely likely that human influence dominates the net rise in temperature over the past 60 years.”

Well, N-G, if we assume that natural variability averages to zero over time periods that we can designate then it follows that the contribution of manmade CO2 can be determined. Got it. But that raises quite a few questions.

Why would anyone who calls himself a scientist make the harshly strong assumption that natural variability averages to zero over any time period? There are only three possible answers. One reason is that you are a fan of circular argument and you want to build your conclusion into your premises, which you do by assuming that nature works like a dishwasher having washing and drying cycles that balance one another exactly. Given such nonsense, the contribution of manmade CO2 is precisely detectable.

The second reason is that you are so averse to getting outdoors and measuring stuff that you insist that stuff comes in cycles that balance one another. In other words, you are so anti-empirical that you refuse to measure what Mother Nature is actually capable of.

The third reason is that you are a modeler. You have to get Mother Nature into your computer somehow. The only way to do that is to make many harsh assumptions about Mother’s behavior. Your assumptions are as harsh as a strait-jacket. Mother has to do everything in cycles that balance to zero in time periods that you can designate. What nonsense.

Dr. Curry’s Stadium Wave hypothesis avoids all the self-defeating assumptions made by anti-empiricists and modelers. Dr. Curry’s hypothesis demands empirical work to determine how Mother’s “cycles” are manifested this time and how they have been manifested in the past.

The degree to which Alarmists have no imagination for Mother Nature astonishes me daily.

“Why would anyone who calls himself a scientist make the harshly strong assumption that natural variability averages to zero over any time period? ”

1. On one interpretation the physics argues that energy out and energy in must balance. Otherwise internal variations (natural cycles) would create energy ex nihilo. Last time I looked, the PDO was not God.

2. There is evidence of cycles 30 years and shorter. Hence 60 years is a good first guess at a time period over which cycles should integrate to zero.

Now, there may be longer cycles. There may be Bigfoot and unicorns. But we generally start by explaining what we observe rather than speculating about what might be observed.

The best line of attack against the integrate-to-zero-over-60-years argument is to find cycles that are longer than 30 years.

Next, you have to show that those processes explain observations better.

Hey Mosh, it’s pretty evident climate boffins screwed the pooch by not recognizing a 60-year cycle. So you want to say okay, but the cycles we were missing stopped there. For sure. We can’t make that mistake again.

“Why would anyone who calls himself a scientist make the harshly strong assumption that natural variability averages to zero over any time period?”

Because Curry’s argument hinges on there being an approximately 60 year cycle. The argument that Dr Curry and others make is that the post 2000 slowdown in warming implies a natural cycle which contributed to the pre-2000 warming and therefore the proportion of IPCC attributed anthropogenic warming pre-2000 is lowered.

He’s showing you what follows from that being true. He isn’t stating it IS true.

If you don’t make the assumption that natural variation works in a 60 year cycle, then you can’t argue the 1979-2000 period was part of a natural warming cycle. It might have been part of a natural cooling cycle instead. You are still then in a net-zero situation. All you have is error bars to play with, and error bars work both ways.

Thank you for expressing this so clearly. I think JC’s logic is exactly right. I kept thinking that there must be something more complicated going on. I don’t know how anyone can think in such short terms and ignore other kinds of trends that we may be a part of now. Sensitivity is a function of time and CO2 concentration; if either changes then the sensitivity by definition changes, and if prior models have been proven wrong by observation, something has to give.

You don’t prove “it” is there by posting graphs. That proves you can make a chart, nothing more.

1. You first have to accept the record you are using.
A) Is that record reliable?
B) Does that record make any physical sense? i.e., does it make physical sense to talk about an average of air temps and SST?

2. Then you have to identify a test to extract the cycle. Using your eyes is not a test. Aim your explanation at a blind mathematician.

3. Then you have to determine whether that “cycle” is “real”; in short, do a statistical test.

4. Then you have to see if that test is robust to changes in your assumptions.

A) Do we see it in SAT only?
B) Do we see it in SST only?
C) Do we see it in SAT & MAT?
D) What do other tests say?

In short, we need to ask ourselves whether the global temperature index (SST + SAT) is the only metric we should look at for diagnosing the climate as a whole. It would be weird if such a lousy metric contained interesting information about the governing dynamics.

I’ve yet to see anybody document this “cycle” in the way I describe.

In cases where they attempt this the authors refuse to disclose their data and code.
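For what a “blind” version of steps 2 and 3 might look like, here is a minimal sketch on purely synthetic data; the 60-year cycle, its amplitude, the 600-year length, and the noise level are all invented for illustration, not taken from any real record:

```python
import cmath
import math
import random

# Synthetic 600-year series: a known 60-year cycle plus random noise.
random.seed(0)
N = 600
series = [0.3 * math.sin(2 * math.pi * t / 60) + random.gauss(0, 0.1)
          for t in range(N)]

def power(series, period):
    """Periodogram power at one candidate period (discrete Fourier sum)."""
    z = sum(x * cmath.exp(-2j * math.pi * t / period)
            for t, x in enumerate(series))
    return abs(z) ** 2 / len(series)

# Steps 2-3: score candidate periods mathematically instead of by eye,
# then keep the one with dominant spectral power.
candidates = range(10, 81, 10)
best = max(candidates, key=lambda p: power(series, p))
```

With a real record the hard parts are exactly steps 1 and 4: whether the series is trustworthy, and whether the result survives changes of dataset and assumptions. A proper significance test would also compare the peak power against a noise-only null.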

“If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part.”

If over 60 years natural variability averages out to zero, why has there been warming in reported temps since the Little Ice Age? It seems that this post-modern logic hinges entirely on the flat blade of the hockey stick. No wonder they defend Mannian statistics to the death.

Judith,
Thanks for your update on the N-G post.
The big controversy is setting all natural cycles to a net zero. No one but him has ever said that. It is a grand assumption that allows motivated reasoning to arrive at a predetermined conclusion.

It is so hard to imagine the mind frame that makes that grand assumption at the start without massive data analysis.

Judith, neither you nor John ever seems to look beyond decadal variability. It seems to me reasonably likely that the 20th century warming trend about which the PDO oscillations vary is simply due to the millennial cycle peaking. For estimates of the timing and amount of the coming cooling, based on the simple assumption that the recent peak reflects synchronous peaks in the 60- and 1000-year cycles, see: http://climatesense-norpg.blogspot.com
Certainly this is the simplest and most obvious working hypothesis, and it should be the first to be explored.
If this is correct then there is no necessary reason to attribute any warming to anthropogenic causes; certainly the anthropogenic effect can’t be estimated at all before the natural variability is fairly well pinned down.

The IPCC’s confidence in its attribution statement may be magically increasing. But even the Eurocrats (perhaps in the face of the coming European Parliamentary election), are hedging their bets.

From that notoriously conservative rag, the N.Y. Times:

“Europe, Facing Economic Pain, May Ease Climate Rules

High energy costs, declining industrial competitiveness and a recognition that the economy is unlikely to rebound strongly any time soon are leading policy makers to begin easing up in their drive for more aggressive climate regulation.”

If we are ever more certain of the coming thermageddon, why are these geniuses putting their political survival ahead of the welfare of the whole planet? Have the Koch brothers bought the EU bureautocracy?

lolwot, the trend was going down after a peak cycle in 1940, but there were much more powerful cycles throughout the 20th century compared to the two previous centuries. Three of the cycles, those of the 80s, 90s and 00s, were nearly as strong as the big one. And that’s showing just sunspots; see my first chart for geomagnetic activity as well. I like the perspective of your chart, though; thanks for showing it.

“Max_OK
Do the woodfortrees plot for yourself and you’ll see that… ”
________

Max_CH what I see using OLS is the early 20th Century rise in sunspot numbers peaked around 1958 and declined thereafter. The decline is consistent with John Nielsen-Gammon’s statement: “Over the past 60 years, natural forcings (sun, volcanoes) have also had a cooling effect.”

Plotting 3 cycles at a time shows the end of the century to have had as strong a cycle as the middle of the century, and much higher than at the beginning of the century. It also shows the middle and late 20th century cycles to be much higher than the previous 4 cycles going back to 1810:

“Plotting 3 cycles at a time shows the end of the century to have had as strong a cycle as the middle of the century, and much higher than at the beginning of the century. It also shows the middle and late 20th century cycles to be much higher than the previous 4 cycles going back to 1810:”

“This contradicts what John Nielsen-Gammon said!”

_______

No, it doesn’t. You are talking about the amount. Nielsen-Gammon was talking about the change in amount.

Until recently it was thought that there were 28 cycles in the 309 years between 1699 and 2008, giving an average length of 11.04 years, but recent research has shown that the longest of these (1784–1799) seems actually to have been two cycles,[1][2] so that the average length is only around 10.66 years. Cycles as short as 9 years and as long as 14 years have been observed, and in the double cycle of 1784–1799 one of the two component cycles had to be less than 8 years in length. Significant variations in amplitude also occur. Solar maximum and solar minimum refer respectively to epochs of maximum and minimum sunspot counts. Individual sunspot cycles are partitioned from one minimum to the next.

^^^^ …. just for clarification on the dates I used
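The quoted averages are easy to check, using only the numbers from the excerpt above:

```python
years = 309          # span between 1699 and 2008
cycles_before = 28   # count before the 1784-1799 "cycle" was split
cycles_after = 29    # count after splitting it into two cycles

avg_before = years / cycles_before  # about 11.04 years per cycle
avg_after = years / cycles_after    # about 10.66 years per cycle
```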

So thanks for clearing that up for me, and I’ll take John Nielsen-Gammon off my —- list. I still think he is using trend to argue cooling vs warming and that it is not correct, but I’ll let him delude himself, since I’m just an idiot blog poster.

Variability goes both ways, otherwise it would not be called variability. The question is, do we have any reason to think that the net natural variability since 1951 would be more likely positive than negative. I cannot see any reason for such a conclusion, net zero seems a good and balanced guess.

We are back with the problem of inference. Some people talk about Bayesian inference and choosing the prior, some misuse the expression of “null hypothesis” to mean something similar. (Statisticians have a completely different meaning for null hypothesis.)

Looking for cyclic variability on the time scales of the available data gives an indication of a 60 y cycle that would now be in the same phase as at the start, and thus gives net zero. Nothing else that can be identified seems to point to another significant natural component. What’s left after the 60 y cycles are removed fits well with expectations of AGW presented before significant warming had occurred, and agrees also with present theoretical understanding.

That’s the basis for deciding what’s most likely, and gives zero for net natural variability.

Other people base their priors on other arguments. Their inference may result in non-zero best estimate for net natural variability.

The “null hypothesis” is that climate changes naturally (it has over the geological history of our planet).

The “uncertainty” arises in trying to identify and quantify all the many natural factors (forcings, cyclical variability, etc.).

This leads to even more “uncertainty” when trying to establish whether or not human factors have resulted in any warming and, if so, how much.

Arguments, such as: “we don’t know if total natural factors contributed to warming or to cooling, therefore we do not know whether or not human factors contributed to over 100% of the observed warming” are ludicrous as they are purely hypothetical.

The fact is, WE DO NOT KNOW what has caused the observed warming.

To say we are 95% certain that “most” of the warming (since an arbitrarily picked 1950) was caused by increased human GHG concentrations is simply a model-based “leap of faith”, and has nothing to do with reality.

CAGW proponents (such as IPCC) need to falsify the “null hypothesis” before making silly claims based on wild assumptions.

Max_CH, I recommend you follow convention and posit a hypothesis rather than a null. It also would be helpful if you made your hypothesis very specific. Your hypothesis and the corresponding null could be as follows

Hypothesis: All global warming since 1950 is a result of natural factors.

Null hypothesis: Not all global warming since 1950 is a result of natural factors.

Max_CH, I had a snappy response to your refusal to take a discussion of hypothesis testing seriously, but for reasons I can’t imagine it was rejected. Rather than edit and try again, I’ll just say if you can’t be serious about science, I am not going to take you seriously.

You stated: the “null hypothesis” is that climate changes occur naturally. And you said: A tough “null hypothesis” to falsify.

Your null hypothesis is unscientific because (1) it’s not statistically testable, (2) it and the corresponding hypothesis are not mutually exclusive, and (3) it is not specific (e.g., all warming? 50% of warming?)

If you wanted to be silly, you could say the null as you have stated it, can’t be rejected through statistical testing because it can’t be statistically tested. But you said you wanted to be serious, not silly.

1. How to “falsify” the CO2 control knob “null hypothesis” (upon which the whole CAGW ballyhoo is built): continue to pump gigatons of CO2 into the atmosphere and watch as the global temperature barely changes, continuing on a long-term path as we have seen long before human CO2 emissions were significant.

2. How to “falsify” the climate changes are essentially natural “null hypothesis” (to which many CAGW skeptics subscribe): continue to pump gigatons of CO2 into the atmosphere and watch the planet warm, as a result, by several degrees as predicted by the models cited by IPCC.

The experiment is running, Okie. Right now it looks like number 1 above is falling into place. But who knows what the future will bring? Not you. Not me. Not James E. Hansen. And certainly not the IPCC.

The reason why natural cycles are assumed to be neutral in their effect on the climate over the full cycle length is that during the cooling phase the outgoing emissivity will drop, so that there is a TOA energy imbalance with more energy entering than leaving.
During a warming phase outgoing emissivity will increase, because it is proportional to the 4th power of temperature, and there will be a TOA energy imbalance with more energy leaving than incoming.

So the inherent thermodynamics define natural variation as neutral in energy terms. It requires a change in a forcing, solar output or GHG concentration or albedo to alter the climate with a change in energy content.

Your theoretical explanation sounds logical, but you fail to mention that there could be many superimposed cyclical factors of different lengths, possibly lasting centuries or even millennia, rather than just decades.

So the inherent thermodynamics define natural variation as neutral in energy terms. It requires a change in a forcing, solar output or GHG concentration or albedo to alter the climate with a change in energy content.

There are only two external forces in the climate complex, solar and gravitational energy (neither of which is constant); the other variables are solely an increase/decrease in dissipation.

If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part. Thus the IPCC can and should consider it to be extremely likely that human influence dominates the net rise in temperature over the past 60 years.
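The cancellation claim is easy to illustrate with a toy decomposition. A sketch with made-up numbers (the 0.12 C/decade forced trend and the 0.15 C-amplitude 60-year sine are illustrative assumptions, not estimates):

```python
import numpy as np

def temp(t):
    """Hypothetical temperature anomaly: a steady forced trend of
    0.12 C/decade plus a 60-year natural oscillation of amplitude 0.15 C."""
    return 0.012 * t + 0.15 * np.sin(2 * np.pi * t / 60.0)

# Over exactly one full 60-year cycle the oscillation returns to where it
# started, so the net change is just the forced trend (0.012 * 60 = 0.72 C):
net_change = temp(60.0) - temp(0.0)

# Mid-cycle, though, the oscillation adds its full amplitude on top:
mid_cycle = temp(15.0)  # trend (0.18 C) plus the 0.15 C peak
```

Which is exactly the point being made: sampled over a complete cycle the oscillation contributes nothing net, however large it is within the cycle.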

“If, however, the natural cycles are much longer than 60 years – for example centuries or even millennia, then IPCC must recognize that essentially all of the observed recent warming could be attributed to natural factors, with an extremely uncertain human influence.”

This is better than what distinguishes Judy’s position from NG’s position:

One error of fact (the IPCC’s best estimate for anthropogenic contribution is ~100%, not <95%), one error of logic (double-counting evidence that the IPCC has already included), and one case of the answer being different because the question is different (1951-2010 vs. 1975-2013).

JC said: “It is something that arises from the over sensitivity of models to CO2 forcing combined with oversensitivity to aerosol forcing”.
No JC, it is not from that. I sent her an explanation about this “arising” problem with anthropogenic attribution. My pdf was called “Anthropogenic attribution: the Monte Carlo confusion”, and it would be cool if she posted my pdf on her blog (or at least the main ideas in there). Then everybody could discuss this IPCC mathematical error.

What bothers me most about the IPCC statements is the use of “95% confidence”. This makes it sound like the result of a drug trial or some such study where the statistics are clearly defined and agreed upon. I have a hard time believing there was some calculation that led to “95% confidence”. It would be much more honest to say “consensus opinion” or similar.

I agree. These IPCC “confidence assessments” are worse than meaningless: they give a false glitter of technical authority to something that should be modestly expressed as “in our opinion” and nothing more.

Our models indicate that over half of the warming observed since 1950 could theoretically be attributed to the increase in human GHG concentrations. Our models are, however, uncertain of what caused the statistically indistinguishable warming of the early 20thC, before there were any significant human GHG emissions. They are also not able to identify the reasons for the mid-century cooling cycle, when CO2 emissions had begun to increase significantly. As a result of these uncertainties, we are not able to give much confidence to the models when it comes to the attribution of late 20th century warming to human influences. More work is needed.

lolwot, to give them the benefit of the doubt there is something to be said for subjective probabilistic assessment by a panel of experts. But a true subjectivist (“Bayesian”) shuns the word “confidence” and uses the term “probability” instead. This is a nitpick but a revealing one, given that this muddled language comes from science experts who should know at least a little about this. But that said, a good Bayesian is expected to update prior probabilities with new evidence using Bayes’ Rule. Here we run into a little problem when the posterior probabilities show stronger affirmation of anthropogenic forcings but the updating evidence seems to point in the opposite direction. I’m not aware of how an application of Bayes’ Rule can produce an outcome like that. A Bayesian would say that such an uncertainty assessment scheme is incoherent. Perhaps I extrapolate too much and the IPCC never meant to be acting like good Bayesians. But they don’t act like good frequentists either. Maybe they just don’t care and they meant to use “confidence” as nothing more than a rhetorical device. Which is all that I take it to be.

“Apples with apples” comparison between the statistically indistinguishable early 20thC and late 20thC warming cycles: 1910-1940 and 1970-2000, using HadCRUT4, Mauna Loa after 1958 and Siegenthaler et al. ice core data prior to 1958.

Nicely done Max (the smart one). Makes a lot more sense than the definitional dodge that 60 year climate cycles are…hey..cycles. What goes around comes around, they end up right back where they started, at zero. No natural variability there, unless we need it as an excuse for the pause.

It is simpler. Take the lower end of the IPCC sensitivity, 1.5 C per doubling. Take the change in CO2 since 1950. Compute that this accounts for about 0.5 C of the warming. This is about 75% of the actual warming. Therefore even low sensitivity quite confidently allows CO2 to be 75% of the warming. Given their 95% sensitivity range confidence, you can see mathematically how this translates to their attribution confidence for the warming since 1950.
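That back-of-envelope can be checked in a few lines. A sketch (the concentrations and the observed warming figure are my illustrative assumptions: roughly 310 ppm in 1950, roughly 395 ppm recently, roughly 0.7 C observed):

```python
import numpy as np

# Assumed inputs: IPCC low-end sensitivity, and approximate CO2 levels
sens = 1.5                                  # C per doubling (low end)
c_1950, c_now = 310.0, 395.0                # ppm, approximate
observed = 0.7                              # C, assumed observed warming

# Warming attributable to CO2 at this sensitivity scales with log2(C/C0)
dT_co2 = sens * np.log2(c_now / c_1950)     # roughly 0.5 C

# Share of the observed warming accounted for
share = dT_co2 / observed                   # roughly 0.75
```

So even at the bottom of the sensitivity range, CO2 alone covers on the order of three quarters of the warming, which is the commenter’s point.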

My preference is to proceed through the estimate of transient climate response (TCR). IPCC states that TCR is likely in the range of 1.0 to 2.5 C. To reach the level of very likely we must extend that range. Noting that values greater than 3 C are judged extremely unlikely, the very likely range is perhaps 0.8 to 2.8 C.

CO2 concentration has increased from about 310 ppm in 1951 to about 395 ppm in 2012 or 0.4 %/year. The total increase is 35% of doubling when calculated using logarithms. Based on the estimates of TCR warming from additional CO2 is very likely in the range of 0.28 to 1.0 C with the best estimate near 0.6 C. The actual warming over this period has been about 0.6 C.

Based on this calculation the conclusion of AR5 is (just barely but still) justified based on CO2 alone. Estimates of the other components of human influence are such that they do not essentially change the conclusion.
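The arithmetic in the two comments above is easy to reproduce. A sketch using the figures given (note the 0.8 to 2.8 C “very likely” TCR range is the commenter’s own extension, not an IPCC statement):

```python
import numpy as np

c_1951, c_2012 = 310.0, 395.0                    # ppm, as in the comment
# Fraction of a CO2 doubling realized over 1951-2012 (forcing is ~log CO2)
frac = np.log(c_2012 / c_1951) / np.log(2.0)     # roughly 0.35

# Assumed "very likely" TCR range from the comment, C per doubling
tcr_low, tcr_high = 0.8, 2.8
dT_low, dT_high = frac * tcr_low, frac * tcr_high  # ~0.28 C to ~0.98 C
```

With observed warming of about 0.6 C over the period, that lands near the middle of the range, which is why the comment calls the AR5 conclusion “just barely but still” justified on CO2 alone.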

Natural factors have caused our planet’s climate to change over its entire geological history. It is the “null hypothesis” of climate change.

Some of these factors are known, but more are arguably still unknown.

To AS-U-ME that these same natural factors that have shaped our climate over the past were not primarily responsible for the changes observed during the 60+ year “blip” since 1950 is going out on a limb.

Max (the real one),
That is also plausible and it makes more sense than Pekka’s version of what is logical. We calculate that expected value and expected return stuff in investment analysis and capital budgeting. We don’t really expect the value to turn out to be what we estimated. What are the odds that any given alleged 60 year climate cycle will average out exactly to a net zero?

No Pekka. It doesn’t matter (for a Bayesian) whether the prediction was made before or after the data–that’s one of the selling points the Bayesians use to justify their method. You simply have a mathematical relationship between the prior, the data, and the posterior. Taking any two as given you can derive the third. We can always turn the question around to ask “How strong would your prior have to be to get this posterior, given the data?”
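For a binary hypothesis the inversion is mechanical: posterior odds equal prior odds times the Bayes factor, so given any two you can recover the third. A sketch (the 0.95 posterior and the 0.5 Bayes factor are illustrative numbers, not anyone’s actual assessment):

```python
def implied_prior(posterior, bayes_factor):
    """Invert Bayes' rule for a binary hypothesis:
    posterior odds = prior odds * Bayes factor,
    so prior odds = posterior odds / Bayes factor."""
    post_odds = posterior / (1.0 - posterior)
    prior_odds = post_odds / bayes_factor
    return prior_odds / (1.0 + prior_odds)

# If the new evidence mildly disfavours the hypothesis (Bayes factor 0.5),
# holding a 0.95 posterior requires a prior of about 0.97:
p = implied_prior(0.95, 0.5)
```

That is the point of turning the question around: it tells you how strong the prior had to be for the evidence to leave the posterior where it is.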

One problem I have with these uber-high sensitivity estimates with postulated natural cooling being counterbalanced by ACO2 is the implication that absent human emissions we would have been plunged into a near Ice-age. It also seems odd to suppose that similar periods of warming pre- and post-ACO2 do not have similar causes.

The skeptics do need to challenge Judith’s assumption that natural variability has been positive, and not just slightly, but accounting for 28% of the warming as a confident minimum, since 1950. Where does this confidence come from suddenly, or is it just a flaw in the logic?

But Max (not you maxie, the rational one) the 1998 El Nino was unprecedentedly strong because of energy added by…AGW. And of course you will recall that subsequently El Ninos got even stronger and more frequent, due to…AGW. So, there has been no warming pause. As evidenced by the stronger and more frequent El Ninos.

Setting the question and choosing the prior are essential in Bayesian analysis. These choices must not depend on the data used in the analysis itself. Therefore the nature of the demonstrably prior knowledge is very important.

In this case the question is chosen as determining a parameter that describes the strength of AGW. That would not be a proper question if AGW were inferred from the data itself, but it is proper, because AGW was the best existing scientific thinking before the warming really started.

The hiatus is real enough to influence the conclusions to some extent. Relative to the overall evidence on the basics of AGW the hiatus leads only to minor corrections in some details. One of these corrections is a small reduction in the best estimate of TCR. That correction is of the order of 0.1 C.

The hiatus has also added evidence on the strength and properties of natural variability, but it seems difficult to transform that empirical data into deeper understanding. Proposals like the stadium wave are attempts towards such understanding, but it remains to be seen where those attempts lead. At present they don’t affect conclusions on TCR.

I’m sorry, Pekka, but simply restating your point of view doesn’t refute the point that Bayesian analysis is specifically touted to minimize the problem of data snooping via model selection using the Bayesian information criterion. But beyond that, you can simply invert the procedure and ask yourself what your prior would have to be in order to believe a particular posterior given the data. That tells you how much your beliefs are really constrained by the data, which is what you actually want to know.

Stern is in Davos beating the drum for an organisation called the New Climate Economy, headed by the former president of Mexico, Felipe Calderón. Its aim is persuade finance ministers rather than simply environment ministers that tackling climate change should be a top priority.
This makes sense. Finance ministers hold the purse strings and dictate economic policy. They are much higher up the political food chain than environment ministers.
So what sort of pitch is Stern making?
http://www.theguardian.com/business/economics-blog/2014/jan/23/lord-stern-climate-change-review-davos

“If natural cycles are regular and repeatable, the net temperature change over one complete natural cycle will be approximately zero. The warming during part of the cycle is cancelled by cooling during the other part of the cycle. What’s left is the long-term rise caused by man.”
_____
Unfortunately, the faulty logic here is the unwarranted assumption that the nature and character of “natural cycles” would not be affected by the significant alteration of the GH composition of the atmosphere. It thus becomes increasingly hard to separate the purely “natural” from the forcing caused by anthropogenic activity. These “natural” cycles do not exist separate from the composition of the atmosphere. There’s a very reasonable chance, based on physics, models, paleoclimate data, and actual current observations, that having the highest GH gas levels in millions of years (some 40% higher than the Holocene average) is affecting the “natural cycles”. Thus, just as with the frequency and nature of extreme weather events, the frequency and nature of “natural” cycles, whether they be ocean, atmosphere, or combinations thereof, are very likely to be affected by the composition of the system they are cycling through.

Some “natural” cycles, or indices or measurements of such, that may be altered by anthropogenic GH warming include: MJO, ENSO, AO, PDO, AMO, NAO, etc.

The biggest problem appears to be the misuse of the word ‘confidence’, as if it was mathematically derived from the data. In this context, the disputed statement carries no more weight than a show of hands among like minded individuals. A group-think vote in an ad hoc organization.

I’ve been having fun on the Nature ‘Missing Heat’ thread about just this point. Nate has got his knickers in a twist about how unfair it is for me to update Hansen’s graphic to the current day and suggest that ‘constant forcing’ can mean either no CO2 rise or a very low sensitivity.

Nate currently thinks that a ‘running mean’ isn’t a filter! I’ll have to remember that one for any future conversations with him :-)
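For the record, a running mean is a textbook FIR low-pass filter, and its frequency response falls straight out of the FFT of its kernel. A sketch (the 12-point window is an arbitrary choice for illustration):

```python
import numpy as np

N = 12                                    # hypothetical window length
kernel = np.ones(N) / N                   # the running-mean kernel
H = np.abs(np.fft.rfft(kernel, 1024))     # its frequency response magnitude

# Unity gain at DC (it passes the mean), complete rejection at the
# Nyquist frequency, and nulls at multiples of 1/N of the sample rate:
# the classic signature of a (crude) low-pass filter.
dc_gain = H[0]
nyquist_gain = H[-1]
```

So calling a running mean “not a filter” is hard to square with the mathematics; it is just a low-pass filter with notoriously leaky sidelobes.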

Extrapolate the current trend of CO2 increase and the trend of temperature increase over the last 16 years and you don’t get that. I posit the last 16 year trend will last for another 100 years. Changes things a little bit.

Yes, it is true and the denial-o-sphere is revving up to futilely dispute this finding.

The rate of CO2 buildup has absolutely nothing to do with TCR, as it is the log of CO2 that matters, just as Hansen and the other climate scientists have been saying all along.

Whether Scenario C is correct is irrelevant right now, as all one has to do is look up the current value of CO2 and reference it to what it was a number of years ago. Then plug in the value of TCR sensitivity and there you go: the global temperature estimate, very straightforwardly. For good measure, multiply that value by 1.5 to get the land temperature anomaly estimate.

WHT: “Scenario C is the dotted line the measured temperatures are following, regardless of how it was created.”

Again, why be such a C-student?
The temperature is following the actual temperature record, not some scenario. Go look up the GISS gistemp temperature record and you will see what it is doing over time.
Then you can compare it to a model that includes factors such as CO2, SOI, LOD, etc and explain all the fluctuations and pauses over time.
http://contextearth.com/2014/01/22/projection-training-intervals-for-csalt-model/

Forget CO2 etc. If temps go down over the next 50 years, your CSALT and all other contraptions will go into the dust bin of history. I wonder what has to happen before the rationalizations stop and the actual scientific inquiry begins.

Having talked about what MUST happen if humans emit lots of CO2 (neatly mechanistic), we move on to what MUST happen within a natural cycle (neatly mechanistic). Apparently the natural ups MUST cancel the natural downs etc and you MUST get zilch change on the average within your nice neat cycle. Elegant! If there are any untidy edges making things less elegant you just say the word “volcano”, maybe.

But what if scholars and researchers checked stuff out in a big way and published a lot less (because they don’t know yet, duh)?

mosomoso, one must be brave to accept that the world is constantly changing and uncertain, in ways far from fully understood by humans. Those less brave, who are not prepared to embrace the true nature of existence, must seek to constrain it to a level of simplicity and dependability with which they are comfortable. One way is through religion, but, as we see here, there are others too.

Unfortunately, the faulty logic here is the unwarranted assumption that the nature and character of “natural cycles” would not be affected by the signficant alteration of the GH composition of the atmosphere.

Abstract: Detrended fluctuation analysis is used to test the performance of global climate models. We study the temperature data simulated by seven leading models for the greenhouse gas forcing only (GGFO) scenario and test their ability to reproduce the universal scaling (persistence) law found in the real records for four sites on the globe: (i) New York, (ii) Brookings, (iii) Tashkent and (iv) Saint Petersburg. We find that the models perform quite differently for the four sites and the data simulated by the models lack the universal persistence found in the observed data. We also compare the scaling behaviour of this scenario with that of the control run where the CO2 concentration is kept constant. Surprisingly, from the scaling point of view, the simple control run performs better than the more sophisticated GGFO scenario. This comparison indicates that the variation of the greenhouse gases affects not only trends but also fluctuations. [ my bold ]
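For readers unfamiliar with the method, here is a minimal sketch of first-order detrended fluctuation analysis (my own illustration, not code from the paper; the scales and the white-noise input are arbitrary):

```python
import numpy as np

def dfa1(x, scales):
    """First-order DFA: RMS fluctuation F(n) of the integrated series
    after removing a linear fit within each window of length n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        t = np.arange(n)
        resid = []
        for i in range(len(y) // n):           # non-overlapping windows
            seg = y[i * n:(i + 1) * n]
            fit = np.polyval(np.polyfit(t, seg, 1), t)
            resid.append(np.mean((seg - fit) ** 2))
        F.append(np.sqrt(np.mean(resid)))
    return np.array(F)

# The scaling (persistence) exponent is the slope of log F vs log n.
# Uncorrelated noise gives roughly 0.5; persistent records give more.
x = np.random.default_rng(0).normal(size=4096)
scales = np.array([16, 32, 64, 128, 256])
alpha = np.polyfit(np.log(scales), np.log(dfa1(x, scales)), 1)[0]
```

The paper’s test is essentially whether model output reproduces the exponent found in station records, which this kind of routine estimates.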

Additionally, the statement holds only for systems that are closed to energy input and energy rejection. There is nothing to say if the system is open to energy exchanges between the combined system and the environment external to the system.

And the temperature of the interacting sub-systems will range between that of the sub-system having the highest to that which has the lowest and the temperatures of the sub-systems at the end of a “cycle” will depend on the degree of interactions between the sub-systems.

Energy, of course, is always conserved. The energy content at the end of a cycle for a system closed to exchange with its environment will always be equal to the initial energy. The Earth’s climate systems are not a closed system, and so this statement does not hold. The distribution of energy among the sub-systems is the issue that determines the temperature of each sub-system. It is also possible that significant energy can be stored as latent energy content in the solid phase of water, again depending on the temperatures of the contents of the sub-systems.

There will always be temporally heterogeneous aspects relative to the Earth’s climate systems. These will always introduce wiggles into the temporal response of the systems. The energy input into, and rejection from, the open Earth’s climate system will also introduce wiggles. It’s a stretch to claim that the nature of the wiggles is such that the effect over a complete “cycle” on the temperatures of the sub-systems will average out.

This old-ish paper from 2001 indicates that the GCMs are likely predicting / projecting sub-system temperatures higher than the empirical data indicate.

Abstract: We study trends and temporal correlations in the monthly mean temperature data of Prague and Melbourne derived from four state-of-the-art general circulation models that are currently used in studies of anthropogenic effects on the atmosphere: GFDL-R15-a, CSIRO-Mk2, ECHAM4/OPYC3 and HADCM3. In all models, the atmosphere is coupled to the ocean dynamics. We apply fluctuation analysis, and detrended fluctuation analysis which can systematically overcome nonstationarities in the data, to evaluate the models according to their ability to reproduce the proper fluctuations and trends in the past and compare the results with the future prediction.

From the concluding discussions:

We have also obtained similar qualitative behavior for other simulated temperature records. From the trends, one can estimate the warming of the atmosphere in future. Since the trends are almost not visible in the real data and overestimated by the models in the past, it seems possible that the trends are also overestimated for the future projections of the simulations. From this point of view, it is quite possible that the global warming in the next 100 yr will be less pronounced than that predicted by the models.

This more recent paper from 2012 indicates that the GCMs likely do not reproduce with high fidelity the natural variations in the Earth’s climate systems.

What does that tell us about the contribution to warming over the 1979-2000 period?

Answer: Nothing.

But wait, surely natural variability since 2000 having a cooling effect implies that natural variability must have contributed warming over the 1979-2000 period?

Answer: Why would it imply that? Why would it imply the opposite happening previously? It would only imply that if you were arguing for some kind of cycle centered with a peak around 2000. What evidence do you have that such a thing is more likely than not?

“PDO! PDO!”

Well that’s an approximately 60 year cycle that averages to zero. Now please see John Nielsen-Gammon for the implications of that.

The fact that it is cooling slightly today despite unabated human GHG emissions and CO2 concentrations reaching record levels simply demonstrates that there is something other than AGW that is driving our climate.

That same “something” could also have been driving our climate during the late 20thC warming cycle.

Both the stadium wave and PDO were said to have 60 year cycles explaining deviations from a steady trend. Taking 60 years, we average those out, so what else do the skeptics have when we have ruled those two out? Not much, that I see. Vaughan Pratt nicely showed that when you remove the long and short cycles, what’s left is an upward curving trend that is surprisingly like the one expected from CO2.

If natural variability is zero, what do you all have left to explain the pause? Warming is supposed to be accelerating. How long can this go on, before you all get discouraged and embarrassed enough to disappear from this blog?

DM, do you support JC’s assertion that 28% of the warming is the absolute minimum possible value (95% confidence) for natural variability since 1950? The sun has been declining, so no help there. Also natural variability can be 0.1 C, and that is the size of the pause at this point. I don’t think it is contradictory to say that natural variability has a standard deviation of 0.1 C and an average value over long periods of zero.

I don’t recall seeing Judith make that assertion. If she did say “absolute minimum” I would not agree. But what has that got to do with what I asked you? You are dancing around in circles trying to have it both ways. You are a denier of natural variability, except when you need it.

DM, her lower bound is 28% of the warming is natural. If she gives bounds to be compared with IPCC, she would probably be implying 95% confidence of being within those bounds.
On the second point, natural variations of 0.1 C do happen, a large example is occurring now with a negative phase. The anomaly went from +0.2 in 1998 to -0.1 in 15 years which was enough to mask a trend of 0.2 C per decade. This can affect individual decadal trends in both directions.
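The masking arithmetic in that comment can be spelled out directly (the 0.2 C/decade trend and the +0.2 to -0.1 C swing are the comment’s numbers, not mine):

```python
# Underlying forced trend of 0.2 C/decade over 15 years (1998 onwards)
forced = 0.2 * 1.5             # 0.3 C of forced warming

# Natural swing from +0.2 C (1998) to -0.1 C over the same 15 years
natural = -0.1 - 0.2           # -0.3 C natural contribution

# The two cancel, producing a flat decade-and-a-half: the "pause"
observed = forced + natural    # approximately 0 C net change
```

So a natural component with a standard deviation around 0.1 C is enough to flatten, or for that matter double, a 0.2 C/decade trend over such a span.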

I may have missed something but why is the discussion limited to 1950 to present. In the longer scheme of things, there was a major dip in the early 1900s followed by a monotonic increase in temperature until around 1940. Then from the 1940s to 1950 temperatures dropped, significantly, and then wobbled around flat until around the 1970s. This period from the 1940s to around 1970 came during a period of rapid rise in man-made GHGs; at least I recall the economy was growing sharply and all the factories and homes were burning soft coal in those days. The US during that period was what China has been in the past 10 years. Is it just a convenient starting point to begin the discussion of “steady growth in global mean temperatures” beginning in 1970? Attribution is a funny thing, either you can make it, or you can fake it and try to get others to believe the story. Numbers do not lie.

It’s roughly when the postulated AGW became significant (not that I believe in any of it). We emit now ~10 GtC/year in total CO2. In 1950 it was ~2 GtC/year. In 1900 it was ~1 GtC/year. Total emissions in the last 15 years are greater than for the whole 1750-1900 period (150 years).

What about Ruddiman’s theory of agriculture, especially rice cultivation, emitting major amounts of CO2 over the last 5,000 years? Anyone have information on estimates of those baseload carbon emission amounts?
Scott

I recall that the Ruddiman hypothesis is that the rice and agriculture plus tree and tall grass prairie clearing have prevented or delayed the coming ice age. Scientific American article about 2005 plus his books. Interesting thoughts but like many assertions little accepted proof. Much like heat hiding in deep ocean and there is no pause in the global mean temperature. Lots to learn and study. Not much of a consensus given the uncertainties. I am concerned that the models are not winnowing down to those that more closely adhere to the empirical data.
Scott

No, but after watching the shenanigans of the IPCC, the creme de la creme of Climate Science, and the politicians with which they are symbiotic, it would be well to remember the old saw: ‘Figgers don’t lie, but liars figger.’

In this best-case-for-natural-variability scenario, natural variability is strong enough to bring global warming to a halt during the negative portion of the natural oscillation, and the present halt is likely to continue for at least another decade. Conversely, during the positive portion of the natural oscillation, natural variability and anthropogenic forcing are working together to drive temperatures upward.

Uh, no.

A better-case-for-natural-variability scenario is one where the apparent 60 year cycle is strong enough to bring ‘global warming’ to a halt at the inflection point of the natural oscillation, and then reverse it for some years or decades on the negative swing.

In an even-better-yet-case-for-natural-variability scenario, natural variability is composed of more than one 60 year, zero-sum, ocean mediated cycle, and includes additional non-anthropogenic climate trends. You know, like the readily apparent pre-SUV rise in GAST climbing out of the LIA. Or the CO2-immune descent into it, for that matter.

The likes of JN-G find such stickmen and brickmen to be bogeymen. Huffing and puffing does not blow such men down, and the JN-Gs well know it. So they play with dolls of straw and pretend all is well. And they do this from under the bed covers, lest they accidentally cast eyes on the MWP golem striding from the closet with a broken hockey stick in its hands. Horrors!

The pause needs an explanation, which AGW does not meet. That’s why the pause is killing the cause. But don’t give up, lollie. By the way, how many 60 year climate cycles can you point out that netted out to zero?

The late 20th century warming did. The latter needs a decent multi-decadal explanation, which AGW more than meets.

As did the early 20th century warming. It also needs an explanation, which – as with the late 20th century warming – AGW does not meet.

Only an idiot would rest their case on the notion that there is no significant natural variation unaccounted for in current climate models. The idiots that did that are now forced into being the idiots that must rest their case on the notion that there is no significant natural variation unaccounted for in current climate models, except for one period 60 cycle.

Science will continue to progress by determining exactly what kind of idiot these idiots are going to be next.

Too many people focus on CO2 and the deficiencies thereof. We must acknowledge that fossil fuels are burned primarily for the heat content, and that the heat emissions alone, from our energy use are much more than enough to account for temperature rise, melting glaciers, etc. To totally discount this primary variable requires a satisfactory explanation how this heat is ultimately dissipated without leaving evidence of its previous presence. So far I haven’t seen one.

So… we don’t understand climate yet, I mean natural climate. Especially if we don’t want to take mankind into account, I mean natural mankind; especially if we don’t want to take human CO2 emissions into account, I mean the artificial ones.

We should look at recent weather and understand that, according to Dr. Tim Ball (WND radio), this “major cooling cycle could match [the] Little Ice Age… If you look at the historic record, and I mean going over 10,000 years, this pattern occurs as the earth starts its cooling down process. And that’s what’s going to happen,” Ball said. “We’re going to be in this cooling until at least 2040.”

We’ve not had this level of GH gases in the past 10,000 years, or the past 800,000, nor at any time in the Quaternary Period. Tim Ball will have to look back much further than 10,000 years – like 3.2 million years, to the Pliocene – to see how periods of cooling behaved when GH gas levels were so high. It is more probable that Tim Ball is wrong than right about his 2040 number – though AGW skeptics will rally round any notion that such a long period of cooling could be ahead.

… “this level” as you say is in ppm, most of which is natural. It would be far more logical to blame global warming on the disappearance of passenger pigeons, inasmuch as their numbers went from billions to zero — a truly significant change — and all at the hands of man.

There is only a very very small chance that the current high levels of GH gases were caused by anything other than human activity. There is also only a very very small chance that they are not at their highest in the entire Quaternary Period.

But, they’re not high. Water vapor — accounting for 95% of all greenhouse gases — is a far bigger player.

“Carbon dioxide is 0.000383 of our atmosphere by volume (0.038 percent) … Only 2.75 percent of atmospheric CO2 is anthropogenic in origin … If the atmosphere was a 100-story building, our anthropogenic CO2 contribution today would be equivalent to the linoleum on the first floor.” ~Reid Bryson

I somewhat agree with N-G, in that if you can capture the appropriate end points of a full cycle, the net increase is likely to be the “real” human component, and it is likely to be non-zero (this is why I consider myself a “lukewarmer”). I do not think the current models would do particularly well in that analysis, however, and bug-eyed demands to spend huge sums of money RIGHT NOW, HURRY, HURRY would likewise fall apart.

Btw, I would also happily sign up for JC’s “28-72%” range as well, tho I doubt I could defend the reasons for why it is reasonable as well as she could. But my visceral reaction to reading and internalizing all the blogs for several years now has long been in that range.

Why all this fuss about what John Nielsen-Gammon has to say? OK, he is a professor from Texas, but he was not called to testify at the recent Senate inquiry. IPCC supporters will no doubt disagree with what Judith had to say, but John seems to be mostly in agreement.

The ‘greenhouse gas’ theory seems to have survived the natural vs anthropogenic forcing argument, despite being so tenuous. It is tenuous because the IPCC has failed to show any functional relationship between the absorption modes of the CO2 molecule and tropospheric temperatures. Here the availability of vibration modes of the CO2 molecule is vital to the argument, yet no one mentions it. The two on/off modes of warming during the 20th and 21st centuries ought to be sufficient warning of the importance of understanding the vibration modes of the CO2 molecule.

He’s our state climatologist. He should have been called to testify. The lady from Texas who testified should probably stay away from Sweetwater. It’s wind country. You can’t step over more than two dozen rattlesnakes without running smack dab into another wind turbine. In Sweetwater, two dozen rattlesnakes can be as little as 100′.

Didn’t you notice that it was your nemesis Judith’s comment, joshie? Surely, you can find something wrong with it. What is she advocating? Does her praise for N-G pass the selective reasoning test? Come on joshie. Think!
Don’t let us down.

John N-G is someone I hold in high regard, I pay attention to what he has to say.

John N-G is a member of our NASA Retiree Climate Study Group and I hold his opinion in high regard, but we do disagree about some stuff. He is a Climate Scientist who will discuss and debate with people who disagree. He is still a Scientist. Consensus people who do not discuss and debate are not scientists.

Been away in the caravan doing some serious sightseeing, fishing and camping at beautiful places Joshua. Have been lurking but find much of the stuff on climate change just a bit repetitive and that I have nothing useful to add these days.

“Been away in the caravan doing some serious sightseeing, fishing and camping at beautiful places Joshua.”

Sounds rough.

Only got to Australia once so far, when a much longer expected stay got cut short by a series of unfortunate events that took a serious bite out of my travel funds – necessitating moving on more quickly than I wanted to cheaper places to travel like Thailand and Indonesia.

On my bucket list is a return trip of maybe six months with a vehicle. The longer that takes and the older I get the more likely that will be a caravan; I used to look down my nose at car-camping but I'm not so proud anymore.

…find much of the stuff on climate change just a bit repetitive and that I have nothing useful to add these days.

When I get away from the climate wars nonsense for an extended period and then come back, it’s always amazing to me just how much nothing has changed.

Unfortunately, I have a habit of not letting having nothing useful to add prevent me from participating. Of course, if everyone followed your example there would be precious few comments made, and imagine how much worse off the world would be without Jell-O flinging in climate war blog comments.

I’ll shoot an email for tips before I go. Asking any of my other much beloved Aussie “denizens” would more than likely end up with me in an alligator-filled swimming hole that I dove into so as to soothe the tarantula bites I got after being chased by rampaging cassowaries. (No wonder you folks are so ornery).

Sorry to hear about your misadventure in Australia. It sounds like you were much younger and less wise then. It costs more to travel in Australia and the distances between attractions are generally much greater than in Europe, Canada or the US.

The cities are not notable for architecture except perhaps Sydney but the system of state and national parks around the countryside is comprehensive and well managed, with a huge number of very cheap or free camping sites to enjoy.

My plan was to buy a car in Sydney and head out to Cairns, Townsville, and Darwin (my favorite place on my first visit) and via Melbourne, Adelaide, Alice Springs, etc. (places I never got to on my first visit) before selling the car in Perth. I figured it would take six months.

I’d read that as:
Never underestimate someone who might have a logical opinion that varies from your own especially if that disagreement is relatively small in comparison with all other opinions you both have.

I think you have misunderstood John N-G’s point if you think that the disagreement he has with the logic of Dr Judith Curry’s position is relatively small.

Because natural cycles {or quasi-periodic fluctuation of characteristic duration and magnitude} create an energy imbalance that negates the warming/cooling they cause, they are by definition thermodynamically neutral.
They can alter the rate of warming or cooling from an external forcing, but not alter the total amount of energy change. Therefore any trend over a time period of a cycle length must ALL be attributable to the external forcing, not the natural variation.
That can only change the pattern of variation in the trend until climate reaches a new energy equilibrium in response to an external forcing.

There is no reason to assume that there is any manmade warming at all and there is no reason to assume nature shouldn’t warm the planet for 0.6K/century with no help from us. So J N-G is merely building a conclusion directly from a baseless assumption.

As ever they try to argue black is white and up is down but they only fool themselves. The alternative is to admit they’ve been teaching the wrong thing for 30 years.

There are numerous things here I should comment on but I want to take on the idea that “Natural Variability” is an excuse for leaving things unexplained. That is pure nonsense and an indication of laziness. There are certain natural phenomena that we know about, have long-term measurements of, and know their effects upon climate phenomena. One of them is the partial pressure of carbon dioxide in the atmosphere. Since 1958 it has been measured at Mauna Loa and for earlier times there are ice core data from Law Dome and other sources. The global warming hypothesis says that this atmospheric carbon dioxide absorbs outgoing infrared radiation and thereby causes greenhouse warming. And the more of it we make and put in the air the more warming we will get. But if you put the Keeling curve and its extension on the same graph with temperature it is very clear that they do not match. Why not? Theory says that greenhouse warming and rising carbon dioxide should be parallel but temperature goes up and down every which way while CO2 rises smoothly. Is this “natural variation” that is playing tricks with us? There are some who implicitly take that route when they approximate sections of the temperature graph with a straight line or use computer smoothing to get rid of irregularities. Both are an expression of arrogance and stupidity. First, there are “regular” irregularities that are easy to understand. If you have a decent copy of global temperature going back to the nineteenth century you can’t miss the sawtooth pattern in it. What you are looking at is the stamp of the ENSO oscillation that has existed since the Panamanian Seaway closed. All the peaks are El Ninos and all the valleys are La Ninas. To obtain global mean temperature, which is needed to determine trends, you must put a dot at the center of every line connecting an El Nino peak with its neighboring La Nina valley and then connect the dots.
The amplitude and phase of these features both vary, which makes computer smoothing unreliable. If there is an underlying temperature increase or decrease it will show up as an upward or downward trend of this sequence of dots. If there is neither warming nor cooling, the dots will fit a horizontal straight line. If you now look at the temperature chart for the twentieth century where temperature does seem irregular some facts stand out. There are several instances where warming suddenly starts. The first such startup takes place in 1910. It is a clean startup. It was preceded by ten years of cooling. This is quite a long stretch of warming and it continues until 1940. There it suddenly stops and temperature plunges to the cold winter regime of World War II. Most literature tells us that total warming during the entire twentieth century was 0.8 degrees Celsius. The early century warming from 1910 to 1940 raised global temperature by 0.5 degrees, substantially more than half of the total warming for the entire century. Question: was this warming a “natural” warming or a “man-made” warming? To answer it you must know how to distinguish natural and man-made warming from one another, and here is where knowledge of the Keeling curve is needed. Radiation laws of physics specify that if you want to start any greenhouse warming you must add carbon dioxide to the atmosphere. That is necessary because the absorbency of the gas in IR is a property of its molecules and cannot be changed. Accordingly, checking the Keeling curve extension for 1910 we find that it is completely smooth – no sign of any increase of CO2. It follows therefore that under no circumstances can this warming be considered greenhouse warming. This is reinforced by the fact that the warming stopped suddenly in 1940. Once a greenhouse warming gets started it is almost impossible to stop it because this requires plucking out all those CO2 molecules from the air.
This is just one example where we were able to identify the type of warming involved. There is more to it. This should be applied to all other parts of the global warming curve. These guys have billions of dollars of research money and there is no reason why they could not apply it for this purpose.
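The “connect the dots” procedure described above can be sketched numerically. The anomaly values below are invented purely to illustrate the mechanics, not taken from any temperature record:

```python
# Alternating El Nino peaks and La Nina valleys (invented anomalies, deg C).
extrema = [0.30, -0.20, 0.35, -0.15, 0.40, -0.10]

# Midpoint of each line connecting a peak to its neighboring valley.
midpoints = [(a + b) / 2 for a, b in zip(extrema, extrema[1:])]
print([round(m, 3) for m in midpoints])   # [0.05, 0.075, 0.1, 0.125, 0.15]

# A horizontal run of midpoints would mean neither warming nor cooling;
# here the steady rise marks an underlying trend beneath the ENSO sawtooth.
```

As the comment notes, varying amplitude and phase of real El Nino/La Nina pairs make this midpoint sequence noisier than simple smoothing would suggest.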

Meh, time for an oldie but goodie: Ignore the millennial at your perennial.

No explanation here for the rise from the depths of the Little Ice Age. If it’s all attributed to AnthroGHGs, think how cold we’d be without it. If it’s not from AnthroGHGs, or not much of it, where is JN-G?

Where besides always looking to justify the alarmist scenario. Gotta respect him for that, the poor thing.
==========

“Not hard to figure out when the AMO goes cold in sync with the PDO in about 6 years time what’s going to happen.”
——
This would be the great hope of “skeptics” – that something, anything will switch back, turn around, go cool, etc., to disprove AGW. But here’s the thing: the “natural” variability of Earth 2014 does not equal the natural variability of Earth 1950 or Earth 1750 or Earth 1250. Humans have radically altered, and continue to radically alter, the atmosphere and biosphere and hydrosphere. The planet is much different and so the “natural” cycles of the planet are also bound to be different. The closest analogue we can find is 3.2 mya.

Guys there is no need to go into the radiative science at all. We have effectively tested the theory by experiment. A large plug of CO2 was injected and it was expected to increase the rate of temperature rise. However it reduced the rate. Stratospheric cooling was supposed to continue but it stopped in 1995. The Argo float temps were supposed to take away the bucket-based guesswork and the natural cyclic noise and show an unequivocal warming signal but they initially showed a cooling then after adjustment they showed insignificant warming. The tropical troposphere was supposed to show evidence of a large positive feedback of water vapour yet it didn’t appear.

These are real experiments and the hypothesis flunked every single one. The warming is natural and equal to 0.6K/century. Nature gets itself into and out of ice ages all by itself, it produces warm and cool periods in between all by itself. Whatever the mechanism of natural variation is, the climate scientists have only proved just how little they really know about nature and how easily led they can be by a few years of warming between 1975 and 1998 which (hopefully) they now realise was merely the start of a perfectly natural pdo cycle.

Why these ‘scientists’ can’t connect the dots is mostly to do with the fact that most of them would not be employed without the CO2 scare. Some of us predicted this outcome as soon as we heard the new ice age scare had run its course due to the planet warming up. Some of us fully expected these natural cycles to demonstrate a downturn from time to time. At some point the consensus will return to the dominant sun theory.

“Our analysis of the latest satellite datasets and model simulations reveals that a model-predicted anthropogenic fingerprint pattern is consistently identifiable, with high statistical confidence, in the changing thermal structure of the atmosphere. Multidecadal tropospheric warming and lower stratospheric cooling are the main features of this fingerprint.”

R. Gates, I reckon Ben Santer was relieved to find that “fingerprint” of anthropogenic influence. Some though seem to think that the fingerprint is smudged a touch, given that the rates of warming/cooling are more consistent with a lower “sensitivity” than the alarming rate “forecast” prior to the 1995 shift in the rate of stratospheric cooling.

It takes more like a hundred centuries to come out of an Ice Age, and the warming rate is more like 0.06 C per century, a tenth of what we saw in the last century. It just needs the correct numbers to give the correct perspective, otherwise you can hopelessly mislead yourself.
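The comparison works out with round numbers (the ~6 C deglacial warming and ~10,000-year duration used here are ballpark assumptions, not figures from the comment):

```python
# Deglacial warming spread over roughly ten thousand years.
deglacial_warming_c = 6.0     # deg C, approximate glacial-to-interglacial rise
duration_centuries = 100.0    # ~10,000 years

rate = deglacial_warming_c / duration_centuries
print(rate)                   # 0.06 deg C per century
print(round(0.6 / rate))      # last century's ~0.6 C was ~10x faster
```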

Brandon and GaryM should consider reviewing the work of members of The American Statistical Association(ASA) Advisory Committee on Climate Change Policy to see if any are “CAGW advocates.” If so, Brandon and GaryM could serve science by pointing out how bad these guys are at combining statistics and logic.

Max_OK, I freely accept the possibility people know about the issue I highlight. If so, they aren’t bad at logic or statistics. They’re just culpable for their silent tolerance of bad logic and statistics. I don’t really care to figure out which applies to who.

As for the second link, that’s an annoying artifact introduced by WordPress. If you don’t begin a link properly (such as by leaving off http://www), it automatically appends your link to the current page’s URL. It’s silly.

It’s easy to fix if you edit the URL it sends you to, but here’s a fixed version.

Nielsen-Gammon then goes into a relatively lengthy argument
…
But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part. Thus the IPCC can and should consider it to be extremely likely that human influence dominates the net rise in temperature over the past 60 years.
*****
So here, N-G assumes there are no natural cycles greater than 60 years. This seems unlikely given the century-scale length of some ocean cycles.
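N-G’s averaging argument, and this objection to it, can both be seen in a toy calculation. All numbers below are invented for illustration, not taken from any dataset:

```python
import math

# Toy series: a linear "forced" trend plus a 60-year sinusoidal "natural"
# cycle.  The rate and amplitude are made-up illustration values.
FORCED_RATE = 0.012   # deg C per year (assumed)
CYCLE_PERIOD = 60.0   # years
CYCLE_AMP = 0.25      # deg C (assumed)

def temperature(t):
    """Temperature anomaly at year t: forced trend plus one natural cycle."""
    trend = FORCED_RATE * t
    cycle = CYCLE_AMP * math.sin(2 * math.pi * t / CYCLE_PERIOD)
    return trend + cycle

# Over exactly one full cycle the sine term cancels, so the net change
# equals the forced trend alone -- N-G's point.
print(round(temperature(60.0) - temperature(0.0), 3))   # 0.72

# Over a window that is not a whole number of cycles, the "natural" part
# leaks into the residual.
print(round(temperature(45.0) - temperature(0.0), 3))   # 0.29, not 0.54
```

The same leakage occurs if a century-scale ocean cycle is present but only a 60-year window is examined, which is the objection raised above.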

The skeptics have postulated a natural variability that is positive, but has no explanation in terms of energy balances, and they seem very confident in it being only coincidentally in the same period that CO2 has risen.

DM, you are free to look for natural cycles and point them out. They are inconsequential unless their amplitude is 1 or more degrees which then starts to compare with the low end of doubling-CO2 estimates, which is what this is about after all.

@ Jim D | January 23, 2014 at 10:42 pm |
The 150 year temperature record shows no evidence of cycles greater than 60 years,
*****
A mere 150 years just isn’t enough. Sorry. But if you don’t have the data to detect long cycles, you can’t tell if they are there or not.

“If natural cycles are regular and repeatable, the net temperature change over one complete natural cycle will be approximately zero. The warming during part of the cycle is cancelled by cooling during the other part of the cycle. What’s left is the long-term rise caused by man.”

I’m just wondering what the human influences were that prevented the Earth’s natural climate cycles from netting out to zero for millions of years before we even existed. Or have climate scientists used their special climate models to prove there were never any natural, regular, repeatable climate cycles until man appeared?

Now what do you see? A projection from 1950 to the current day, which follows all the temperature fluctuations, culminating in the infamous “hiatus” of the past 15 years. That is no coincidence, as the Cause of the Pause is explainable by thermodynamic Laws.

What does your model or hypothesis or unpublished unreviewed whatever say will happen in the next 5 years? In order to be eligible for big boy pants you have to go on record with a testable prediction of the future ya know.

“What does your model or hypothesis or unpublished unreviewed whatever say will happen in the next 5 years? In order to be eligible for big boy pants you have to go on record with a testable prediction of the future ya know.”

Write it down that the CSALT model does require knowledge of CO2, SOI, Aerosol activity, LOD, and TSI, none of which are available before 1866.
Write it down that, except for CO2, each one of these will fluctuate pseudo-predictably in the future, making CSALT more useful for longer term predictions.

Write it down that this is a remarkable achievement which only depends on historically bounded factors, plus one, CO2, that is unbounded. What does that mean? That none of the fluctuation terms matter too much, as the control knob of CO2 will establish the long term trend. That remains the standard model, one that CSALT helps to substantiate.

“And he has already been to Oslo, where he nearly froze to death on a park bench.”

I was in Oslo on a business trip in the summer of 1991. Two things I clearly remember: liquor was nanny-taxed so prohibitively high that the middle class was excluded from regular moderate drinking and/or infrequent excessive drinking, and I needed to close the blinds on my hotel room windows at midnight before going to sleep because it was still daylight outside. I’d like to say it was a nice place to visit but I’d be lying.

In my lines of work, I never went to Oslo. The people I was after didn’t frequent the cold countries, except Afghanistan. I did have a 6′ tall blondie Norwegian girlfriend, who gave me a flavor for the place:)

Yes, I do know that this comment is pointless … Jimmy carries right on with new goalposts. Only two parameters required: 1) no obvious physical law is crossed (geological knowledge is irrelevant); 2) there are no empirical data to test the new goalposts against

This last point reminds me irresistibly of Stephen Hawking’s comment: he is of the view that time travel is impossible, and this keeps the world safe for historians.

Vaughan Pratt is going to get the Nobel Prize for his quasi-theory on that quasi-sawtooth thing. And in the meantime, we will just assume that he and the undercover DIY scientist, jimmy dee, are right. Cycles? What cycles?

A realistic CO2 TCR of 1.5-3 C gives a warming of 0.5-1.0 C since 1950. The upper end exceeds the actual warming of 0.6-0.7 C and would allow for negative natural variation and other factors like aerosols outweighing positive factors like other GHGs. The lower end requires positive factors to account for the other 0.2 C, but these add up to less than the CO2 contribution. However, this sensitivity range does bracket the actual warming, allowing for other less justifiable factors to be either sign, but smaller, and possibly net zero.
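The 0.5–1.0 C figure follows from the usual logarithmic forcing relation ΔT = TCR · log2(C/C0). The concentration values below are approximate assumptions for illustration, not numbers from the comment:

```python
import math

# Transient warming from CO2 alone since 1950 for a given TCR (deg C per
# doubling), using dT = TCR * log2(C / C0).  ppm values are rough assumptions.
C_1950 = 311.0   # ppm, approx. 1950 concentration
C_NOW  = 396.0   # ppm, approx. early-2010s concentration

def co2_warming(tcr_c):
    return tcr_c * math.log2(C_NOW / C_1950)

print(round(co2_warming(1.5), 2))   # low end, roughly 0.5 C
print(round(co2_warming(3.0), 2))   # high end, roughly 1.0 C
```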

This is the IPCC AR5 SPM attribution statement.
“Greenhouse gases contributed a global mean surface warming likely to be in the range of 0.5°C to 1.3°C over the period
1951 to 2010, with the contributions from other anthropogenic forcings, including the cooling effect of aerosols, likely to
be in the range of −0.6°C to 0.1°C. The contribution from natural forcings is likely to be in the range of −0.1°C to 0.1°C,
and from natural internal variability is likely to be in the range of −0.1°C to 0.1°C. Together these assessed contributions
are consistent with the observed warming of approximately 0.6°C to 0.7°C over this period.”

From this we see that the GHG contribution alone can exceed 100% of the observed warming, and the middle of its range is over 100%. Adding aerosols brings it down to about 100%, with natural variations at -0.1 to +0.1 C.
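A quick tally of the quoted AR5 ranges shows the point. This simply adds the interval endpoints, which overstates the combined spread since the terms are not independent:

```python
# AR5 SPM attribution ranges for 1951-2010, deg C, as quoted above.
ghg        = (0.5, 1.3)
other_anth = (-0.6, 0.1)   # other anthropogenic forcings incl. aerosols
natural    = (-0.1, 0.1)   # natural forcings
internal   = (-0.1, 0.1)   # natural internal variability

# Midpoint of the GHG range alone already exceeds the ~0.6-0.7 C observed.
print(round(sum(ghg) / 2, 2))   # 0.9

# Naive endpoint sums across all assessed contributions.
low  = round(sum(r[0] for r in (ghg, other_anth, natural, internal)), 2)
high = round(sum(r[1] for r in (ghg, other_anth, natural, internal)), 2)
print((low, high))              # (-0.3, 1.6)
```

That endpoint range is far wider than the assessed total, which is why the SPM says the contributions are “consistent with” the observed warming rather than summing them.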

I’d agree with Jim D that the modern (150+ year) record shows cycles of warming and slight cooling of about 60 years. All on a tilted axis, which shows roughly 0.7C underlying warming since the record started in 1850.

But the observed amplitude is more like +/-0.2C to 0.25C, rather than 0.1C.

I didn’t mean to fall silent – that doesn’t sound like me – although I note very little has changed here. It was a combination of foot in the air after an operation and rebuilding my laptop.

If we assume that the ‘cycles’ zero out – over the last ‘cycle’ from 1946 to 1998 the rate of increase in surface temps was 0.07 degrees C/decade. Although the chances that we are on the threshold of Bond Event Zero – as the Sun cools and La Nina intensifies over centuries – seem quite good.

But these are not cycles but abrupt changes in a complex dynamical system – climate is in the class of systems defined by complexity theory.

‘Weather changes abruptly from day to day, and there is no basic difficulty in understanding such changes because they involve a “fast” and easily observed part of the climate system (e.g., clouds and precipitation). But mechanisms behind abrupt climate change must surmount a fundamental hurdle in that they must alter the working of a “slow” (i.e., persistent) component of the climate system (e.g., ocean fluxes) but must do so rapidly. Two key components of the climate system are oceans and land ice. In addition, the atmospheric response is a crucial ingredient in the mix of mechanisms that might lead to abrupt climate change because the atmosphere knits together the behavior of the other components. The atmosphere potentially also gives rise to threshold behavior in the system, whereby gradual changes in forcing yield nearly discontinuous changes in response.’ http://www.nap.edu/openbook.php?record_id=10136&page=73

These abrupt changes in system dynamics – which include significant changes in ocean and atmospheric circulation, cloud cover and energy budgets – tend to last 20 to 40 years in the long proxy records. So it seems that another decade to three of non-warming is more probable than not – before another unpredictable shift to a new climate state.

“But these are not cycles but abrupt changes in a complex dynamical system ”
—–
You are referring to a dragon king event? Are you suggesting that a Bond event zero is a likely soon-to-occur dragon king event?

R. Gates, “You are referring to a dragon king event? Are you suggesting that a Bond event zero is a likely soon-to-occur dragon king event?”

How would you differentiate a Dragon King versus a Black Swan versus an overconfident screw-up? Since most of the paleo data indicates a range of normal variability should be +/- 0.6 C and a major once-in-millennia event should be around 3C, using the less than perfect “lower troposphere mean temperature”, I would think we are dealing with more of an overconfident screw-up, which can also be called a black swan by the ones doing the screwing up.

Dragon-kings are defined by Didier Sornette as extreme outliers that occur in the vicinity of tipping points. Otherwise known as noisy bifurcation. They have been used as a diagnostic for climate bifurcation.

I am talking more about the topology of strange attractors. The state space that climate falls into in response to shifts in forcing at many scales including millennial. One of many quasi equilibrium states.

Well, this is just getting silly. Didn’t you people get the memo that the debate is over? What pause you talkin bout? If you look at the results genned up by our multitude of lavishly funded climate modeling Teams with their big fancy supercomputers, you can clearly see that the stray noodles at the bottom of that tangled ball of spaghetti are in the ballpark with our Team’s homogenized reanalysis of our own serially adjusted and anomalized temperature data. Clearly our .2C per decade projections woulda happened, if not for natural variability, which our Team has recently discovered. So let’s stop talking about quasi-sawtooth this-and-that and get on with our drastic mitigation schemes. Time is running out. And please holler at Mayor DeBlasio to send the snow plows. We are gettin buried here.

“So every time you hear someone mention the Pacific Decadal Oscillation, or perhaps the El Nino ~ Southern Oscillation, the Atlantic Multi Decadal Oscillation, the Arctic and Antarctic Oscillation, the Madden Julian Oscillation, the Quasi Biannual Oscillation, or any other so called “natural cycle” of the climate system…….”.

Throw in a few La Ninas, and such, then ask the question: Does naming an observed pattern mean that we now understand what is going on? There seems to be a lot of data collection, curve fitting, and trend line plotting, but precious little explaining. Why do all these periodic oscillations oscillate periodically? Can we predict them or do we simply monitor air temperatures, sea temperatures and ocean currents and announce when they change? Are they in theory predictable?

There seems to be a lot of stock in sunspot cycles. Great. The observed periodicity in sunspot number seems to be correlated with variations in the earth’s climate. Why? What is the sun doing in the process of becoming more or less spotty that is coupled to our climate? And how? Since CO2 is allegedly the power behind the earth’s thermal throne, are gnomes secretly monitoring the sunspots and cranking the geologic emission of CO2 up and down to confuse us? Since we apparently have no idea, other than correlation, how the number of sunspots affects the climate, do we understand everything ELSE that the sun does to affect our climate and why? Is the TOTAL solar irradiance the only thing that matters or is the spectral distribution of the total important? Does it vary? Is solar wind important? Variations in the solar magnetic field? CME’s? Flares? Cosmic rays? Is there anything else going on with the sun that affects our climate? How? Are all things solar that affect our climate constants or variables? Can we measure them and are they predictable? If they affect the climate and are not predictable, can we predict the climate?

Until we understand WHY all these variations in climate happen, rather than just naming them, it seems unlikely that climate science will ever go beyond collecting, adjusting (very important), and curve fitting climate data and announcing, yet again, no matter the slope of the temperature trend line, ‘See! this PROVES it was ACO2!’.

Bob, my main point (which seems to never get across!) is that under virtually no circumstances does the mainstream theory allow internal signals to produce long term change.
But that would mean all change must be due to external forcing. Given the rich proxy data available to the science, it would seem this notion is a complete con job.
The idea that climate must be forced by external forcing is perhaps the single most embarrassing idea real climate scientists have ever come up with, and I’ve just about had enough of it.

I suspect you have to broaden the explanations of the cause of the recent lack of surface warming to fields beyond thermodynamics. Evidently chemistry, biology and solar physics have an impact. Not being an expert I expect there would be additional fields one can throw in this pot.

I was informed by a reliable source that they have consulted with witch doctors to help them explain the pause. Don’t know what they came up with. If it keeps up another six months, I am sure we will hear about it.

“But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change, what’s left over is the man-made part.”

Nielsen-Gammon
If that were true there would have been no Little Ice Age and no Medieval Warm Period. Natural variability would cancel itself every 60 years or so and global climate would never change over longer periods. Historical data disprove this. The climate is always changing over longer periods due to natural variability.

Beat me to it also. I enjoyed Judith’s endorsement of John N-G’s competence and liked his web page, but the facts remain. In this case he constructed a straw man and proceeded to demolish it to win an argument with himself. These competing causes are complex. Given the long slow thaw, variations over the last 10,000 years and orbital mechanics, there is lots to discuss without making up irrelevant stuff.
Scott

Since the turn of the century it has been observed that global temperature has consistently been flatlining while, month by month, warming CO2 emissions have continued to escalate.

The overall trend is pretty flat but the temperature has not been “flatlining” – there has still been a lot of variation on a year by year basis. 2005 and 2010 are the two warmest years on record; other years have been relatively cool. Month by month there is, as one would expect, even more variation. So therefore the first possible explanation you mention…

The warming due to CO2 is compensated by an effect of unknown provenance which month by month provides just the right amount of cooling necessary to neutralize it.

is not quite correct. It just needs an overall cooling trend due to “natural” factors which roughly matches the warming trend from CO2 over the given period, it doesn’t have to compensate on a month by month basis.

I can see why intelligent people would pick door number two, but I read the Monty Hall problem and I think they said if overwhelming is behind your first pick you gotsta switch. Or was it a goat? Now I’m confused.

I extended the temperature anomaly back to 1890 so you all could see what the PDO would do, which is peak a year or so before the temperature anomaly. So imagine the red trend extending back and bumping up just before the blue trend: like the red guy is the boss of the blue guy.

So for the 20th century the PDO ordered:

1. ~13 years down from around 1897 to 1910
2. ~30 years up from 1910 to 1940
3. ~20 years down from 1940 to 1960
4. ~20 years up from 1960 to 1980
5. ~20 years down from 1980 to 2000

JCH, the AMO is also related to fish stocks. There is even an indication, using the cod fishery, that there is more to the AMO than just the obvious ~60-year pseudo-cycle. Cod were so thick in the 16th century that you could net them. Then again, that could just be a fishing story. Of course, for the cod to be on the surface, their food would have been up there too. When the cod are deep, their food supply is deep. Thermohaline circulation, thermoclines, and all that mysterious ocean stuff that happens over 71% of a water world.

I don’t think that’s necessarily true. There was still a definite warming trend until about 2006/7, it’s some of the years since then which have been relatively cool and so have skewed the trend downwards.

Andrew – the hottest year in the instrument record is 2010, which happened because of an El Nino that started in 2009 and ended early in 2010. From that point the temperature anomaly collapsed down into what became the 2nd strongest La Nina in the instrument record, which was followed by another La Nina.

This staggered the ACO2 champ, but he has been quickly regaining his senses, and he has been puttin’ it to the short-lived contender ever since.

JCH, “I thought you koolers were big fans of the PDO. Now you’re turning your back to it. You’re going to piss it off. Cycles: they go down; they go up.”

The PDO is just one example of something that was supposed to not have a significant impact that does. You can call it a butt biter. Shifting from using the ensemble mean to individual runs in an attempt to revive faltering confidence intervals is also a butt biter. Most “skeptics” are just looking for honest evaluations of the situation not Websterizations.

The fact of the matter is that ENSO is providing a thermodynamic impulse to the rest of the planet and the temperature adjusts depending on the sign of the impulse. The value of the SOI gives the signed magnitude of the impulse, with the long term trend automatically removed.

The gradual warming of the North and tropical Atlantic Ocean is contributing to climate change in Antarctica, a team of New York University scientists supported by the National Science Foundation has concluded.

“And where does examining the physical behaviour of the earth’s climate system end and just playing with graphs begin?”

Or in reality, when do climate scientists quit playing with graphs and assigning responsibility for changes based on polling themselves and start examining the physical behavior of the climate system?

Are we supposed to assume that when Climate Science assigns a bunch of keen acronyms, like ENSO, El Nino, La Nina, PDO, AMO, et al., to the various goings-on in the oceans and atmosphere, that they in fact understand what is going on?

But here’s the thing. If, over 60 years, natural variability averages out to zero,

This is obviously the central issue, and the crushing majority of the posters noticed it. But here’s the thing: a sentence starting with “if” has its opposite starting with “if not”. As there are only these two possibilities, which of the two statements is more convincing? Well, there are three reasons why the hypothesis after the “if” is not convincing at all.
.
– 60 is an arbitrary number that appears only because the available temperature data are barely long enough to look for a 60-year period in a Fourier analysis. More than that can’t be seen anyway, regardless of whether it is there or not. It used to be 30 years, for no better reason, in the past.

– A Fourier spectrum of a chaotic system has many peaks; they are not sharp and they change with time. Focusing on just one, even if it existed at time T0, is guaranteed to misunderstand the whole dynamics.

– There is no theoretical reason that a pseudoperiodic oscillation emerging in a chaotic system “averages to zero”. The only pseudo-argument I have heard concerns the average temperature (not other degrees of freedom), and it says that the system must have “energy in = energy out”. But as D. Hughes rightly said, this is only true for systems with constant internal energy. The Earth system obviously does not have constant internal energy, so there is absolutely no reason why an “average temperature” (let us not even mention albedo, ice mass and extent, cloudiness, etc.) should just oscillate around a zero mean.

A much more reasonable assumption, and one supported by theoretical considerations from nonlinear dynamics, is that the system exhibits many coupled pseudoperiodic oscillations (some shorter than 60 years and some longer), and that these pseudo-oscillations never keep a constant average for long.
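The record-length point can be put in numbers. A minimal sketch (generic FFT arithmetic, not tied to any particular temperature dataset): with N years of annual data, the Fourier frequency bins are 1/N apart, so in a ~130-year record a “60-year” peak cannot be separated from anything between roughly 40 and 110 years.

```python
# Frequency resolution of an N-year annual record: FFT bins are spaced
# 1/N cycles per year apart, so a nominal 60-year peak in a 130-year
# record shares its neighbourhood with a very wide range of periods.
def resolvable_periods_near(period_years, record_years):
    """Return the periods corresponding to the FFT bins one step
    either side of the given period."""
    f = 1.0 / period_years
    df = 1.0 / record_years            # frequency resolution
    return 1.0 / (f + df), 1.0 / (f - df)

lo, hi = resolvable_periods_near(60.0, 130.0)
print(round(lo, 1), round(hi, 1))      # 41.1 111.4
```

In other words, anything from a ~41-year to a ~111-year oscillation is indistinguishable from a 60-year one at this record length, which is the commenter’s point about 60 being arbitrary.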

There is no way that chaos is going to do anything but provide a small perturbation to the temperature record. The denialists simply need this mechanism or any semblance of a counter-theory to AGW evaporates.

All they really have in terms of chaos is the possibility that AGW leads to CAGW as the control knob of CO2 pushes the albedo towards a positive feedback tipping-point. No indication that is happening yet, but Tomas got himself in this trick-box all by hisself. Boo hoo.

Personally I think it only has one: ENSO. Everything else tags along or it beats it up.

Right, JCH. That and the Stadium Wave are the only unknowns as far as I am concerned. ENSO is a factor in the global temperature as reflected by the SOI, which is a highly significant term in the CSALT model, along with the Stadium Wave LOD term. http://contextearth.com/2013/10/26/csalt-model/

Yet the SOI also is bounded and reverts to a mean of zero, as it is completely based on atmospheric pressure differences. This means that the SOI-derived fluctuations factoring into the global surface temperature are also bounded. Unless we want to see world-record pressure differences in the South Pacific, skeptics won’t be able to lean on this forever. Eventually the SOI will return to a neutral mean of zero, and the control knob of CO2 will again clearly show its hand.

The AMO doesn’t do anything unless the PDO tells it to do it. It’s a lackey. A wussie. No muscle. No backbone. The big dawg is ENSO. ENSO walks into a biker bar and kicks ass. AMO walks into a biker bar and sucks a you know what in an attempt to survive. Big difference, huh poet?

JCH, “The AMO doesn’t do anything unless the PDO tells it to do it. It’s a lackey.”

I suppose you have extensive references to back that up with almost too good to be true confidence intervals. If you ever get curious though, the difference between the Indo-Pacific Warm Pool and “global” temperatures is the combination of the PDO and AMO. Due to the land amplification of the AMO, its signal appears to be in the driver’s seat. The AMO and PDO are coupled pseudo-cyclic oscillations so what is driving what can get somewhat complex.

Some might think that the climate science tools are just not up to the task. Then that could just be me.

JCH observes well. The AMO is irrelevant in comparison to ENSO in driving the major fluctuations.

If we can get a prediction of ENSO over the short term and LOD over the long term, that would do a lot of good for forecasting. Solar activity is on the verge of becoming predictable, but volcanic activity is likely not. CO2 is based on projections of economic activity and reserves.

Due to the difference in the specific heat capacity of water versus soil and the difference in altitude, the land areas in the 30 to 60 north region tend to amplify the impact of SST warming as well as CO2-equivalent forcing. Land use, primarily changes in soil moisture and watersheds, also contributes. So instead of “polar” amplification you have Arctic and mid- to upper-latitude land amplification.

Since the Antarctic is not amplifying temperatures, the influence of SST and land use are likely more important than previously estimated. Because of that the AMO and PDO have a high correlation with “global” land surface temperature.

Mosh, the SOI time series might be best modelled as bounded red noise. The boundedness comes from an Ornstein-Uhlenbeck reversion to the mean property in the random walk.
There also could be multiple forcing functions all with different periods, which makes it not much different than the randomness in the aquatic waves themselves. If there is a wave peak, you will have a wave trough to balance it. No net forcing imbalance over the long term.

Both of these cases having bounds separates their behaviour from the unbounded nature of CO2.
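The “bounded red noise with mean reversion” description can be sketched as a discrete Ornstein-Uhlenbeck (AR(1)) process. This is a generic illustration under assumed parameters (`theta` and `sigma` are made up), not the commenter’s actual model of the SOI:

```python
import random

# Discrete Ornstein-Uhlenbeck / AR(1) process: correlated ("red")
# step to step, but the -theta*x term always pulls it back toward
# zero, so the long-run average stays near zero -- unlike a steadily
# accumulating forcing, which has no such restoring term.
def ou_series(n, theta=0.1, sigma=1.0, seed=42):
    random.seed(seed)
    x, out = 0.0, []
    for _ in range(n):
        x += -theta * x + random.gauss(0.0, sigma)  # reversion + noise
        out.append(x)
    return out

series = ou_series(10_000)
mean = sum(series) / len(series)
print(abs(mean) < 1.0)  # True: the series wanders but reverts to zero
```

The contrast with CO2 is that a cumulative forcing has no `-theta * x` term, so its running mean drifts without bound.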

Conor, “Thanks for the feedback. This AMO looks exactly like the effects of reduced cloud mass allowing for greater insolation. Mmmm”

Cloud mass and/or general precipitation pattern changes. Since there is unexpected amplification, due to whatever cause, there would be a greater than anticipated range of natural variability possible. It is likely going to be hard to explain what causes what.

Man-made changes to the air pressure in the Mediterranean since 1902, due to Nile river dams, altered 2 of the precursors to easterly-wave stratocumulus, and thus cloud cover over the mid-Atlantic, causing fluctuations in mean SST.

andrew, I’m not making my point well enough. If the hiatus continues (more or less) the relentless increase of CO2 emissions will have been countered by a corresponding relentless increase in some compensating effect, whatever it is. Not month by month I grant you, but nonetheless a proportional effect of increasing magnitude linked in some way to the continuing CO2 increase. That sounds to me like something directly triggered by the CO2 increase rather than natural variability.

Sure, if we were to get a much longer period with a fairly flat trend then we would have to consider that there may be some more persistent influence. I’m not sure it would indicate some kind of negative feedback triggered by CO2, that would seem to contradict what we know about past climate change, but it might indicate for example that certain cyclical factors might be more powerful than previously thought. But that “if” is doing a lot of heavy lifting there – we certainly don’t know that this will happen, this is pure speculation.

I remember years ago at Climate Audit I was scoffed at for saying that not only did we not know the magnitude of water vapor feedback, but we also did not even know for sure the sign. Memory fails, though; maybe I was thinking in the clouds.
=================

“I’m not making my point well enough. If the hiatus continues (more or less) the relentless increase of CO2 emissions will have been countered by a corresponding relentless increase in some compensating effect, whatever it is.”
—-
The natural feedback to very high GH gas levels is an enhancement of the hydrological cycle, increasing rock weathering and sequestering CO2 back into the ocean to become limestone, taking the carbon back into the lithosphere, which is where the human carbon volcano has been taking it from. The problem is that the HCV takes it out fast, but the rock-carbon cycle takes tens of thousands of years to put it back.

Your problem was that you uttered those scurrilous words “we don’t know”. What heresy. For some it is easy as one of Rachel Ray’s afternoon show recipes. Just have all the ingredients in the cupboard and the right proportions and you have it all figured out. Perfecto!

” If the hiatus continues (more or less) the relentless increase of CO2 emissions will have been countered by a corresponding relentless increase in some compensating effect, whatever it is.”

Or, until someone shows empirically that ACO2 has a measurable effect on the temperature of the earth, climate continues to wander about in fits and spurts, doing what it has always done and requiring no compensating effect, increasing or otherwise.

Endless repetitions of the mantra ‘ACO2 is the knob on the temperature of the earth.’ doesn’t, in the absence of actual data, make it any more true than stating it once. Especially since, faced with actual data, its truth depends upon a mystery effect that exactly compensates for the observed steady rise in CO2.

Until someone actually understands how the climate works instead of curve fitting and apportioning temperature variations among factors known to affect climate based on the ‘expert opinions’ of climate scientists who, as a rite of admission, accept the ‘ACO2 control knob’ as axiomatic, the whole subject serves as a textbook example of a ‘self-licking ice cream cone’.

Kim,
Was that really your observation? Bright clouds vs. dark clouds, and thick ones versus light cover, both impact differently. Models at this point don’t detect what generates them over the wide Pacific.

We don’t know the sign of the impacts until we can predict the causes of formation.
Scott

Scott, I can fill you in on the current state of clouds, but can you fill me in on your qualifications? No demerit intended; it’s just that from reading some of these posts it has become shockingly apparent that the general understanding of the AMO is PROFOUNDLY wrong, and I am trying to think of a useful way of exposing the error.

This is exactly what I mean by calling warmists the real climate change deniers. They have to deny climate change in order to make place for AGW (ACC + natural = 100%). More climate change, less AGW. Fortunately, it cannot be denied effectively (it’s basic education that climate changes at all time scales) and they have to project it on skeptics.

Both myself and my good friend Col Mo Hassan were independently being checked out for suspected heart attacks for the same reason. The 200,000 deaths that occurred in Sudan in 2011, and again in Mali a year later, looked like they were going to be repeated last year. He had been trying to convince Al Jazeera in Doha that the factors mentioned in the Nile Climate Engine were relevant to the Sahelian droughts. Since the same easterly wave formations that provide their rainy season also provide cloud cover across the equatorial Atlantic, a dearth of sub-Saharan rain equates to increased Atlantic insolation. Start thinking of an additional 100 W/m2 across 6,000,000 km2 and you are in the ballpark. I do not have the background, facilities, finance, oversight or input to move this project further, yet its merits are solid and of relevance to all seven billion of us. It is necessary for one of the serious research institutions to step in and deal with the events relevant to the planet’s largest weather system. Although this blog is abrim with dedicated and intelligent correspondents, we may be guilty of effectively indulging in banal gossip whilst ignoring the real problem.
Conor.

This kind of argument, where the anthropogenic contribution is argued to exceed 100%, seems senseless to me, even if you are just looking at the period from 1975-2000 where there is a clear warming trend. It is something that arises from the oversensitivity of models to CO2 forcing combined with oversensitivity to aerosol forcing.

I don’t understand this sentence. Regardless of model shortcomings, if you have positive forcings and negative forcings then surely plus and minus percentages cannot be avoided.

As I understand it: if we know that the net warming = 1 and we have observed a negative forcing of -0.10, then wouldn’t the positive forcing need to be 1.10, i.e. 110%?
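The percentage arithmetic in that question can be written out explicitly (illustrative numbers from the comment, not measured values):

```python
# If the net warming is 1 unit and a natural/other contribution is
# -0.10, the positive contribution must make up the difference, so it
# exceeds 100% of the net -- which is all "more than 100%" means here.
net_warming = 1.00
negative_contribution = -0.10
positive_contribution = net_warming - negative_contribution
share = positive_contribution / net_warming * 100
print(positive_contribution, f"{share:.0f}%")  # 1.1 110%
```

So percentages over 100% are just bookkeeping whenever forcings of both signs are in play; whether the model sensitivities behind those numbers are right is the separate question raised above.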

Hi, Judy. I think Nielsen-Gammon is simply saying, natural variability cancels out across a 60-year time span. As the period from 1950 to today is very close to 60 years, this makes the natural variability contribution close to zero. Thus, the remaining human contribution is, arguably, more likely to be >50%.

A better way for the IPCC to say that might be: we are more confident that most of the warming since 1950 is anthropogenic; this is due to the fact that there is less total warming than we used to think.

Ted, you write ” I think Nielsen-Gammon is simply saying, natural variability cancels out across a 60-year time span. As the period from 1950 to today is very close to 60 years, this makes the natural variability contribution close to zero.”

It seems to me that there are at least 2 unjustified assumptions in this statement. First, that the PDO is symmetrical on a short term basis; that each half of the cycle exactly balances the other half every time. I have no idea why anyone would suppose this to be correct. It seems to me that it is far more likely that the cycles balance out over several cycles.

Second is the inherent assumption that the 60 year PDO cycle is the only one producing natural noise. How on earth can you justify this assumption, when we know almost nothing about long term cycles?

Hi, Jim. Just for clarification, it’s not my theory, but Nielsen-Gammon’s. Regardless, if one assumes anthropogenic … well, let’s look at it this way. Say warming from 1950 to 2000 was 0.7C, and you *think* most is human-caused. Then you might say 0.4C is anthropogenic, and 0.3C is natural. And you think it’s accelerating, so you might assign 0.10C per decade to the human-caused part. Then, from 2000 – 2013 temperatures didn’t increase. You STILL believe the human part is 0.10C per decade, which means temps *should* have gone up about 0.15C. But natural variability went the other way.

Even if you play with the numbers some, you can see how someone might think we are MORE certain that >50% of the 0.7C warming since 1950 is caused by humans. I think that must be what the IPCC did here.
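The worked numbers in the two comments above can be laid out in a few lines. Every figure here is the commenters’ hypothetical (0.7 C total, 0.4 C assumed anthropogenic, 0.10 C/decade assumed ongoing), not an IPCC value:

```python
# Hypothetical attribution bookkeeping from the comment above.
warming_1950_2000 = 0.70        # degC observed (assumed)
anthro_1950_2000 = 0.40         # degC assumed human-caused
anthro_rate = 0.10              # degC per decade, assumed to continue

# 2000-2013: observed temperatures flat, but the assumed human
# contribution kept accumulating; natural variability must offset it.
expected_anthro_2000_2013 = anthro_rate * 1.3   # 13 years ~ 0.13 degC
implied_natural_2000_2013 = 0.0 - expected_anthro_2000_2013

total_warming = warming_1950_2000 + 0.0          # still 0.7 degC
total_anthro = anthro_1950_2000 + expected_anthro_2000_2013
print(round(total_anthro / total_warming, 2))    # 0.76
```

Under these assumptions, the implied anthropogenic fraction of the warming since 1950 actually rises during the hiatus, which is the logic being attributed to the IPCC here, and exactly the step whose premises the following comments dispute.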

Ted, you write “Say warming from 1950 to 2000 was 0.7C, and you *think* most is human-caused.”

Fair enough. I realise it was J N-G theorising. Unfortunately I don’t think any appreciable warming was human caused. So I observe the 0.7 C rise and “think” that none of it was human caused. Where does that lead? If we don’t know the details of all the natural variations, then we get nowhere, as we would with any other assumption.

The Left’s interpretation of reality that legitimized a view that carbon dioxide qualifies as a climate pollutant was the moment science was dead to them in their goal of achieving their political ends by any means.

FWIW
I followed John’s logic, but am puzzled as to why, after agreeing to Judy’s arguments totally, he assumes that the natural variability has a period of 60 years so that it comes back to zero. If this assumption is correct, I can follow his point. But why is it correct? We know that the sun operates with many cycles of varying lengths; why can’t there be other natural variations of varying durations?

Kevin, you write ” If this assumption is correct, I can follow his point. But why is it correct?”

Exactly. As I note above, there is all sorts of sloppy science in the way of unjustified assumptions in the paper discussed in this thread. And instead of condemning this sloppy science, the warmists, including our hostess, seem to be endorsing it.

Kevin,
Agree with you. On his web page, John N-G seems reasonable, if supportive of the consensus, but why create an artificial cycle that adds to zero and then say that, given that, this follows? Changes over the last 10,000 years don’t show that, and in a controversy like this one should stick to data and not simple analogies.
Scott

“But why is it correct? We know that the sun operates with many cycles of varying lengths, why can’t there be other natural variations of varying durations?”

1. of course there can be “other” variations. Just as there could be space aliens
2. Your job is to find those cycles
3. Then if you find those cycles your job is to show that they explain the data BETTER than an explanation that excludes those cycles.

#2 has not been done, #3 has not been done. Consequently the best science to date, while imperfect, explains the data better than all alternatives. It’s our best, but not final, position.

There are several dozen paleo records dating back a long way, and a well-founded land-only instrumental record from BEST that goes back a fair way. These significant datasets provide a sufficient basis to establish the (presumptive) natural variability up until the 1890’s.

Why the 1890’s?

The favorite apparent natural cycle (though that natural cycle is postulated by most on only the last 110 years of data, and most of that 110 years of data doesn’t actually match a 60-year cycle very well at all). Dr. Curry supposes AGW doesn’t start until 1975 (with bizarre certainty, looking at the data and considering what Dr. Curry considers an uncertainty argument). Other claims suggest earlier start, back until at least the 1950’s. Begging the question of a 60 year cycle, the removed cycle for the sake of comparison would be the one prior to 1950. Hence, 1890.

We can support an argument by inference for natural variability from all data available up to 1890. We can establish a hypothesis from the paleo and instrumental touchstones for a global natural temperature profile on what is, abysmally small compared to what it ought be, a sufficient dataset. It is then a trivial exercise to ask, based on temperatures since 1950, whether we can claim natural variability accounts for current observations.

Well, if the instrumental record is any indication, we must reject the hypothesis that natural variability accounts for any more than the most minor part of warming since 1890. Go ahead, all the numbers are right there, the removed cycle (presumptive) prior is there, do the Bayesian.

Put another way, how confident are we of the power of the blue line to explain the purple line without another factor?

Given various values of climate sensitivity, what level of climate sensitivity best explains the purple line, given the prior blue line? (Note that the purple line includes the last seventeen years.. indeed MOST of the purple line’s top end comes from the last seventeen years.. which the blue line also has no significant power to explain absent AGW.)

If we look at Marcott et al., we indeed see the blue line as part of a millennial downward trend up to 1890, and so on significant paleo records we ought to expect ‘natural variability’ to have power only to explain lines of its own level or lower, on any but the tiniest fraction of probability.

Use any random number generator and see if you can’t hunt-and-peck out similar results a substantial portion of the time.
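That hunt-and-peck exercise is easy to run. A sketch under arbitrary assumptions (Gaussian unit steps, 130-step walks, and a made-up 1.5·sqrt(steps) cutoff for what counts as an “apparent trend”):

```python
import random

# How often does a pure random walk end far enough from its start to
# look like a sustained trend? With the (arbitrary) cutoff below, the
# answer is a non-trivial fraction of the time.
def fraction_with_apparent_trend(trials=2000, steps=130, seed=1):
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        x = 0.0
        start = None
        for i in range(steps):
            x += random.gauss(0.0, 1.0)
            if i == 0:
                start = x
        if abs(x - start) > 1.5 * steps ** 0.5:  # crude "trend" test
            hits += 1
    return hits / trials

f = fraction_with_apparent_trend()
print(0.05 < f < 0.25)  # True: noise alone "trends" fairly often
```

The cutoff and step count here are purely illustrative; the point is only that runs resembling trends fall out of unstructured noise far more often than intuition suggests.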

Experienced graph-readers generally dismiss apparent cycles with less than three full cycles of fully explained fit. You have 130 years of rising data which, while it looks pretty if you have wide enough error bars, when looked at closely shows that only about 110 years of it can be called part of the pattern, and even that poorly.

There used to be a solar cycle correlating with sunspots visible in the global temperature data, but that disappeared in the 1950’s. Toss in the residues of that degenerate solar influence, and you’d expect to fit something on some part of the record.

Should Marcott et al. be reissued without the glaring errors, that might be interesting, but not properly dating “Before Present” is a fairly serious screw-up. It is one of those GIGO issues: once you find garbage, you can stop reading the “peer reviewed” science.

Not valid, and even if it had been valid, not relevant, as you can take each component dataset from Marcott’s sources separately and apply the Bayesian, which is the argument I presented. Now, you can attempt to make this a discussion of Marcott’s methods, but it’d be a totally different conversation that doesn’t have the least to do with the challenge.

It’s simple: I’ve scoured Dr. Curry’s writings and can find zero evidence that Dr. Curry has ever done any valid Bayesian mathematics at all in publication, leading to the suspicion that Dr. Curry can’t.

If that’s the case, I’m fine with that. A lot of distinguished physicists can’t do enough mathematics to work their way out of a paper bag. But if it’s the reason for Dr. Curry presenting entirely invalid arguments using the wrong form of analyses, then we should know.

BartR, “Not valid, and even if it had been valid, not relevant, as you can take each component dataset from Marcott’s sources separately and apply the Bayesian, which is the argument I presented.”

Then that has nothing to do with Marcott. Their attempt is flawed, so you would be starting fresh, which is not a bad idea, but that is not what you said. You said a Bayesian analysis of Marcott, which is flawed and would produce a flawed result. Not a valid challenge.

If you want to issue a Bayesian challenge there are much more entertaining possibilities like updating Annan’s climate sensitivity by guestimates, observations versus confidence interval range and a good number of others that will simply indicate that catastrophic AGW was over sold pretty much like everything is oversold in order to get attention.

Not really. I am always on the lookout for honest sales pitches but remain skeptical until I do my homework. We live in a buyer beware society and always have. Why would you think Climate Science is any different?

No particularly logical reason, that I can see, to think that a valid analysis lies in between two oversells. Seems to me that a valid analysis needs to be rooted in carefully qualified arguments, not by extrapolating some supposed middle ground between two analyses of opposing reasoning that each oversell.

I’ll start paying more attention to an individual “skeptic” when he/she owns up to “the hiatus in global warming” (as opposed to “a relatively short term hiatus in the longer-term trend of statistically significant increase in GAT’s) is an oversell. (Somehow whining that “They did it fiiiiirsst” just doesn’t quite cut it.)

Joshua, ” Seems to me that a valid analysis needs to be rooted in carefully qualified arguments, not by extrapolating some supposed middle ground between two analyses of opposing reasoning that each oversell.”

A valid analysis would be. But that generally doesn’t get the attention some think they deserve. Natural variability just doesn’t sell that well.

“The results of Anderson et al. (7) suggest that in the past, the westerlies shifted asymmetrically toward the south in response to a flatter temperature contrast between the hemispheres. The magnitude of the shift seems to have been very large. If there was a response to higher CO2 back then, it paled in comparison. Changes in the north-south temperature contrast today are not going to be as large as they were at the end of the last ice age, but even small changes could be an additional source of modern climate variability. ”

Arctic sea ice loss with Antarctic sea ice gain is a change in the north-south temperature contrast. Whose carefully quantified arguments then avoid mentioning that? Until your fairy unicorn makes everything fair in life, you need to look at all the evidence.

That may be, but I often see it discussed quite directly in the work of climate scientists (who are often attacked by “skeptics”). I also see that when they do talk about natural variability, that discussion is either exploited (as with the nonsense that Latif was predicting “global cooling”) or fallaciously attacked (as in “those climate scientists don’t know what they’re doing because they used conditional terms like “could” or “might”!!!!11!!!!111!!!”).

The fact that they discuss natural variability quite openly is part of what lends credence to their perspective, IMO. The fact that some “skeptics” often diminish, exploit, and ignore the discussion of natural variability on the part of climate scientists is part of what diminishes my trust in their skepticism.

Disagreement about the magnitude and impact of natural variability is science. And sure, the measure of that magnitude might be subject to biased reasoning.

But distinguishing the noise from the signal, so to speak, w/r/t bias in estimating that measure is only set back by the kind of nonsense that permeates the climate blogosphere.

Joshua, “The fact that they discuss natural variability quite openly is part of what lends credence to their perspective, IMO. ”

That depends on how they discuss it. It can’t be completely ignored, but it can be marginalized. The confidence intervals, whose lower bounds nearly all models are pushing, give you an indication of whether natural variability has been properly considered; it doesn’t look like it to me.

Recognition of the “pause”, “hiatus”, “standstill” was slow in coming with the “team” and prompted a rewording of the IPCC attribution statement, the subject of this post. Now they have 95% confidence that carbon-related factors are responsible for at least half of the warming from 1951. Carbon-related can include more than just CO2-equivalent forcing, and some factors are much more easily mitigated, i.e. aerosols, black carbon. That could be an evasive shift. Changing the baseline for the instrumental model comparison could be evasion or incompetence, neither very comforting. In general there is a good deal of backpedaling going on, along with desperate attempts to retain some credibility. So while you may think that “mentioning” natural variability inspires credibility, low-balling the impact with baseline shifts and misdirection doesn’t impress me that way.

It’s not a surprise to me that people disagree on what’s logical, but I must say that I am surprised at how easily every one of us can believe that the only logical way of thinking on a very specific issue is the one that supports our general views on climate change.

It’s also clear that logical fallacies have a role in that. People who wish to disagree with John N-G avoid actively understanding him, and disagree with their fabrication of his views, while those who wish to agree may also misinterpret him in their own way.

Having studied brain evolution and how we attach empirical values to neural memory circuits, it becomes evident that the human brain’s ability to rationalise or use logic is not only profoundly overrated, but 100,000 years of the modern brain’s history demonstrates it is dangerous. Human development has made its most significant advances by testing and measuring relative gains. Be afraid of human intelligence: see Lt. Col. Custer, astrology, witchcraft, Mohamed, the Aztecs, etc., etc. The desire to logically follow the idiot in front is a sad recurring theme.

…but I must say that I am surprised at how easily every one of us can believe that the only logical way of thinking on a very specific issue is the one that supports our general views on climate change.

I’m surprised that you’re surprised. Seems to me like the dominant characteristic of these debates, and shouted out in the vast majority of comments. It is also predicted by what we know about human psychology and cognition.

People who wish to disagree with John N-G avoid actively understanding him, and disagree with their fabrication of his views,

Which is why, IMO, a precondition for any valid thesis is an accurate representation of alternative views (that meets with approval from one’s interlocutor) . The back-and-forth between Judith and JN-G has actually been relatively good in that aspect, unusually so. I would say that is because there is mutual respect, and it is much more difficult, although not impossible, to see good-faith exchanges w/r/t fair representation of alternative views when mutual respect is lacking. When mutual respect is lacking, it requires a stronger commitment to self-examination of biases.

What is so odd is that so many seem to think that there’s some way to reach a meaningful exchange of views via actively denigrating their interlocutor.

Governments and religions have arisen very recently in human evolution, during the time we (as a species) transitioned from small groups (clans) to settlements to cities. But most of our evolution took place in the small groups. Individuals didn’t fare too well on their own – fighting bears, hunting, collecting food, mating – all were better within a group. Consequently we evolved traits to keep us part of the group: don’t rock the boat, and adhere to, if not advocate for, the common understanding – EVEN IF IT WAS WRONG.

So while we have advanced brains for rational thought, at the same time we all have behaviors, forged by relentless natural selection, that are quite irrational.

Science is about repeatable objective measures, of course, on which we hope to converge (eventually). But it’s not surprising to find continued tribalism that keeps trapping everyone’s thinking.

What is to debate? Trying to reason with energy-deniers is a fool’s errand. When and if the day comes we must admit there will never be another dam or nuclear power plant or more drilling – that’s the day we must admit America is broken.

Then, there will be no debate. That is when we learn there will always be more Leftists searching for happiness in their mythical liberal Utopia than there will be space on the backs of society’s productive who must shoulder all of the dead weight.

If that day comes we might well ask, “How did we get here?” Based on what we see with the debate about the role humanity plays in global warming, we will have arrived at our dead-end destination by following a map based on one untested pet theory after another — from a limitless supply of unverifiable, next to worthless and mostly dead wrong pet theories — about reality that passes for science among the credulous.

Wag,
Plenty to debate. The problem is communicating honest information amid motivated activists. Reagan won the argument for a while with a mix of charm and facts, plus making sense. Free enterprise and capitalism are the foundation for the world’s rise in living standards. Phony fears are aimed at convincing the voting masses to cede control of the economy to an elite. The long-term temperature shows some rise, so shifting the focus to carbon emissions sidetracks the voters.
Scott

“Although some critics say the environmental movement has made a strategic error by focusing so much energy on the pipeline, no one disputes that the issue has helped a new breed of environmental organizations build a mostly young army eager to donate money and time.”

” In addition, about 76,000 people have signed a “pledge of resistance” sponsored by seven liberal advocacy groups in which they promise to risk arrest in civil disobedience if a State Department environmental analysis, expected this year, points toward approval of the pipeline.”

In other words, fighting Keystone was dumb, but it brings in money so they do it. And you gotta love the threat in the second paragraph: whatever your study shows, it must agree with our politics or else! Is there any reason to trust these activists when it comes to science and, conversely, is there any reason to trust the objectivity of a government study written under these circumstances?

So if the pro-AGW crowd completely removed the CO2 forcing from their predictive models and they got exactly the temperature trends recorded in the current and historical data, then we could say that CO2 has no net effect on global temperature. Alternatively, if the results were still different from the observed trends, then we could say they have not captured inherent and important natural (possibly cyclic) variations in temperature, in which case it would be safe to say their models are in no way capable of forecasting future global average temperature. But I think we already know that… Right now they do not have CO2 sensitivity right, nor have they captured natural cycles and forcings. Why would any rational person place any stock in predictions going out a hundred years?

We know that AGW has become a Left versus right political issue and because of that debate is futile. Accordingly, the only way to effectively oppose the perpetuation of global warming propaganda in both politics and the media is to — irrespective of your political persuasion — vote against the Left. Otherwise, you are helping to perpetuate the intentionally misleading intellectual dishonesty of the Left and are a part of the problem not the solution. It’s as simple as that.

If I’m understanding N-G’s point, I think I agree with his reading, but it seems like a semantic quibble. If we can be _more_ sure of AGW attribution because we all agree that the warming is much smaller than expected and we can’t think of any other cause that leads to an overall warming in the course of a century – then N-G seems to be acknowledging that the attribution is coupled with a lot less to worry about.

Pekka, I see what you’re saying, but I think in the politics of it Dr. Curry is probably right. Somehow, the one-line takeaway of many people from IPCC AR5 is “now 95% certain, instead of 90%”. I think one is within one’s rights to point out that that takeaway is very misleading. I don’t think that’s Curry’s fault. Most of us are less alarmed than we were five years ago, not more, and AR5 gives justification for that. And we are and should be less alarmed with each month of stagnation. If Dr. N-G has proof that more stagnation means more certainty of attribution, that just means that those who are not trying to mislead for political gain should be focusing on other indicators instead.

1) For rational decision making it does not matter whether it’s 90% or 95%.

2) I agree that some people involved in formulating the SPM were certainly happy when they had the chance of raising the certainty from 90% to 95%. They judged correctly that this leads to the news that the certainty has increased, in spite of the fact that this was due not to more certain knowledge but to a weakened statement.

Whether that’s the reason for now picking a longer period than before is not obvious, as it’s equally logical to keep the starting date unchanged as it is to keep the length of the period unchanged. (A longer period adds to the estimated human contribution, but leaves the observed overall change essentially at the previous value.)

You are glossing over something that is important, Pekka. Where is the justification for picking the 90% and then the 95% story? Where are the calculations and the evidence to back up the calculations? They are picking numbers out of their butts, Pekka.

Thanks for the clarification, Pekka. I essentially agree with what you have just written, except for your characterization that the debate over the IPCC’s 95% confidence con is stupid. It is not stupid to challenge them on their dubious language and tactics. They are making crap up for propaganda purposes. I believe, as you do, that there is reason to be concerned about the possibility of bad outcomes. The IPCC is not helping us get any closer to an understanding of the problem.

“What’s the likelihood of really severe outcomes? (I expect that to be well less than 50%, but that’s still high when we discuss such outcomes.)
How likely it is that some specific policy decision or other specific act has a positive net effect, when risk aversion is taken into account?”
+1. That’s it, exactly. I would agree that Dr. Curry is somewhat at fault here. The issue is not whether that IPCC statement is correct, it’s that it’s the wrong question to be discussing, and it’s distracting.

”Bottom line: with the growing recognition of the importance of natural variability, it is increasingly difficult to defend the bottom bound of 51% for anthropogenic forcing, hence I find the increase in confidence to ‘extremely likely’ to be unjustified.”

”For small changes in GATA, there is no need for any external cause. Earth is never exactly in equilibrium. The motions of the massive oceans where heat is moved between deep layers and the surface provides variability on time scales from years to centuries. Examples include El Nino, the Pacific Decadal Oscillation, the Atlantic Multi-decadal Oscillation, etc. Recent work suggests that this variability is enough to account for all change in the globally averaged temperature anomaly (GATA) since the 19th century. To be sure, man’s emissions of carbon dioxide must have some impact. The question of importance, however, is how much.”

I have understood that Judith Curry agrees with Richard Lindzen, according to whom the impact of anthropogenic CO2 emissions is low but not yet known.

I am especially concerned for IPCC scientists who have been determined to focus one-sidedly on the scientific background of anthropogenic climate change, where the lack of appropriate empirical observations has made them overspecialize in hypothetical, non-working climate model simulations. As a consequence, I understand Judith Curry’s statement: ”There is a growing discrepancy between observations and climate model projections.”

Here I have often emphasized the method of cross-disciplinary approach, which helps us focus adequately on all potential factors influencing the multi-scientific climate problem. By analysing them, one can establish the principal qualifications of a synthesis for a working solution of the problem. On the basis of my own experience, open-minded researchers can learn the cross-disciplinary approach to any problem by doing. If one is not open-minded enough, reaching any working solution is a matter of coincidence. I am surprised that even here, in this debate, the cross-disciplinary approach has not been more emphasized. In addition to my own comments, I remember only one commenter who has straightforwardly proposed more cross-disciplinary research in this forum (I don’t remember who).

In addition to anthropogenic CO2 emissions to the atmosphere there are plenty of other potential factors influencing the global temperature. Because the IPCC has focused on the recent warming potentially caused by anthropogenic CO2 emissions to the atmosphere, I state here my own view on the real role of anthropogenic CO2 emissions in the recent global warming:

– On the basis of ice core proxies it has been shown that during glacial and interglacial periods the rise of global average CO2 content has followed long-term warming, and not vice versa.

– In spite of the exponential increase of CO2 content in the atmosphere during the last 17 years, there has not been any distinguishable average global warming.

– The long-term average CO2 content in the atmosphere is controlled mainly by sea surface temperature in the areas where the sea surface CO2 sinks are. E.g. in my comment http://judithcurry.com/2011/08/04/carbon-cycle-questions/#comment-198992 I have demonstrated the mechanism by which global warming of the sea surface makes the CO2 content in the atmosphere rise: ”When interpreting Tisdale’s claim on global SST anomalies and on NINO3.4 SST anomalies during 1976-2009, you can find that during the same time period there has been no essential rising or sinking trend in the tropical sea surface temperatures. Instead, the global mean sea surface temperature has had a continuous warming trend. What is the meaning of this? It means that the global sea surface temperatures used by Endersbee in his calculations have been controlled by warming of the sea surface waters outside the tropics, i.e. mainly the warming of the sea surface waters of higher latitudes, where the sea surface CO2 sinks are. As a consequence, the partial pressure of CO2 has been rising in these sink-acting surface waters, which has been slowing CO2 absorption from the atmosphere into the sea surface sinks. Because of that, the CO2 content in the atmosphere has been increasing. It means that more CO2 from the total CO2 emissions to the atmosphere has remained in the atmosphere to increase its CO2 content, in order to reach a new dynamic balance between CO2 emissions and absorptions. As the warming of the oceans is the dominant reason for the increased content of atmospheric carbon dioxide, and as nowadays the human yearly portion (about 8 GtC) of all yearly CO2 emissions (a little over 200 GtC) to the atmosphere is about 4%, the human role in the recent yearly increase of CO2 in the atmosphere is also about 4%. For instance, when the CO2 content in the atmosphere increases 2 ppm per year, the human portion of that is only about 0.08 ppm.”
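Only the arithmetic at the end of the comment can be checked directly; the premise that the human share of the yearly CO2 rise equals the human share of gross yearly emissions is the commenter's own and is exactly what mainstream carbon-budget accounting disputes. A minimal sketch using the commenter's figures:

```python
# Checking only the arithmetic of the comment above, on the commenter's
# own (non-mainstream) premise. Figures (8 GtC/yr human, ~200 GtC/yr
# total gross emissions, 2 ppm/yr rise) are taken from the comment.

human_emissions = 8.0     # GtC/yr, per the comment
total_emissions = 200.0   # GtC/yr, per the comment ("little over 200")
human_share = human_emissions / total_emissions

print(round(human_share * 100.0, 1))   # 4.0 (%), as the comment states
print(round(human_share * 2.0, 2))     # 0.08 ppm of a 2 ppm/yr rise
```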

These should make anybody understand that an increase of CO2 content in the atmosphere does not control global warming; that the recent share of anthropogenic CO2 emissions has not controlled the CO2 content in the atmosphere; that the influence of recent anthropogenic CO2 emissions on global warming has been ‘indistinguishable from zero’; and that therefore there is no reason for curtailment of anthropogenic CO2 emissions.

Lauri,
Interesting post. In your view, rising temperature at the high-latitude global sinks releases CO2. Lots of people disagree. Lots of unknowns, such as the temperature of the water and the partial pressure of CO2 as it sinks. Good thoughts, though.
Scott

‘Precipitation over the contiguous United States exhibits large multi-decadal oscillations since the early twentieth century, and they often lead to dry (e.g., 1946–1976 and 1999-present) and wet (e.g., 1977–1998) periods and apparent precipitation trends (e.g., from the 1950s to 1990s) over most of the western and central US. The exact cause of these inter-decadal variations is not fully understood. Using observational and reanalysis data and model simulations, this paper examines the influence of the Inter-decadal Pacific Oscillation (IPO) on US precipitation. The IPO is a leading mode of sea surface temperatures (SSTs) seen mostly in the Pacific Ocean. It is found that decadal precipitation variations over much of the West and Central US, especially the Southwest, closely follow the evolution of the IPO (r = 0.85 during 1923–2010 for the Southwest US), and the dry and wet periods are associated, respectively, with the cold and warm phases of the IPO. In particular, the apparent upward trend from the 1950s–1990s and the dry decade thereafter in precipitation over much of the West and Central US are largely caused by the IPO cycles, which switched to a warm phase around 1977 and back to a cold phase around 1999.’ http://link.springer.com/article/10.1007%2Fs00382-012-1446-5

It is not just rainfall in the US – and it is not just rainfall. Hoping the cool phase will turn around any time soon is getting funnier all the time.

“If natural cycles are regular and repeatable, the net temperature change over one complete natural cycle will be approximately zero. The warming during part of the cycle is cancelled by cooling during the other part of the cycle. What’s left is the long-term rise caused by man.”

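The quoted argument can be sketched numerically. A toy model (all numbers hypothetical, chosen only for illustration), assuming a perfectly regular cycle riding on a linear trend:

```python
import math

# Toy sketch of the quoted argument: over one COMPLETE regular cycle
# the periodic part cancels, leaving only the underlying trend.
PERIOD = 60.0   # years, a hypothetical AMO-like cycle
TREND = 0.01    # degC/yr, hypothetical underlying (man-made) rise
AMP = 0.2       # degC, hypothetical cycle amplitude

def temperature(t):
    """Linear trend plus a regular, repeatable natural cycle."""
    return TREND * t + AMP * math.sin(2.0 * math.pi * t / PERIOD)

# Net change over one complete cycle: the sine cancels exactly.
print(round(temperature(PERIOD) - temperature(0.0), 6))        # 0.6 = TREND * PERIOD

# Over an INCOMPLETE cycle it does not cancel, so window length matters.
print(round(temperature(PERIOD / 4.0) - temperature(0.0), 6))  # 0.35 = 0.15 trend + 0.2 cycle
```

The second print is the catch the surrounding replies point at: the cancellation only works if the averaging window is an exact multiple of the cycle.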

Yep. If my aunt had balls, she’d be my uncle.

So-called “natural cycles” are neither regular nor repeatable. The climate system is in a kind of self-organized criticality, and systems around a second-order critical point exhibit scale-invariant behavior in both space and time, with long-range correlations and power-law distributions of their parameters. This means there are lots of low-frequency fluctuations, which in a short enough time window may look cyclic, but they never really are.

Therefore what is left at the end of a “cycle” is the combined effect of lower frequencies with a possible contribution from “anthropogenic forcing”, but the magnitude of the latter can’t be determined this way.
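The point about low-frequency fluctuations masquerading as cycles can be sketched with a toy red-noise process; the AR(1) persistence value and window length below are arbitrary choices, not a climate model:

```python
import random

# Toy sketch: unforced red noise (a simple AR(1) process) has strong
# low-frequency fluctuations, so short windows routinely show "trends"
# even though the generating process has no trend and no periodicity.

random.seed(42)
PHI = 0.95                     # persistence, hypothetical
x, series = 0.0, []
for _ in range(1440):          # e.g. 120 years of monthly anomalies
    x = PHI * x + random.gauss(0.0, 0.1)
    series.append(x)

def ols_slope(ys):
    """Ordinary least-squares slope of ys against its index 0..n-1."""
    n = len(ys)
    mean_t = (n - 1) / 2.0
    mean_y = sum(ys) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in enumerate(ys))
    den = sum((t - mean_t) ** 2 for t in range(n))
    return num / den

# Fitted "trends" in successive 30-year (360-sample) windows wander
# away from the true value of zero, with no forcing present at all.
for start in range(0, 1440, 360):
    print(round(ols_slope(series[start:start + 360]), 5))
```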

I am not sure what you just said, but the idea that climate cycles have to repeat is almost creationist in its confidence that the planet was put here for us to live on in comfort, and touching in its naive simplicity.

Thank you. Excellent link. I’d been doing some light reading on and off about self-organizing dissipative structures (Prigogine) which are deeply related to self-organized criticality. Years ago my initial interest was in the origin and evolution of life which is IMO the king of emergent order.

Webby doesn’t like this of course because conservation isn’t diagnostic in open systems far from equilibrium.

This is why the “hockey stick” is a foundational piece of the whole edifice. You need the blade of the stick, sure, but you also need the flat handle if you are going to reason that any divergence from it is caused by some knowable forcing that can be input into a GCM, such as volcanoes or ACO2.

Is N-G practicing Aristotelian science? I thought that method was overturned some years ago; seemingly it could lead to erroneous conclusions and a false sense of confidence.

“But here’s the thing. If, over 60 years, natural variability averages out to zero, it doesn’t matter how strong natural variability is compared to man-made climate change; what’s left over is the man-made part.”

“If natural cycles are regular and repeatable, the net temperature change over one complete natural cycle will be approximately zero.”

Maybe over 4.62 Kyr and ~100 Kyr, but not century to century. A number of cycles running in and out of phase can take many centuries to repeat.

“The smaller the rate of warming, the smaller the possibility that a separate, additional cause of warming is being missed, and that, therefore, greenhouse gases account for most or all of the total amount of warming.”

The smaller the warming, the greater the natural forcing must be, and as it is now cooling, the natural forcing must be the largest forcing.