New paper from Nic Lewis on climate sensitivity corrects ‘prior’ errors in an IPCC AR4 reference

Climate sensitivity distributions retained (and in some cases recast) by the IPCC from their assessment of the literature. Source: IPCC AR4

Nic Lewis has a new post at Climate Audit that deals with some assumptions that went into IPCC AR4’s use of a uniform prior for estimating climate sensitivity.

He has shown it to be faulty, to a degree that would normally be cause for retraction, but this is Climate Science, where being wrong is simply a shade of grey, neither black nor white.

He writes:

Frame and Allen’s original graph (Figure 1) showed that use of a uniform prior in ECS gives a very high 95% upper bound for climate sensitivity, whereas a uniform prior in Feedback strength (the reciprocal of ECS) – which declines with ECS squared – gives a low 95% bound. A uniform prior in the observable variables (AW and EHC) also gives a 95% bound under half that based on a uniform in ECS prior; using a prior that is uniform in transient climate response (TCR) rather than in AW, and is uniform in EHC, gives an almost identical PDF.

However, the Frame et al 2005 claim that high sensitivity, high heat uptake cases cannot be ruled out is incorrect: such cases would give rise to excessive ocean warming relative to the observational uncertainty range. It follows that Frame and Allen’s proposal to use a uniform in ECS prior when it is ECS that is being estimated does not in fact answer the question they posed, as to what the study tells one about ECS given no prior knowledge about it. Of course, I am not the first person to point out that Frame and Allen’s proposal to use a uniform-in-ECS prior when estimating ECS makes no sense. James Annan and Julia Hargreaves did so years ago.

…

The noninformative prior used for method 2 is shown in Figure 3. The prior is very highly peaked in the low ECS, low Kv corner, and by an ECS of 5°C is, at mid-range Kv, under one-hundredth of its peak value. What climate scientist using a Subjective Bayesian approach would choose a joint prior for ECS and Kv looking like that, or even include any prior like it if exploring sensitivity to choice of priors? Most climate scientists would claim I had chosen a ridiculous prior that ruled out a priori the possibility of ECS being high. Yet, as I show in my paper, use of this prior produces identical results to those from applying the transformation of variables formula to the PDFs for AW and EHC that were derived in Frame et al 2005, and almost the same results as using the non-Bayesian profile likelihood method.

Figure 3: Noninformative Jeffreys’ prior for inferring ECS and Kv from the (AW, EHC) likelihood. (The fitted EHC distribution is parameterised differently here than in my paper, but the shape of the prior is almost identical.)

…

Whilst my paper was under review, the Frame et al 2005 authors arranged a corrigendum to Frame et al 2005 in GRL in relation to the likelihood function error and the miscalculation of the ocean heat content change. They did not take the opportunity to withdraw what they had originally written about choice of priors, or their claim about not being able to rule out high ECS values based on 20th century observations. My paper[v] is now available in Early Online Release form, here. The final submitted manuscript is available on my own webpage, here.
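The quoted point — that a prior uniform in feedback strength "declines with ECS squared" — is just the change-of-variables rule for probability densities. A minimal numerical sketch of that Jacobian effect (illustrative numbers only, not Lewis's calculation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Feedback strength lam = 1/ECS. A prior uniform in lam on [0.1, 1.0]
# (i.e. ECS between 1 and 10) implies, by change of variables,
#   p_ECS(S) = p_lam(1/S) * |d lam / dS|  ~  1 / S^2
lam = rng.uniform(0.1, 1.0, size=1_000_000)
ecs = 1.0 / lam

# Empirical check: the implied prior mass near ECS = 2 should be about
# 4x the mass near ECS = 4, because (1/2^2) / (1/4^2) = 4.
mass_near_2 = np.mean((ecs >= 1.9) & (ecs < 2.1))
mass_near_4 = np.mean((ecs >= 3.9) & (ecs < 4.1))
ratio = mass_near_2 / mass_near_4
print(ratio)  # close to 4
```

So a "uniform" prior is only uniform in one particular parameterisation; restated in ECS it strongly favours low values, which is why the two priors yield such different 95% bounds.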

“…but this is Climate Science, where being wrong is simply a shade of grey, not black, nor white.”

Being wrong, I’d add, is simply irrelevant, certainly where the IPCC is concerned. As long as the error supports hysterical alarmism, it’s not a problem. Critics of the most risible errors imaginable (as with the Himalayan glaciers), instead of being taken seriously, are called practitioners of Voodoo Science and “deniers.” It’s not about being right. It’s about the message, and the perpetuation of their own influence.

I doubt this is a case of an error that requires retraction of the original paper. This seems more like a paper that proposes an improvement on a procedure that will effect a more accurate outcome. That kind of fine-tuning happens all the time in scientific circles without the need for retraction, and is indeed how science progresses.

Justification for the use of a uniform prior probability density function (PDF) is lacking in this particular application of Bayes’ theorem. While the uniform prior PDF is uninformative, it can be shown that infinitely many non-uniform prior PDFs are equally uninformative. Each of these uninformative prior PDFs generates a different posterior PDF, with consequent violation of the law of non-contradiction. That’s a logical no-no.

So if surface temperatures are less sensitive to GHGs than previously thought, where does all the excess heat go? Wouldn’t most of it go into the oceans, or to melting ice, which then leads to greater ocean acidification, thermal expansion, and sea level rise? Gee, I feel so much better now that these “errors” of the IPCC have been corrected.

Adding what Pamela Gray said, I’d like to point out that merely being wrong is not grounds for retraction of a paper. Generally retractions are the result of fraud or misconduct of some sort. Occasionally there will be a retraction for a paper’s major conclusions being wrong, but generally this only happens if the conclusions are based on gross misinterpretation of available data (such as using a statistical test that any reasonable scientist knows is inappropriate for what you’re testing), or if they’re based on a horribly faulty premise (Not just an unlikely premise, but one that there is no way it could be true).

So if surface temperatures are less sensitive to GHGs than previously thought, where does all the excess heat go? Wouldn’t most of it go into the oceans, or to melting ice, which then leads to greater ocean acidification, thermal expansion, and sea level rise?

Odd.

The rate of sea level rise is not changing, ocean temperatures are not measurably changing, Antarctic sea ice is setting all-time record highs while Arctic sea ice is within natural variation (within +/- 2 standard deviations of its recent mean levels), and ocean acidification is not occurring (the oceans are buffered, and so their pH is not changing).

The whole concept of a noninformative prior is a farce. How can a prior distribution be noninformative when a choice must be made of the type of distribution and its parameters?

The simplest and oldest rule for determining a non-informative prior is the principle of indifference, which assigns equal probabilities to all possibilities.[Wikipedia: uninformative priors]
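The principle of indifference is only one of several competing definitions of "uninformative". The prior Lewis uses is a Jeffreys prior, derived from the Fisher information of the model rather than assigned by fiat, and it is generally not uniform. A toy sketch for a Bernoulli model (a textbook example, not the climate problem):

```python
import numpy as np

# Jeffreys prior for a Bernoulli(theta) likelihood: proportional to
# sqrt(Fisher information) = sqrt(1 / (theta * (1 - theta))).
# It is 'noninformative' in the sense of being invariant under
# reparameterisation, yet it is far from uniform.
theta = np.linspace(0.01, 0.99, 99)
jeffreys = np.sqrt(1.0 / (theta * (1.0 - theta)))  # unnormalised density

# The density near the endpoints dwarfs the density at theta = 0.5,
# even though the prior encodes no preference between outcomes.
peak_ratio = jeffreys[0] / jeffreys[49]  # theta = 0.01 vs theta = 0.5
print(peak_ratio)  # about 5
```

This is the same phenomenon the commenter objects to: "noninformative" is a property relative to a model and a parameterisation, not an absence of choices.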

The crime against objectivity committed by Fischer (and all reviewers and defenders) was in the choice of a uniform distribution where an ECS of 1 or 18 was just as probable as 2, 3, or 4, but a value of 20 was NOT POSSIBLE. How can that choice be noninformative? The high end of the uniform could have been any number. How can one limit value be less informative than another? It can be, and certainly is, wrong. But Fischer and supporters hide a blatantly biased technique behind a “noninformative” label.

As Pamela Gray suggests, science is a continuum where correcting errors is part of that continuum. However, we are in a political climate where the “science is settled” and, whether drought or flooding occurs in California, it is attributed to the scientifically meaningless phrase “Climate Change”.

As far as retraction goes, it is a matter of degree; fine-tuning will not fix errors or an inappropriate premise.

Sea levels are continuing to rise due to ice melt and thermal expansion of the oceans, which requires heat. Ocean temperatures are rising (where did you get the idea that they are not? See: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/).

Also, it’s well known that Arctic sea ice is decreasing on decadal timescales; meanwhile, Antarctic sea ice may be expanding because Antarctic land ice is melting and refreezing when it reaches the ocean (since it is less dense than seawater and freezes at a higher temperature than seawater)… which adds to sea level rise, since as we all know, melting ice cubes don’t overflow your drink glass, but adding ice certainly may. :)

Sea levels are continuing to rise due to ice melt and thermal expansion of the oceans, which requires heat. Ocean temperatures are rising (where did you get the idea that they are not?

Why don’t you try some “science” and give me some calculations for that: Give me the increased volume of seawater you claim is increasing due to “ice melt” and “thermal expansion” (when ocean temperatures have risen less than a few hundredths of one degree C since the mid-1970s, and are not rising faster today (2014) than before). Show me how much “ice” needs to melt to cause your claimed increase in the rate of sea level rise since 1980–1990. (Sea levels are increasing at the same rate as before the 1940s – when air temperatures were higher than today’s values, but CO2 much less.)

Rising water temperatures lead to acidification (and the changes in pH are not negligible, but perhaps you don’t understand logarithmic scales).

No, rising ocean temperatures (which are, as mentioned, very, very small) do not cause a decrease in ocean pH. An imagined increase in dissolved CO2 might, but the oceans are strongly buffered, and will not change pH by any more than the usual day-to-night swing seen every evening in shallow waters worldwide.

Also, it’s well known that Arctic sea ice is decreasing on decadal timescales; meanwhile, Antarctic sea ice may be expanding because Antarctic land ice is melting and refreezing when it reaches the ocean (since it is less dense than seawater and freezes at a higher temperature than seawater)… which adds to sea level rise, since as we all know, melting ice cubes don’t overflow your drink glass, but adding ice certainly may. :)

Try this “science” thingy, do some math for a change rather than spout your well-memorized propaganda: Arctic sea ice area is within 2 standard deviations of the recent (1970–1980) average level for this date. That means Arctic ice levels are within the estimated natural variation for today’s date – which is the only date that matters in the Arctic. Last year’s ice, 2012’s sea ice, 2011’s, 2010’s sea ice is long gone and has no more influence than Pinatubo.

Continued loss of Arctic sea ice from today’s levels from late August through March will only INCREASE heat loss from the ocean, since the little bit of extra SW radiation absorbed when the low-angle sun’s rays hit an open Arctic Ocean during those seven months is more than made up for by the increased LOSS of heat energy by conduction, convection, LW radiation into space, and evaporation when the Arctic is ice-free rather than ice-covered.

And ice-freezing on the surface CANNOT cause a sea level rise. A Greek named Archimedes figured that out long before your so-called education began.

Let’s try that “science thingy” again: Just how much Antarctic land ice is needed to dilute the Antarctic ocean 500 to 900 kilometers AWAY from the Antarctic coast to cause an increase in sea ice coverage of 2.0 million square kilometers of 2-meter-thick ice? Come on: Give me the mass of “lost ice” required to spread out under 13 million square kilometers of “normal extents” sea ice, then continue to spread out to dilute the ocean water enough to freeze 2 million MORE square kilometers of “excessive” Antarctic sea ice! Give me the distance from shore of the edge of the Antarctic sea ice right now (on average), and tell me how many cubic meters of water must be diluted for your fantasy to be valid.

Oh, by the way: There is no measured ice loss across the Antarctic continent, only a tiny bit on the West Peninsula near the coast. And Arctic temperatures per the DMI at 80 north this year have never even approached the “average” established since 1958: They have not been above “average” even ONCE this summer (since the sun rose back near calendar day 120)! And winter air temperatures of −20 and −25 C (even if they are above average) don’t “melt” ice very well.

A uniform prior entails equal weight to all possibilities of parameter values. The argument of this piece is that this is not appropriate. If so, then how should we change the prior? Be specific! What parameters? What distributions? How? Most of the rhetoric posted here is Freshman Calculus for a Bayesian statistician and seems to default to right/wrong zero-one BS. Bottom line: one does not have to nail a prior absolutely perfectly to trust posterior conclusions to a modicum.

Sea levels are continuing to rise due to ice melt and thermal expansion of the oceans, which requires heat. Ocean temperatures are rising (where did you get the idea that they are not? See: http://www.nodc.noaa.gov/OC5/3M_HEAT_CONTENT/).

Wrong. That’s what you get for listening to the NODC. Here is a blink gif of the NODC’s “adjusting” the data. Don’t be naive when the gov’t asserts something. Be skeptical! The government has a vested interest in promoting scares. Don’t believe them. Be a skeptic.

The rate of sea level rise is not changing, ocean temperatures are not measurably changing, Antarctic sea ice is setting all-time record highs while Arctic sea ice is within natural variation (within +/- 2 standard deviations of its recent mean levels), and ocean acidification is not occurring (the oceans are buffered, and so their pH is not changing).

That is correct. You quote Newsweak. They are sensational, but not scientific. You write:

Rising water temperatures lead to acidification (and the changes in pH are not negligible, but perhaps you don’t understand logarithmic scales).

Next, the ocean “acidification” scare is alarmist nonsense. But don’t take my word for it. Visit this database and learn for yourself. If that is too much info for you, then read David Middleton’s excellent deconstruction of the pH scare here.

Next, you say:

…it’s well known that Arctic sea ice is decreasing on decadal timescales; meanwhile, Antarctic sea ice may be expanding because Antarctic land ice is melting and refreezing when it reaches the ocean (since it is less dense than seawater and freezes at a higher temperature than seawater)… which adds to sea level rise, since as we all know, melting ice cubes don’t overflow your drink glass, but adding ice certainly may.

Wrong again. Sea level rise has not accelerated, as documented above. The ocean heat content [OHC] is not rising fast. See the ARGO data above. Arctic ice was declining until recently, but total global ice cover is at its 30-year average. That is because the Antarctic [which is rarely mentioned by the alarmist crowd] has 10x the ice that the Arctic has, and that ice is increasing.

The climate Null Hypothesis states that nothing unusual or unprecedented is occurring, and the Null Hypothesis has never been falsified. Try to be skeptical. There are lots of self-serving connivers out there, trying to scare you for their own benefit. For the most part, they are lying.

Visit the newspaper articles and media reports around the world at the time. From British Prime Minister Gordon Brown to Australian Prime Minister Kevin Rudd … from Al Gore to Prince Charles … they were all swinging from the same branch.

Well, as we have seen since AR4 was released, it was all propaganda.

AR4 was all about those computer model derived trends reflected in those charts of rising temperature trends based on various rising human activity CO2 emissions scenarios. Remember?

And those charts have now all been debunked by none other than Mother Nature, which has revealed a flat global average temperature trend for almost 18 years.

lundasoid@hotmail.com: Bottom line: one does not have to nail a prior absolutely perfectly to trust posterior conclusions to a modicum.

You seem to suggest that Frame’s exact choice of prior doesn’t really matter too much to the posterior. But James Annan’s paper shows (in table 1) that a prior of a uniform distribution on [0, 10] results in a posterior with an upper 95% limit of 6.9, while a prior of a uniform distribution on [0, 20] results in a posterior with an upper 95% limit of 12.3. That’s quite a difference. I don’t see how we can trust both to a modicum.
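That sensitivity to the truncation point is easy to reproduce with a toy model. The sketch below is not Annan's calculation; it just assumes (hypothetically) a Gaussian likelihood on the feedback 1/ECS, which produces the long upper tail in ECS that makes the prior cutoff matter:

```python
import numpy as np

def upper_bound_95(prior_max: float, n: int = 200_001) -> float:
    """Posterior 95% upper bound on ECS under a uniform [0, prior_max]
    prior, given a toy Gaussian likelihood on feedback = 1/ECS
    (observed feedback 0.33 +/- 0.15, i.e. roughly 'ECS about 3')."""
    s = np.linspace(1e-3, prior_max, n)
    like = np.exp(-0.5 * ((1.0 / s - 0.33) / 0.15) ** 2)
    post = like / like.sum()   # uniform prior: posterior proportional to likelihood
    cdf = np.cumsum(post)
    return float(s[np.searchsorted(cdf, 0.95)])

b10 = upper_bound_95(10.0)
b20 = upper_bound_95(20.0)
print(b10, b20)  # the [0, 20] bound is far higher than the [0, 10] one
```

Because the likelihood never falls to zero as ECS grows (1/ECS goes to 0, which is still consistent with the observation), the "uniform" prior's arbitrary cutoff directly sets how much of that tail survives into the posterior.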

Further to dbstealey’s excellent comments putting facts in context, where does justsayin get the idea that Antarctic land ice is melting (see: justsayin says: July 30, 2014 at 4:57 pm)?

How can Antarctic land ice be melting to any significant degree, since, by definition, it is not in contact with the ocean (which is warm in relative terms) and therefore not subject to melting from below (powered by the relative warmth of the oceans), and air temperature over the continent is well below freezing? According to Wikipedia: “The mean annual temperature of the interior is −57°C (−70°F). The coast is warmer. Monthly means at McMurdo Station range from −26°C (−14.8°F) in August to −3°C (26.6°F) in January.[8] At the South Pole, the highest temperature ever recorded was −12.3°C (9.9°F) on 25 December 2011.[9]” See generally http://en.wikipedia.org/wiki/Climate_of_Antarctica

Of course, there are small areas around the coast which, for relatively short periods of the year, are warmer and above freezing, but significant melt from these areas cannot be happening since for most of the year they are below freezing. As Wikipedia observes: “Along the Antarctic Peninsula, temperatures as high as 15°C (59°F) have been recorded, though the summer temperature is below 0°C (32°F) most of the time.”

If the pause continues through to 2018/19, in my opinion there will not be an AR6, since it would have to fess up and admit that the computer projections are departing too far from reality (they will by then be outside the 95% confidence level), and between now and 2018/19 it is probable that we will see more and more papers suggesting ever lower figures for climate sensitivity.

For now, the IPCC has dodged the issue of sensitivity on the basis that there was no consensus. But should the pause continue through to 2018/19, it is likely that the consensus will be that climate sensitivity is unlikely to be above 2 (or much above 2), and the bell curve will show significant area towards the 1.3 end.

And should it actually begin to cool between, say, 2015 and 2018/19, the data sets will show a negative linear trend for the entirety of this millennium (perhaps not significant, but negative nonetheless). The IPCC will face all sorts of problems if, by then, the pause has come to an end and cooling has actually begun.

There is a good chance that this farce will not survive another 5 years. It is a question of how much damage will have been done before then. The only saving grace is that at Rio, China made it clear that it would do nothing before 2020, and the economic problems the developed world has faced have prevented the developed nations from rushing as quickly as they had intended towards decarbonising their economies.

You yet again display your arrogance, ignorance and stupidity when at July 30, 2014 at 7:13 pm you write, saying in total:

“The climate Null Hypothesis states that nothing unusual or unprecedented is occurring, and the Null Hypothesis has never been falsified.”

1. This is not a null. There is no quantifiable statement that can be falsified.
2. It is not a null that is related to the core issue, to wit: CO2 causes warming.

Oh dear! Those assertions are completely wrong.

This is not the first time you have had this explained to you. Another of your recent excursions into displaying your ignorance of the scientific method was when you asserted there was no scientific Null Hypothesis until the 1930s, when Fisher introduced the Null Hypothesis to statistics! But, in your arrogance, you refuse to learn and, instead, proclaim your stupidity.

I explain the matter again, and I hope that this time you will read it and do that rare thing for you – learn from it.

The Null Hypothesis says it must be assumed a system has not experienced a change unless there is evidence of a change.

The Null Hypothesis is a fundamental scientific principle and forms the basis of all scientific understanding, investigation and interpretation. Indeed, it is the basic principle of experimental procedure where an input to a system is altered to discern a change: if the system is not observed to respond to the alteration then it has to be assumed the system did not respond to the alteration.

In the case of climate science there is a hypothesis that increased greenhouse gases (GHGs, notably CO2) in the air will increase global temperature. There are good reasons to suppose this hypothesis may be true, but the Null Hypothesis says it must be assumed the GHG changes have no effect unless and until increased GHGs are observed to increase global temperature. That is what the scientific method decrees. It does not matter how certain some people may be that the hypothesis is right because observation of reality (i.e. empiricism) trumps all opinions.

Please note that the Null Hypothesis is a hypothesis which exists to be refuted by empirical observation. It is a rejection of the scientific method to assert that one can “choose” any subjective Null Hypothesis one likes. There is only one Null Hypothesis: i.e. it has to be assumed a system has not changed unless it is observed that the system has changed.

However, deciding a method which would discern a change may require a detailed statistical specification.

In the case of global climate no unprecedented climate behaviours are observed so the Null Hypothesis decrees that the climate system has not changed.

Importantly, an effect may be real but not overcome the Null Hypothesis because it is too trivial for the effect to be observable. Human activities have some effect on global temperature for several reasons. An example of an anthropogenic effect on global temperature is the urban heat island (UHI). Cities are warmer than the land around them, so cities cause some warming. But the temperature rise from cities is too small to be detected when averaged over the entire surface of the planet, although this global warming from cities can be estimated by measuring the warming of all cities and their areas.

Clearly, the Null Hypothesis decrees that UHI is not affecting global temperature although there are good reasons to think UHI has some effect. Similarly, it is very probable that AGW from GHG emissions is too trivial to have observable effects.

The feedbacks in the climate system are negative and, therefore, any effect of increased CO2 will probably be too small to discern because natural climate variability is much, much larger. This concurs with the empirically determined values of low climate sensitivity.

Indeed, because climate sensitivity is less than 1.0°C for a doubling of CO2 equivalent, it is physically impossible for the man-made global warming to be large enough to be detected (just as the global warming from UHI is too small to be detected). If something exists but is too small to be detected then it only has an abstract existence; it does not have a discernible existence that has effects (observation of the effects would be its detection).

To date there are no discernible effects of AGW. Hence, the Null Hypothesis decrees that AGW does not affect global climate to a discernible degree. That is the ONLY scientific conclusion possible at present.
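The earlier concession that discerning a change "may require a detailed statistical specification" is where the practical content of any null hypothesis lives. A minimal illustration with synthetic data (an assumed toy series, not real temperatures): an ordinary least-squares trend test against the no-change null.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic 'anomaly' series: a small trend (0.01 per year) buried in
# year-to-year noise (sd 0.1). Whether the no-change null is rejected
# depends on trend size, noise level, and record length.
years = np.arange(50)
series = 0.01 * years + rng.normal(0.0, 0.1, size=years.size)

res = stats.linregress(years, series)
print(res.slope, res.pvalue)  # reject the no-trend null if pvalue < 0.05
```

The point of the exercise is that "observed to respond" only becomes meaningful once the test, the noise model, and the record length are specified.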

1. This is not a null. There is no quantifiable statement that can be falsified.

That sword cuts both ways, of course. We have gotten far away from normal scientific method and into the post-normal science realm. I will even go so far as to say that (proper) PNS evaluation is all we have in cases where there are large unknowns, and is not an inappropriate tool.

But climatologists tend to abuse this valuable-but-dangerous tool as badly as today’s Keynesians abuse the theories of Keynes. Comfort-zone hunch becomes “probability” before you can say “four out of five dentists recommend”.

After all that, my PNS hunch puts me in with the “97%”, whose exceedingly broad definition, of course, includes the lukewarmers. But Curry, Spencer, Lindzen, N-G, Christy (and many other prominent skeptics), even including Anthony, all fall into that category. We don’t ask “how”, we ask “how much”.

2. It is not a null that is related to the core issue, to wit: CO2 causes warming.

It appears to. At least the Arrhenius experiment can be reproduced in the lab. And it does roughly correlate with our admittedly flawed observations.

There does appear to be a net radiation imbalance, in spite of the huge unknowns. We are seeing a “stepladder” progression of flat and warm periods (negative IPO/PDO phase followed by positive). Skeptics point at the flat periods and alarmists point at the warming periods. But one must take the average in order to arrive near the truth.

It is only lukewarming (at ~1°C/century, or even less if microsite is what we expect) since 1950, but it is what one would expect from a mild, constant upward pressure from CO2 and other anthropogenic contributions (e.g., Arctic soot).

@lundasoid@hotmail.com 7/31 at 7:35 am: “MJW: Let’s not take a statement about means and central tendency and extrapolate it to extreme quantiles.”

The entire point of Frame et al 2005 was to justify a high ECS 95% upper bound. Their method for doing so was to choose a prior distribution that assumed an ECS of 18 was just as probable as 2. It may be a noninformative distribution, but it was not an intelligent or objective one. Nor was it an honest attempt.

Frame 2005 was a deliberate political decision to rescue an unlikely high ECS by hiding behind an absurd prior distribution that pretends to be objective by wearing a “noninformative” label.
Garbage In, Garbage Out.
Politics In, Politics Out.

evanmjones says:
July 31, 2014 at 8:29 am
1. This is not a null. There is no quantifiable statement that can be falsified.

That sword cuts both ways, of course. We have gotten far away from normal scientific method and into the post-normal science realm. I will even go so far as to say that (proper) PNS evaluation is all we have in cases where there are large unknowns, and is not an inappropriate tool.
###################

huh, you are not making much sense here.
We have a very simple example: Arrhenius.
In the 1890s he made a quantifiable statement about the effect of CO2,
namely that doubling CO2 would INCREASE temperatures, not decrease them.
The null was quantifiable:
if CO2 doubles, temperatures will not change.
Since he made his prediction – which has NOTHING to do with “unprecedented” – CO2 has increased and temperature has increased. We can say that over that time frame (a century) the null looks pretty bad. Of course that doesn’t mean we know with certainty HOW MUCH warming, but we have evidence, good evidence, that a null of NO change in temperature hasn’t done very well.

The point remains that the “thesis” of natural variability as the “cause” is busted.
It’s busted for two reasons.

A) Natural variability is NOT a cause; it is the effect. “Something” causes the climate to vary. Saying that natural variability explains or causes natural variability is not a falsifiable statement, IN PRINCIPLE or in practice.
B) The “natural variability” “null” is not quantifiable. You can’t test it.

We could say that solar, volcanoes, GHGs, ocean cycles, GCR, and land use all drive the variability in the climate, but saying that “natural variability” explains the climate is a no-op.

We could say we understand x% of the climate and y% is unexplained. We could call these unexplained bits “natural variability”, but even that would not make natural variability an explanatory variable.

As for unprecedented: it was probably warmer in the MWP than today. That does not entail that CO2 doesn’t warm the climate. The most it tells you is that warming can be caused by the combination of many things; the job of figuring that out is tough. But any explanation that rules out ANY role for CO2 is wrong. Any explanation that assigns CO2 as the one and only cause is wrong.

Seems that we are debating the properties of a quantity (ECS) which it is not even clear exists – despite the thousands of papers, blog posts and comments, and billions of dollars expended. If it were renamed EBJS (Equilibrium Bell Jar Sensitivity, in the Mosher sense), there would not be much of a debate, would there? Not many people except specialists in bell jar chemistry would be particularly concerned.

Seems most have accepted the premise that there is such a thing as ECS and that it is positive. I don’t think that case has been made conclusively.

The current evidence indicates that the ECS (for the actual climate) is very close to zero and/or not distinguishable from the noise and/or natural variability that actually drives changes in the climate (see – not a climate change denier).

Of course, the best argument against the existence of an ECS seems to be an empirical one, i.e., CO2 levels have been much higher in the geologic past and yet temps seem to be non-responsive in a causal way, e.g., no tipping points, no runaway temp rise. Temps always recover despite high(er) CO2 concentrations. In addition, there is also evidence that CO2 responds to temps, not the other way ’round, which makes much more sense physically. Or perhaps any increase (temps and/or CO2) sets in motion emergent phenomena that counteract the increase, perhaps in some fashion similar to Eschenbach’s “climate governor” speculations.

Even assuming such a thing exists, why do the ECS probability models exclude negative numbers along the ECS axis? If using “uninformed, unbiased” priors, it seems reasonable that we might allow for an increase in CO2 (at some level) that would directly or indirectly cause emergent phenomena that work to decrease temps – there is empirical evidence for this in the geologic record.

It seems that the most remarkable fact concerning the climate is its stability over long periods of time. I seem to recall reading that temps have varied much less than 1% over recorded history.

lundasoid@hotmail.com: MJW: Let’s not take a statement about means and central tendency and extrapolate it to extreme quantiles.

That seems like an odd response, considering that nothing in the comment I replied to limited its scope to means and central tendency rather than other pertinent properties of the posterior distribution; particularly when the original posting by Nic Lewis focused on the effect of the choice of priors on the 95% limits.

This argument seems ill-fated for a statistician to engage in. I admit that I do not understand you all. You say that a 95% posterior quantile changes when the prior is uniform over [0,10] versus uniform over [0,20]. Well, duh! The two distributions do not even have the same support set! You’re comparing apples and oranges. We can’t even seem to talk about the same paper (we keep going back to Frame 2005).

The thesis of this article is that a model is wrong. Well, double duh. Every model of the atmosphere is wrong. Some models are useful (George Box), but they are all ultimately wrong.
Bayesians often include a uniform prior to get a feel for the problem (i.e., for the likelihood).

mpainter says:
July 31, 2014 at 2:13 pm
Steve Mosher:
The data do not support the hypothesis of Arrhenius very well. The atmosphere is not the same as a laboratory vessel – it is quite different, in fact.

Yes, they do.

Bruce A says it well in his post: Climate Sensitivity is an iffy proposition and the data indicate that it is indistinguishable from zero.

The 15 µm IR band has a nice hole dug out of it from the *averaged* 255K retransmitted light, but it only exists in dry and bright conditions. The scoop goes down to 220K.

So these temperatures are −20°C to −50°C. Has anyone managed to heat their house to an average of 13.8°C when the central heating only supplies its warming fluid at −35°C? Some cry that it increases air temperature by latency. That has to be empirically wrong due to the laws of thermodynamics. Any latency in the air will heat the CO2. It would be reversed!

Actually, being able to measure this band requires super dry and clean air, making it purely a theory Earth-wide.

The absorption of CO2 is so highly non-linear as to be logarithmic. The 15 µm bucket of water has been taken and the horses have drunk from it. Now there is nothing left. You cannot drink more from the hole. So no new heating. It’s maxed out, as proven empirically. You don’t argue with reality, do you?

Then there is the other issue with Earth’s heating by CO2: the absorption of heat by the gas includes the greater cooling effect at night.
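For reference, the "logarithmic" part of this comment matches the standard simplified forcing expression, ΔF ≈ 5.35 ln(C/C₀) W/m² (Myhre et al. 1998): diminishing returns per added ppm, but not a hard saturation, since each doubling adds the same increment. A sketch:

```python
import math

def co2_forcing(c_ppm: float, c0_ppm: float = 280.0) -> float:
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Each doubling adds the same ~3.7 W/m^2: logarithmic, so every added
# ppm matters less than the last, but the increment never reaches zero.
first_doubling = co2_forcing(560.0)                         # 280 -> 560 ppm
second_doubling = co2_forcing(1120.0) - co2_forcing(560.0)  # 560 -> 1120 ppm
print(round(first_doubling, 2), round(second_doubling, 2))  # 3.71 3.71
```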

Reblogged this on Centinel2012 and commented:
Well, we know the numbers they use – 1.5°C to 4.5°C with a mean of 3.0°C, which they got from the 1979 Charney Report – are way off. From what I can determine, it’s under 1.0°C.