Fraser Institute fires off a damp squib

New addition: Download an annotated pdf of the Fraser report. An interactive pdf file, to be read on the screen, is here, and a printable version is here. Suggestions for further commenting are welcome. Additions to the pdf have to be short, and tied to particular pieces of text or figures. And of course we will only incorporate comments that we deem to be scientifically sound and cogent.

*****************

While most of the world’s climate scientists were following the IPCC fest last week, a few contrarians left out in the cold were trying to organize their own party.

An unofficial, “Independent Summary for Policymakers” (ISPM) of the IPCC Fourth Assessment report has been delivered by the Fraser Institute. It’s a long, imposing-looking document, resembling, come to think of it, the formatting of the real Summary for Policymakers (SPM) document that was released on Friday after final negotiations of the IPCC in Paris last week. The Fraser Institute has assembled an awesome team of 10 authors, including such RC favorites as tilter-against-windmills-and-hockey-sticks Ross McKitrick, and other luminaries such as William Kininmonth, MSc, M.Admin — whose most recent paper is “Don’t be Gored into Going Along” in the Oct-Nov issue of Power Engineer. To be fair, he did publish a paper on weather forecasting, back in 1973. According to the press release, the London kickoff event will be graced by the presence of “noted environmentalist” David Bellamy. It’s true he’s “noted,” but what he’s noted for is his blatant fabrication of numbers purporting to show that the world’s glaciers are advancing rather than retreating, as reported here.

Why go to all the trouble of producing an “independent” summary? The authors enlighten us with this wisdom regarding the official Summary for Policymakers: “A further problem is that the Summary for Policy Makers attached to the IPCC Report is produced, not by the scientific writers and reviewers, but by a process of negotiation among unnamed bureaucratic delegates from sponsoring governments.” This statement (charitably) shows that the Fraser Institute authors are profoundly ignorant of the IPCC process. In fact, the actual authors of the official SPM are virtually all scientists, and are publicly acknowledged. Moreover, the lead authors of the individual chapters are represented in the writing process leading to the SPM, and their job is to defend the basic science in their chapters. As lead author Gerald Meehl remarked to one of us on his way to Paris: “Scientists have to be ok, they have the last check. If they think the science is not represented, then they can send it back to the breakout groups.”

A common accusation at the time of the Third Assessment Report was that the SPM didn’t reflect the science in the rest of the report. A special National Academy panel was convened at the request of President GW Bush, to consider this and other issues. The Panel found no significant disconnect between the SPM and the body of the report. The procedure followed this time is not in essence any different from that which has been used for previous IPCC reports.

One of the strangest sections of the Fraser Institute report is the one in which the authors attempt to throw dirt on the general concept of radiative forcing. Radiative forcing is nothing more than an application of the principle of conservation of energy, looking at the way a greenhouse gas alters the energy balance of a planet. The use of energy conservation arguments of this type has been standard practice in physics at least since the time of Fourier. We have heard certain vice presidents dismiss “Energy Conservation” as merely a matter of personal virtue, but we have never before heard people who purport to be scientists write off the whole utility of “Conservation of Energy.” From what is written in the Fraser report, it is not even clear that the authors understand the first thing about how radiative transfer calculations are done. They criticize the radiative forcing concept because it “fails to take into account the lifetime of greenhouse gases” — as if we really needed to know anything more about CO2 in this regard than that it stays around for centuries to millennia. They say that radiative forcing “is computed by assuming a linear relationship between certain climatic forcing agents and particular averages of temperature data.” Nonsense. It is computed using detailed calculations of absorption and emission of infrared radiation, based on laboratory measurements carried out with exquisite accuracy, and meticulously checked against real atmospheric observations.

Hockey-stick bashing and solar-explains-all advocacy are favorite activities of the denialist camp, so it is no surprise to see both themes amply represented in the Fraser Institute report. In neither case does the Fraser report break new ground in bad behavior. It’s just more of the same old same old. On climate of the past millennium, the Fraser report misrepresents the recent National Research Council report, which concluded quite the opposite of what the Fraser report claims it concluded: The National Research Council, like the official SPM, affirms that recent warming really does appear anomalous in light of the past millennium. The Fraser report obscures this point by cleansing the recent period of warming from their graphs. The discussion of solar variability consists of a lot of vague talk about unexplored possibilities, while skirting the basic problem with solar variability as an explanation of recent warming: There is no observed trend in solar activity of a type that could explain recent warming, and if the problem were an unobserved trend in solar ultraviolet, it would make the stratosphere (where UV is absorbed by ozone) trend warmer relative to a constant-solar baseline. In reality, the stratosphere is cooling strongly, and at about the rate the models predict.

The basic approach taken by the Fraser Institute Report is to fling a lot of mud at the models and hope that at least some of it sticks. Of course, if one looks at enough details one is bound to find some areas where there is a mismatch between models and reality. Modellers do this all the time, as a way of improving the representation of physical processes. However, to highlight a few shortcomings without asking what their implications might be for climate sensitivity, or whether the mismatch might be due to data problems rather than model problems (as in the case of tropical lapse rate), gives a distorted picture of the state of the art. An examination of the model shortcomings in the light of the vast range of important things they get right leaves the fundamental premise of the cause of warming unchallenged, and to see why, one needs to turn to a balanced assessment of the science such as represented in the full IPCC report.

The Fraser Institute authors also raise the curious objection that models have not been “formally proven” to be suitable for predicting the future. We are not sure what it would mean to “formally prove” such a thing (Kurt Gödel, are you listening?), but the specific objection raised in the Fraser report makes no sense: the authors suggest that the number of tunable parameters in models is so great that it may exceed the degrees of freedom in the data being “fit.” In reality, there are at most a dozen or two parameters that modellers touch, most of these are constrained to certain limits by data, and there are physical limitations to what one can do to the output by changing such parameters. In contrast, adding up time series of temperature and precipitation and pressure as a function of latitude and longitude, seasonal cycles, surface radiation balance, ocean heat storage, ENSO events, past climates, and vertical structure, there are literally thousands of observational constraints involved in the evaluation of model behavior.

There are so many bizarre statements in the Fraser Institute report that some of us think that spotting them could serve as a good final exam in an elementary course on climate change. Take your pick. The report states that “The IPCC gives limited consideration to aerosols …” whereas aerosols have been a key part of the scenarios since the Second Assessment Report, were the key to explaining the interrupted mid-century warming, and cannot in any way be mangled so as to spuriously give the warming of the past decades. The ISPM regales us with tales of natural global warming in the distant past, without pointing out that these happened over millions of years, had often massive consequences nonetheless, and were linked to processes like continental drift which are unlikely to be part of the explanation of the recent warming. The Fraser report describes the climate changes of the past century as “minor” (a value-laden and subjective term if ever there was one), failing to realize that climate change so far has been the fire alarm, not the fire. The climate of 2100 is not forecast to be mild.

We could go on, but why bother? We’ll leave off with a quote. “most places have observed slight increases in rain and/or snow cover”

Actually, consulting the draft of Chapter 4, snow cover kinda looks like it’s been decreasing, not increasing. But take a look at the artful use of “and/or”. The sentence is not “formally” wrong. Superb! When you hear “ISPM,” just think “Incorrect Summary for Policymakers.”

Note: In the interests of timeliness, this commentary has been based on a January 8 draft of the “ISPM” which was leaked to us. If the final released version differs substantively from what we have seen so far, the changes (for better or worse) will be discussed in the comments.

I have no idea how credible he is on climate, but he is/was a well-known science communicator. He went to jail in Tasmania while protesting the proposed Franklin dam; we still have the Tassie wilderness, and for his part in that outcome I thank him.

[Response: I like wilderness as much as the next guy. Whatever Bellamy did in the past, the confusion he spread about glaciers and global warming (and given the history it is difficult to see this as anything other than wilful) is hardly a good example of communicating science to the public. It’s especially sad to see people who have done some good in the world give themselves over to leading others up a garden path. I won’t presume to understand Bellamy’s motivation or psychology, but you could hardly say he makes a good poster-child for accurate scientific communication and integrity when it comes to climate change. If the glacier incident were an aberration, it’s hard to see why he’d lend credibility to the Fraser crowd by appearing at their little do in London. But I don’t know, perhaps he backed out? Does anybody know if he actually put in an appearance? After all Prime Minister Harper backed out of a Fraser Institute event when he found they were using his name to raise money by selling high-priced tickets to the event. –raypierre]

http://www.news.com.au/couriermail/story/02373921169204-2719700.html
More scepticism from Dr. Marohasy. She prefers denial and hoping for rain as the antidote to the IPCC, which she thinks must be subverting its real findings for later. You know, after the bill of goods is sold to the public. The plot in this really is diabolical, as I portrayed it in Warm Front. The Kookaburra Consortium lives.

…”The basic approach taken by the Fraser Institute Report is to fling a lot of mud at the models and hope that at least some of it sticks. Of course, if one looks at enough details one is bound to find some areas where there is a mismatch between models and reality. “….

Somehow they do manage to make themselves come across as ignorant of the concept of models and modeling.

…”McKone sees models as descriptors of the physical and chemical processes that govern the behavior of chemicals in the environment. “You can build relationships between factors that you can’t otherwise do without a model,” he says, including chemical properties, transport within and among different media, and abundance in the environment. “…”But just understanding how the pieces fit together doesn’t guarantee correct results, McKone says. “You can still get results that don’t correlate to the real thing. So models are both potentially powerful, and potentially dangerous.” While a model can hint at what interventions have the best chance of reducing pollutant concentrations and exposures, “A lot of people think models provide predictions,” McKone says, “but they don’t do this. Models are not very useful if you don’t have something with which to anchor them. You need observations to confirm the model and move it closer to a representation of reality.” This is a particular problem for policymakers, who “don’t like to make choices involving uncertainty,” McKone says. “A danger is that they may just use model results to tell them what to do.” “Adding more detail into a model doesn’t necessarily get you a better result if you don’t understand the basic science,” says McKone. “Model development has to be paced with the science.”…”Says McKone, “The reliability of the calculation depends on the reliability of the least well known element. If you don’t know how uncertain this weak link is, then you are making the model results look more accurate than they really are.””…”He stresses that “an important quality in a good model is called parsimony. This is defined as making the model as complicated as needed to solve a problem, but not more so. You don’t want to add details that make the model overly complicated.””…

#26 Paul : “Life presents us with an enormous number of problems. The question is, how do we prioritize these problems, and budget resources? The line you are taking here is right out of the Bjorn Lomborg playbook, and it is exactly the kind of ‘divide and conquer’ rhetorical tactic you would expect to see pushed from certain socio-political quarters.”

Paul, you referred to my previous comment (#22), but I don’t understand your point. My question (and Pascal’s in #7) deals with a precise point, the way we estimate forcings in the past five decades (precisely local + global and direct + indirect effects of aerosols) and how we can attribute with a reasonable “likelihood” a mean global warming of 0.65 K (mainly pronounced over NH lands and the past 20 yrs). Political questions are of no interest to me. I’m skeptical about some points of the SPM but anyway, I rather advocate a post-fossil energy mix for many reasons and, as a Frenchman, I’m lucky enough to already live in a much less oil-addicted society than many others.

Yes, it is sad; I think there are mixed motives. Certainly Bellamy is/was aghast at the prospect of half of Britain’s uplands being covered in wind turbines, and it was this prospect that (I think) made AGW-skepticism attractive.

More generally, I often find that the more ‘skeptical’ scientists in any related field (geology especially) seem to be the over-50s. Witness the AAPG controversy…

I wondered if you’d be so kind as to answer a quick query for me please?
In the UK we have just had data released saying that this January was the 2nd warmest January on record (The 1st warmest being January 1916). A letter denying the anthropogenic influence on climate change and trying to assert that climate scientists are somehow biased, published in my local paper yesterday, really annoyed me in light of the strong evidence of the new IPCC report.
This letter refused to believe in anthropogenic climate change by asking “how come January 1916 was even warmer then”?
I know there are other climate forcings and natural variations also at work, and that most models cannot explain the recent warming without including anthropogenic influence, but could not explain this eloquently enough to write a reply. Does anyone know, in the words of the letter “What then is the explanation for January 1916 which was warmer?”
Much thanks,
Peter

I know that RealClimate has covered the Younger Dryas and the Laurentide Ice sheet meltwater in past posts but there appears to be new research published in Paleoceanography, 21. Is that worth a new post?

Re 58, any one-off year can be warm; there is natural variation all of the time, but it’s the trend, the average temperature over decades, that matters. Average nighttime and daytime temperatures are increasing over time, and although you will find some exceptions, the trend is warmer overall, hence the term global warming, or AGW.

#60: That’s an old one: 1998, a letter from Dr Fred Seitz. Recycling, one might say. Following information is from exxonsecrets.org:

Dr. Seitz is a former President of the National Academy of Sciences, but the Academy disassociated itself from Seitz in 1998 when Seitz headed up a report designed to look like an NAS journal article saying that carbon dioxide poses no threat to climate. The report, which was supposedly signed by 15,000 scientists, advocated the abandonment of the Kyoto Protocol. The NAS went to unusual lengths to publicly distance itself from Seitz’ article. Seitz signed the 1995 Leipzig Declaration.

On 21 April 1998, Fred Singer (SEPP) released a statement that “More than 15,000 scientists, [8/4/98: now about 17,000] two-thirds with advanced academic degrees, have now signed a Petition against the climate accord concluded in Kyoto (Japan) in December 1997.” The petition said that “There is no convincing scientific evidence that human release of carbon dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate. Moreover, there is substantial scientific evidence that increases in atmospheric carbon dioxide produce many beneficial effects upon the natural plant and animal environments of the Earth.” Journalists and others investigating the petition, known as the Leipzig Declaration, had difficulty verifying the number and authenticity of the signatures.
Source: SEPP website

“a 15000+ scientist petition against global warming” That’s the Oregon Petition; it’s old news to most of the regular readers of RealClimate. This petition is a sham. I’ve known about this for a while, but recently someone pointed this out; it’s funny:

“environmental activists successfully added the names of several fictional characters and celebrities to the list, including John Grisham, Michael J. Fox, Drs. Frank Burns, B. J. Honeycutt, and Benjamin Pierce (from the TV show M*A*S*H), an individual by the name of “Dr. Red Wine,” and Geraldine Halliwell, formerly known as pop singer Ginger Spice of the Spice Girls. Halliwell’s field of scientific specialization was listed as “biology.”” Source: http://www.sourcewatch.org/index.php?title=Oregon_Institute_of_Science_and_Medicine

Re #58: Peter: I think that the key point is in the word “global” in global warming. Go to the GISS website (http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt) to look at global average temperatures, and you will see that the global trend is much clearer than the trend for any one location (such as the UK).

(comment #61 also addresses the issue of recognizing that there is temporal variability which should be corrected for: averaging over a year is better for discerning trends than averaging over a month, or a single day. And taking the average of several years gets you clearer trends yet)

(I might guess that the anomalous UK 1916 January was due to a temporary shifting of the jet stream, for example, which would lead to a shifting of heat from one place to another rather than being any indication of a global temperature trend)
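The averaging point in these comments can be illustrated with a toy calculation on synthetic data (the trend and noise levels below are made-up, illustrative numbers, not real temperatures): the same underlying warming rate is recovered far more reliably from a low-noise “global mean” series than from a noisy “single station, single month” series.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(50)          # 50 years of synthetic data
true_trend = 0.02              # assumed warming rate, degrees per year

def estimated_trend(noise_sd):
    """Fit a line to (trend + Gaussian noise); return the slope estimate."""
    series = true_trend * years + rng.normal(0.0, noise_sd, years.size)
    slope, _intercept = np.polyfit(years, series, 1)
    return slope

# One station's January anomalies are noisy (sd ~ 2 C, say), while a
# global annual mean averages much of that noise away (sd ~ 0.1 C).
local_fits = [estimated_trend(2.0) for _ in range(200)]
global_fits = [estimated_trend(0.1) for _ in range(200)]

print("spread of 'single station' trend estimates:", np.std(local_fits))
print("spread of 'global mean' trend estimates:  ", np.std(global_fits))
```

The spread of the “local” trend estimates is roughly twenty times that of the “global” ones, which is why a single warm January in 1916 neither supports nor undermines a multi-decade global trend.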

[Response: The local vs. global comment explains part of the story, but the letter-writer’s question about 1916 is a fair one in this sense: if we are emphasizing global patterns, why do we make a big thing of new regional extremes (“North America’s hottest year” “Central England’s Second Hottest”). Is it just misconceived to take regional extremes as an evidence that climate is changing? A few years ago the answer might have been “yes” but the magnitude and frequency of regional extremes is getting to the point where we can begin to say that the global warming signal is showing up even at the regional level. The thing to keep in mind is that the smaller region you deal with, the more chance there is to confuse a general trend with what you might call “climate noise,” due to just moving a small bit of hot air an unusually long distance and letting it stagnate over one place. The point is that it takes a really unusual circulation event to hit the CET temperatures of 1916, or the Dust Bowl temperatures in Chicago. As the world generally gets warmer, it gets easier and easier for blips in the circulation to make the climate hit the extreme conditions. In Chicago our summers are already starting to look like the dustbowl years, but looking at the graph (see the Union of Concerned Scientists’ Great Lakes Report), our local temperature trend is flat to up at a point where the dust bowl years were recovering toward normal. Already we have more over 97F days than in the dust bowl summers. The simple answer, then, is that there are always more fluctuations at the regional level than at the global level, but the global warming signal has gotten so strong that the climate change is starting to become clearly evident even at the regional level. Very extreme and rare circulation events in the past can nonetheless cause similar temperatures. The main point is that, as time goes on, what used to be considered a rare event in the past will become quite common. –raypierre]

Re 58: the UK is but a very small region of the planet, and the CET but a small region of the UK. In the same way that one warm January in central England can’t be used (on its own) as evidence for global warming, nor can an earlier one be used (on its own) as evidence against.

The heat gets moved around and local temperature can be heavily influenced by the synoptics. What drives the synoptics is the question, and Raypierre’s upcoming article on the jet stream will help add one clue I think.

In reality, there are at most a dozen or two parameters that modellers touch, most of these are constrained to certain limits by data, and there are physical limitations to what one can do to the output by changing such parameters.

In other words, the model runs where the observed global mean temperature variability is satisfactorily explained when all natural causes and all human emissions (CO2 and sulphates) are taken into account – these successful runs were achieved by tuning a dozen or two parameters. How do we know that accurate projections for year 2100 would not require a different set of parameters that would lead to perhaps dramatically different results? In statistics, this would be similar to fitting in-sample and using results out-of-sample – a technique that doesn’t enjoy a great reputation.

[Response: Remember, there’s physics involved here. This is not something like an ARIMA or Box-Jenkins statistical model. One does have a certain ability to tell, from the comparison to the past data, whether the model is getting the mechanism right. In large measure, the question of effect on the future of different parameter choices is dealt with by the ensemble runs (e.g. ClimatePrediction.net) or by the wide variation in parameterization schemes amongst the ensemble of different models. There is nothing in the physics at present that suggests that a model which has an awful performance for the 20th century would get the year 2100 better. If you have some reason for thinking that, write it up and submit it somewhere. Otherwise, your suggestion is little better than “space aliens moved the heat around.” –raypierre]

Remember that the UK is, in terms of weather, a constant battleground between Maritime warm/wet/southwesterly weather and Continental cold/dry/easterly weather during the winter, the difference being quite drastic; so if the former persisted through January 1916 it could have been just as warm. It’s more the thing that (anecdotally) it just seems to happen every year now..

#32 – no wonder us lay sceptics are sceptical. You picked the editorial board of the institute, not the authors. The authors are (apologies just cut & pasted from google): Chief WSI/INTELLICAST Meteorologist; Consulting Meteorologist, Unionville, Canada. (Retd) Research Scientist, Environment Canada; senior administrator at the Bureau of Meteorology; professor of applied mathematics; Professor of Physical Geography and Quaternary Geology, Stockholm University; no idea, seems to work at Tartu Observatory Estonia (lucky guy) his name is all over various climate studies; an expert in meteorology and physical oceanography, who is an adjunct professor in the Department of Earth Sciences; Distinguished Professor, Meteorology & Oceanography; Professor, isotope hydrogeology and paleoclimatology. That’s all of them. Maybe the applied mathematics bloke doesn’t pass your criterion of having relevant climate knowledge. The report is littered with “not enough evidence” and “not fully understood” and “level of understanding low to very low” and so forth. We don’t understand why they have got it wrong and you have got it right; the qualifications look pretty good for all of you. They lack your certainty which surely is an attractive trait in a scientist.

“Does anyone know, in the words of the letter “What then is the explanation for January 1916 which was warmer?”
Much thanks, Peter

If this were simply the question of the variations of a single month, he might have a point. But it’s not….

The whole of Northern Europe was exceptionally warm, from the British Isles to Moscow, until the change in wind patterns restored more “normal” temperatures.
Similarly for the North Eastern United States until quite recently.

Nor is it a question of a single month. It follows a series of exceptionally warm months/seasons and years:-

Summer ’06 and Autumn ’06 were record breaking seasons in the CET data.
The July average of 19.7C was the warmest since 1659, so was September at 16.8C.
2006 was the warmest year on record for min HadCET and mean HadCET.
In terms of annual averages, try downloading the CET data from the Met office web site and then counting how many years the annual average is over 10C.
It’s quite a clear marker.
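A minimal sketch of that counting exercise (the file layout is assumed here: year in the first column, annual mean in the last; check the actual Met Office CET file format before relying on this):

```python
# Count years whose CET annual mean exceeds a threshold.
# Assumes whitespace-separated rows: year first, annual mean last.

def warm_years(lines, threshold=10.0):
    """Return the years whose annual mean temperature exceeds threshold."""
    years = []
    for line in lines:
        fields = line.split()
        if len(fields) < 2 or not fields[0].isdigit():
            continue  # skip header or malformed rows
        year, annual_mean = int(fields[0]), float(fields[-1])
        if annual_mean > threshold:
            years.append(year)
    return years

# Tiny made-up sample (not real CET values), just to show the mechanics:
sample = ["year  annual", "1915  9.4", "1916  9.8", "2006 10.8"]
print(warm_years(sample))  # [2006]
```

Run against the real downloaded data, a count like this makes the clustering of warm years toward the end of the record easy to see.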

[Response: Questions like “what was the cause of x event” can sometimes be answered, though it’s often hard to pin down a single specific cause in a system like the atmosphere where everything globally affects everything else. Some very nice work along these lines has been done for N. American drought years, tracing the circulations back to sea surface temperature patterns in the tropical Pacific. The 1916 European event would be an interesting one to study in a similar vein. I myself am not aware of a study of that sort, but my knowlege of the literature on this subject is far from comprehensive. If any of the readers know of a dynamical treatment of the 1916 pattern I’d be very interested to hear about it. –raypierre]


Ray, one thought — by 2100, the biologists expect plankton populations to crash and new species to dominate in the Antarctic waters. So a model that has some constants assumed in the physics may need some (admittedly hugely scientifically unknown) change in rates of biological cycling of carbon. This is just speculation on my part, and I may not understand how this is handled now in the models, so please correct me.

I would guess the biologists will tell you that the odds are extremely likely this will cause an excursion, but I don’t have a clue which direction or how long.

Probably only a decade or two, given the rate at which planktonic organisms reproduce, before it settles down, and I’d _guess_ the total throughput of carbon will be about the same if we’re lucky (if dying non-shelled organisms sink to the sediment the way calcite- and aragonite-shelled organisms sink now, taking the carbon out of circulation).

If on the other hand we get a huge bloom of salps, consuming whatever takes over primary productivity at the bottom of the food chain, and keeping that carbon in circulation in the upper ocean, I imagine all bets are going to be a bit skewed.
And that has already been observed:

Perhaps you can clarify for me. My understanding is that the Fraser Institute has prepared a summary of the IPCC 4AR itself. They are not introducing any new or different information. Just summarising what is already there.

Surely if this is so, can someone please explain in detail how the Fraser Institute summary seems to be so very different from the SPM. Aren’t they summarising the same information?

I would have thought that a professional approach would require that the two summaries be compared on a point by point basis by reference to the 4AR to establish which is correct. I have to say that I find your piece on the issue unconvincing.

your suggestion is little better than “space aliens moved the heat around.”

I wonder if it would be considered as ad hom if I said something like that …

One does have a certain ability to tell, from the comparison to the past data, whether the model is getting the mechanism right.

I thought we were discussing science here. Now, if I’m getting the message right, we are down to abilities that some people have. I’m afraid this is shaky ground, Ray. Some very able people are known to have been wrong – we discussed Lindzen only recently, for example. I hope you will forgive me if I refuse to take anybody’s word for granted. BTW, I don’t doubt that the models are getting the mechanisms right. I don’t even doubt that they are getting the sign correctly: it will be warming, not cooling. But I do doubt the magnitude. And I don’t necessarily mean a particular sign of the (likely systematic) error.

Re: ensemble runs. If you had to test a 12-dimensional parameter space then modestly trying 3 values for each would require 3^12=531,441 runs. I wonder whether we accomplished 1% of that? A propos if in reality it is two dozen parameters, then 3^24 ~ 282 billion.
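The grid-search arithmetic in that objection is easy to verify:

```python
# Number of runs in a full grid search over n_params parameters,
# each sampled at values_per_param trial values.
def grid_points(values_per_param, n_params):
    return values_per_param ** n_params

print(grid_points(3, 12))  # 531441, about half a million runs
print(grid_points(3, 24))  # 282429536481, about 282 billion runs
```

The numbers check out, but the explosion is exactly why nobody explores model parameters by exhaustive grid search; as the response above the comment notes, most parameters are constrained to narrow ranges by data, and sampling schemes like the perturbed-physics ensembles probe the space without enumerating it.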

I’m curious whether GCMs still use bi-harmonic parameterization of lateral viscosity. If yes, is everybody happy about it?

Contributing roughly half of the biosphere’s net primary production (NPP), photosynthesis by oceanic phytoplankton is a vital link in the cycling of carbon between living and inorganic stocks. Each day, more than a hundred million tons of carbon in the form of CO2 are fixed into organic material by these ubiquitous, microscopic plants of the upper ocean, and each day a similar amount of organic carbon is transferred into marine ecosystems by sinking and grazing.
…
Here we describe global ocean NPP changes detected from space over the past decade. The period is dominated by an initial increase in NPP of 1,930 teragrams of carbon a year (Tg C yr(-1)), followed by a prolonged decrease averaging 190 Tg C yr(-1). These trends are driven by changes occurring in the expansive stratified low-latitude oceans and are tightly coupled to coincident climate variability.

This link between the physical environment and ocean biology functions through changes in upper-ocean temperature and stratification, which influence the availability of nutrients for phytoplankton growth.

The observed reductions in ocean productivity during the recent post-1999 warming period provide insight on how future climate change can alter marine food webs.

These comments only reinforce my conclusion that the climate science debate is being dominated by left wing political advocates, who are using it to fight a proxy battle against their dreaded arc-enemy, big Oil.
Not that any of you would ever admit it.
Sad.

re: 77. Failure to understand the science (which is unequivocal), the scientific method, or the overwhelming consensus around the world among climate scientists, major climate science professional organizations, and even corporations such as “arc-enemy” BP Oil about global climate change is no excuse whatsoever to blame “political advocates”. Especially when it is overwhelmingly the “right-wing” political advocates who are desperately out to attack the science. The science is there for you to read and learn from, rather than have someone else tell you what to think or say.

Speaking of inferring things that don’t exist: it seems like you have nothing to report on your search of coal-power industry subsidies in the federal budget? And what was again your point about normal distributions?

Show that accurate predictions for 2100 would require a whole new set of parameters.

I do not have to show that. The burden of proof is not on me. If you had any background in science you’d understand it.

[Response: For a prediction to be a sound basis for policy, one does not require that it be inconceivable that the prediction be wrong. If that were the case, we would never be able to make use of foresight at all. The predictions of global warming show very likely future consequences, and are far more certain than most predictions upon which governments have routinely based decisions with massive consequences. Your argument hasn’t even gotten to the point of saying why, from a basis of physical plausibility, it is even conceivable that the forecasts are wrong in a major way, and so as far as I’m concerned you’re not to be taken seriously. –raypierre]

Re #82: Err, thanks for your comment Dan. However, perhaps you are aware that the body of the 4th Assessment Report has not been released, and won’t be until May, so it is NOT POSSIBLE for me to read that information directly and form my own view.

I therefore have no choice but to read the SPM and the Fraser Institute Independent SPM (I have read both) which both claim to be summarising for me the SAME body of science. I can’t help but notice that there are very different emphases in the two documents, and I am asking what I hope is a reasonable question for some guidance and explanation as to how two summaries summarising the SAME material can be so different.

[Response: Well, one summary is prepared by the authors of the main text (who presumably know what they concluded), and another summary is prepared by an agenda-driven organisation who are cherry-picking sections and statements to support their pre-determined conclusions. You guess which is which. – gavin]

To all appearances the Fraser Institute event entirely died on the vine in terms of media coverage (other than a few anticipatory mentions). Google turns up nothing, and a survey of the usual British outlets turns up nothing. Even the Torygraph (publishers of Monckton) seems to be on board with the IPCC, although their related editorial on Sunday was heavily tilted toward carbon capture. I was still a bit surprised by the scope of the blackout, probably because it was so very recently that such an outcome could not be hoped for. The FI luncheon must have been a sad affair indeed. Unfortunately the entire event was closed to the public, so there may not be any witnesses who are willing to say anything in public. Even though the upshot seems clear in terms of coverage, morbid curiosity compels me to ask if anyone has details on the event.

Re the American Enterprise Institute money offered to scientists to critique the IPCC, I just read their claim that it’s to address policy only, and not the science.

I also saw on TV an AEI spokesman speak against the huge spending on the Iraqi war (or was it the CEI).

Maybe these guys really are legitimately interested in economic issues. If so, then they should be interested in all the money-saving measures to fight global warming, and against all the subsidies & tax breaks for fossil fuels (but that might be pushing it too far, since they are funded by fossil fuels).

RE #26 & the false dilemma about which serious world problems to address: Here’s the solution: Take all that money we save from reducing our GHGs and send it to the poor of the world who are suffering from malaria, AIDS, and other problems. I’ll do it if others do it (I wonder how much Bjorn Lomborg & his ilk are contributing to these pressing problems; maybe they should reduce their GHGs, so they’ll have some money to send). However, I’d have to convince my husband; he sort of likes seeing the savings accumulate in the bank, esp now that we’re nearing retirement age.

Re “One of the strangest sections of the Fraser Institute report is the one in which the authors attempt to throw dirt on the general concept of radiative forcing. Radiative forcing is nothing more than an application of the principle of conservation of energy, looking at the way a greenhouse gas alters the energy balance of a planet.”

Is this “conservation of energy” you refer to the first law of thermodynamics, or something else? If it is the first law, then it really is wacky that they would dismiss or misunderstand such basics.

Your argument hasn’t even gotten to the point of saying why, from a basis of physical plausibility, it is even conceivable that the forecasts are wrong in a major way, and so as far as I’m concerned you’re not to be taken seriously.

From a basis of physical plausibility, for starters, the models are solving equations that cannot be derived from first principles. Bi-harmonic diffusion is a numerical trick invented by the modelers to ensure that the energy cascade is more plausible. The application involves additional boundary conditions that cannot be justified.

Back to the original point that I was making: the models are trained on historical data to reproduce that same data. I didn’t say that the set of parameters must change, but it might. For instance, how do you know that the same quality of fit cannot be achieved by a completely different parameter set? If it can, why should it necessarily produce the same forecast?
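The worry can be illustrated with a toy curve-fitting example (purely illustrative; this is not a claim about how any actual GCM is tuned): two parameter sets that fit a training window almost equally well can diverge sharply when extrapolated.

```python
import numpy as np

# Toy model y = a*t + b*t**2, fit over a "historical" window t in [0, 1].
# The truth here is the linear trend y = t.
t = np.linspace(0.0, 1.0, 50)
data = t

fit1 = 1.0 * t + 0.0 * t**2   # parameter set (a=1.0, b=0.0)
fit2 = 0.9 * t + 0.1 * t**2   # parameter set (a=0.9, b=0.1)

# Both fits are close to the data over the training window...
print(np.max(np.abs(fit1 - data)))   # 0.0
print(np.max(np.abs(fit2 - data)))   # under 0.025

# ...but extrapolated to t = 5 they disagree by 40%:
t_future = 5.0
y1 = 1.0 * t_future + 0.0 * t_future**2   # 5.0
y2 = 0.9 * t_future + 0.1 * t_future**2   # 7.0
print(y1, y2)
```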

[[These comments only reinforce my conclusion that the climate science debate is being dominated by left wing political advocates, who are using it to fight a proxy battle against their dreaded arc-enemy, big Oil.
Not that any of you would ever admit it.]]

Well, no. Admitting something that isn’t true is called ‘lying,’ and is considered a negative action.

I agree about the arc-enemies, though. I hide whenever a rainbow appears in the sky.

[[Speaking of inferring things that don’t exist: it seems like you have nothing to report on your search of coal-power industry subsidies in the federal budget?]]

You didn’t read the source I directed you to, did you?

[[ And what was again your point about normal distributions?]]

That estimates for climate sensitivity to a doubling of CO2 are probably normally distributed. If you google “normal distribution” you should pick up several little articles on it.
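For a concrete (and purely hypothetical) illustration of what a normal distribution of sensitivity estimates would imply: take a mean of 3.0 °C and a standard deviation of 1.0 °C, and about 68% of estimates would fall within one standard deviation of the mean.

```python
import math

# CDF of a normal distribution, computed via the error function.
def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

mu, sigma = 3.0, 1.0   # hypothetical mean/sd for sensitivity (deg C)

# Fraction of the distribution lying between mu - sigma and mu + sigma:
within_one_sigma = normal_cdf(mu + sigma, mu, sigma) - normal_cdf(mu - sigma, mu, sigma)
print(round(within_one_sigma, 4))   # 0.6827
```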

[[Show that accurate predictions for 2100 would require a whole new set of parameters.
I do not have to show that. The burden of proof is not on me. If you had any background in science you’d understand it.]]

Gosh, you don’t know how hard it is for folks like me with no background in science. Ignoring, that is, my degree in physics…

I need your help. I have been debating on another site about the value of climate models in helping us understand what is happening with the climate. With RealClimate’s and Tamino’s support (thanks, guys) I have been doing pretty well. But some smarty has asked me this question, which is beyond my lay understanding (I suspect BS).

“Question: What is the mechanism that causes the transfer of energy from GHGs (which capture IR energy in certain bands) to non-GHGs (which are transparent to IR photons) in the atmosphere?
Sunlight striking the earth causes it to re-radiate energy mostly in the IR band. This energy is absorbed by GHGs’ molecular bonds, causing vibrations in the bonds. Temperature is solely caused by translational energy, not internal energy of molecules. The absorbed IR in the GHG is then re-radiated at the same frequency and either escapes earth or is re-absorbed by another GHG molecule. How does it become translational energy that can be measured by a thermometer?”

You ask where I got my understanding that the Fraser Institute has prepared a summary of the IPCC 4AR itself?

The first paragraph of the FI ISPM states: “The Independent Summary for Policymakers is a detailed and thorough overview of the state of climate change science as laid out in the UN Intergovernmental Panel on Climate Change (IPCC) draft report. This independent summary has been reviewed by more than 50 scientists around the world and their views on its balance and reliability are tabulated for readers. It carefully connects summary paragraphs to the chapters and sections of the IPCC report from which they are drawn, allowing readers to refer directly to what is in the IPCC Report.”

It’s true that you can train any model to reproduce any historical data trend, just as you can come up with a ‘curve-fitting’ equation for a set of points – and that curve may have no utility in predicting what future data will look like.

However, it is extremely disingenuous to claim that climate model specialists aren’t perfectly well aware of this fact. If you actually look at the recent IPCC report, you find that it is indeed addressed:

Since IPCC’s first report in 1990, assessed projections have suggested global averaged temperature increases between about 0.15 and 0.3°C per decade for 1990 to 2005. This can now be compared with observed values of about 0.2°C per decade, strengthening confidence in near-term projections. {1.2, 3.2}

Thus, if you go back and look at the predictions from 1990, you find that they fit the observed trends – and that’s good evidence that the models are indeed performing well. The observed trends of tropospheric warming and stratospheric cooling also match decade-old model predictions.
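The check described above amounts to fitting a linear trend to the observed anomalies and expressing it per decade. A minimal sketch (the anomaly series here is synthetic, for illustration only; real comparisons use observational datasets such as HadCRUT or GISTEMP):

```python
import numpy as np

# Synthetic annual anomalies for 1990-2005: a 0.02 degC/yr trend plus a
# small deterministic wiggle standing in for interannual variability.
years = np.arange(1990, 2006)
anoms = 0.02 * (years - 1990) + 0.05 * np.sin(years)

# Least-squares linear trend, converted to degC per decade.
trend_per_decade = np.polyfit(years, anoms, 1)[0] * 10.0
print(trend_per_decade)   # close to 0.2, inside the projected 0.15-0.3 range
```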

It seems the main concern now is that the models are underestimating future changes, which has certainly proved to be the case for the ice sheet models.

This is why data collection in detail is so important – if oceanic remote sensor systems had been deployed on a widespread basis a decade ago, then the model predictions of changes in ocean circulation could have been compared to a comprehensive dataset, and the 2007 IPCC report might not include statements like:

There is insufficient evidence to determine whether trends exist in the meridional overturning circulation of the global ocean.

Sashka, how come everyone else should do tasks for you just because you don’t get it? Why not present some of your own work here, so that others can evaluate your scientific skills? Show some effort – that’s how science works. Just raising questions for the sake of the questions themselves is annoying.