31 August 2009

The figure above from Reuters shows a proposed set of emissions paths for China from a recent report released by a government think tank in China (previously mentioned here). The Chinese report was heralded by some as marking a significant change of tone from the Chinese, perhaps even making a meaningful international agreement more likely.

In this post I present the assumptions about the rate of decarbonization implicit in the emissions trajectories summarized in the graphic above (I cannot locate the full 900-page report; anyone with a link, please share!). First, here are the annual rates of emissions increase implied by the scenarios above:

Year   BAU    LOW    eLOW
2010   5.7%   2.5%   2.5%
2020   3.4%   1.8%   1.5%
2030   2.5%   1.2%   0.9%
2040   2.1%   0.9%   0.3%
2050   1.5%   0.7%   -0.6%
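The growth rates in a table like this can be derived from emissions levels at the decade endpoints. As a quick sketch in Python, using made-up emissions values purely for illustration (the report's underlying numbers are not reproduced here):

```python
def implied_annual_growth(e_start, e_end, years):
    """Compound annual growth rate implied by emissions at two points in time."""
    return (e_end / e_start) ** (1.0 / years) - 1.0

# Hypothetical emissions levels (arbitrary units) at decade endpoints,
# purely to illustrate the calculation -- not the report's actual numbers.
path = {2010: 8.0, 2020: 11.2, 2030: 14.3}
for y0, y1 in [(2010, 2020), (2020, 2030)]:
    rate = implied_annual_growth(path[y0], path[y1], y1 - y0)
    print(f"{y0}-{y1}: {rate:.1%} per year")
```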

I want to call attention to the figures for business-as-usual, which represent a trajectory of emissions increases assuming that no additional policies are implemented beyond those in place today. Does anyone really believe that China's emissions growth will be 3.4% per year to 2020 (much less 1.5% per year to 2050)?

Here are some facts: China's emissions grew at an average rate of 12.2% per year from 2000 to 2007 (!) (data from EIA and the figure above). If China's economy grows at a rate of 6% per year, which is less than both its recent growth and government targets for growth of 9% to 11% per year, then the assumption is that the Chinese economy will spontaneously decarbonize by 2.6% per year to 2020 and by 4.5% per year to 2050. If China grows at its recent historical average, then the implied decarbonization of the Chinese economy is 5.6% per year to 2020 and 7.5% per year to 2050. For a point of reference, the IPCC assumes a rate of spontaneous decarbonization of about 1.5% per year, a number that we criticized as overly optimistic in our 2008 Nature paper (PDF). And remember, I'm just talking about the BAU scenario, not those requiring actual "emissions cuts."
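These implied decarbonization figures come from comparing assumed GDP growth with the emissions growth rates in the table; the post uses the simple difference (e.g., 6% - 3.4% ≈ 2.6% per year), and the exact compound calculation gives nearly the same answer. A minimal sketch:

```python
def implied_decarbonization(gdp_growth, emissions_growth):
    """Exact annual rate at which carbon intensity (emissions per unit GDP)
    must fall for emissions to grow slower than GDP."""
    return 1.0 - (1.0 + emissions_growth) / (1.0 + gdp_growth)

# BAU emissions growth of 3.4%/yr (to 2020) and 1.5%/yr (to 2050),
# paired with the 6% and 9% GDP growth rates discussed in the post.
for gdp in (0.06, 0.09):
    for em in (0.034, 0.015):
        rate = implied_decarbonization(gdp, em)
        print(f"GDP {gdp:.0%}, emissions {em:.1%}: "
              f"carbon intensity must fall {rate:.1%}/yr")
```

The exact values (e.g., about 2.5%/yr for 6% GDP growth and 3.4% emissions growth) are within a tenth of a point of the simple differences quoted in the post.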

The assumptions of spontaneous decarbonization in the Chinese emissions paths are yet another example of "magical solutions" on climate policy. With China's emissions growing at 12.2% per year during the present decade, it is inconceivable that this rate will somehow drop to 3.4% per year to 2020, much less the 1.8% or 0.9% per year implied by the low growth scenarios.

The significance of the Chinese proposal is that it indicates that China is willing to join Europe, the United States and others in a fantasyland of climate policy detached from policy reality. It is hard to believe how that outcome leads some to greater optimism on climate policy.

29 August 2009

A BBC News program with the above title will be available from this site for a few more days. Here is the précis:

The BBC's environment correspondent Richard Black investigates if climate change is diverting attention away from other environmental problems such as air pollution, acid oceans and species extinction.

Talk about climate change is everywhere, from the classroom to the UN. It is undoubtedly an important issue, but has our enthusiasm for tackling climate change led us to neglect other pressing and arguably more immediate environmental concerns, such as poor air quality in our major cities? Why has climate change attracted so much political attention and the loss of plant and animal species so little?

Far from being an 'inconvenient truth', could the climate change debate actually be rather politically convenient?

Michael Tobis, a climate scientist who can always be relied on to say what he really thinks, provides another remarkable glimpse inside modern climate science as he explains how he evaluated the arguments in Klotzbach et al., our recent paper that he continues to dismiss as "nonsense":

No scientist really knows everything he or she claims to know from direct experience. Most of what we know as individuals comes from two factors: 1) a network of trust and 2) the test of coherence. . .

This is where the first principle cuts in. Should I further investigate the key claim, still contested by the authors? Well, I know James to be an extraordinarily careful and precise thinker, and it's already demonstrated that his opposition is not. Since boundary layer meteorology is not my forte, and since the rest of the paper is flawed in many ways, I feel satisfied that it's best to put my attentions elsewhere.

The main point for present purposes is that I immediately questioned the result claimed by d'Aleo on the basis of its incoherence with everything else I know. And my questioning turned out to be justified. The publication, though it passed peer review, probably should not have done so. It looks like science from a distance, but up close it looks like nonsense.

Remarkably, Michael admits to having no expertise in the subject of our paper and to not understanding the arguments made in it or about it, but because the paper challenges the beliefs that he and his tribe ("coherence network") firmly hold, he concludes that the scientific arguments in our paper must be wrong. He in fact knew this to be the case before even reading our paper. Michael speaks of an "opposition" -- presumably the people in other tribes who hold views different from his own. I guess I am in that "opposition" because someone Michael dislikes (d'Aleo) happened to blog on our paper. Michael reminds us that, after all, another climate scientist in Michael's tribe (James Annan), who also admitted to having no expertise in the area of our paper, said it must be wrong.

Will Michael's or James' critiques of our work appear in the peer reviewed literature? Of course not (because their critiques are off target and simply wrong). But their critiques do serve an important sociological function by reinforcing the tribal network, and they give some comfort to those who evaluate arguments solely by their degree of conformance with views already held. Had Michael been blogging around the time of Copernicus, he would have explained to his readers that the Earth is in fact at the center of the universe, and that this Copernicus guy must be wrong, because Michael and all of his Ptolemaic friends said the Earth was at the center of the universe, so those saying differently must be wrong because their claims do not jibe with his "coherence network."

The troubling thing about this is not that people evaluate arguments based on trust -- we all do, and it is a necessary part of life. What is troubling is that Michael suggests that he has found our paper to be "flawed in many ways," but the basis for this is a felt need for tribal affirmation and a desire not to have his firmly held beliefs challenged. His scientific judgment on our paper is not grounded in the logic, data or analysis found in our paper, which passed peer review and is published in a leading journal in the field. To the extent that Michael's views of scientific arguments are shared among his climate science peers, that community loses its ability to evaluate arguments based on their merits rather than by their putative tribal characteristics. I don't think that all or even most climate scientists think or act this way, but my experience is that enough do to create an unhealthy degree of politicization within the community. Michael is to be applauded for his candor, but what it reveals is not a pretty sight.

So here is an offer to Michael Tobis: Write up a serious critique of our paper's logic, data and analysis and I will publish it prominently here. Prove that our paper is in fact "nonsense" based on science and that your evaluation of it is not simply tribal politics played out on a blog.

27 August 2009

This event at CU next week is mandatory for my students. If you are local, you should come too!

The First 300 Days: An Assessment of Obama's Energy and Climate Policy

Thursday, Sept. 3, 7:00 pm, Wolf Law Building, Wittemyer Courtroom

Free and open to the public

Directions: www.colorado.edu/law/about/visitus.htm

The Obama Administration has identified energy as one of its top policy concerns, with a focus on promoting clean, renewable energy and addressing climate change. During the first months of the new Administration we have seen a departure from the previous Administration’s policies on EPA regulation of CO2, cap and trade legislation, climate change, and renewable energy, among others. Please join the Renewable and Sustainable Energy Institute (RASEI – formerly the CU-Boulder Energy Initiative) and our panel of policy experts as we assess the accomplishments of the Obama Administration in its first 300 (or so) days.

Giant fly-swat shaped "synthetic trees" line the road into the office, where blooms of algae grow in tubes up the walls and the roof reflects heat back into the sky - all reducing the effects of global warming.

All this could be a familiar sight within the next two decades, under proposals devised by the Institution of Mechanical Engineers to alter the world's climate with new technology.

A day after John Prescott, the former Deputy Prime Minister and Environment Secretary, warned that negotiations for a global deal to cut carbon emissions were in danger of collapsing, the institution is recommending a series of technical fixes to "buy time" to avert dangerous levels of climate change.

It says that the most promising solution is offered by artificial trees, devices that collect CO2 through their "leaves" and convert it to a form that can easily be collected and stored. Tim Fox, head of environment and climate change at the institution, said that the devices were thousands of times more effective at removing carbon from the atmosphere than real trees.

In the first report on such geo-engineering by practising engineers, the institution calculates that 100,000 artificial trees - which could fit into 600ha (1,500 acres) - would be enough to capture all emissions from Britain's homes, transport and light industry. It says that five million would do the same for the whole world.

The IME report can be found here. Its recommendations are smart, and not just because they are consistent with my own related work, which you can see here:

The average damage for these storms if they all struck in 2009 is about $6 Billion per storm. However, these storms made landfall at varying strengths, from Tropical Storm to Category Three. Since Danny is projected to be a Category One storm during the time of highest impact to land, we can massage the historical damages to see how much damage the storms would cause if they were all Category One storms. From our 2008 paper in the Natural Hazards Review:

Damage from Category Two = 6x the damage from Category One

Damage from Category Three = 18x the damage from Category One

After making this adjustment, and holding constant the damage from the two Tropical Storms on the list (Alma and Esther), the average damage per storm is about $435 Million. This provides a good initial damage estimate for Danny. Please keep in mind that this is an estimate of TOTAL ECONOMIC DAMAGE, and insured damage usually runs about 50% of the total economic damage. Let’s see how this works out in the coming days.
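The adjustment described above can be sketched as a small calculation. The storm list and damage figures below are hypothetical placeholders (the post's actual storm data are not reproduced here); only the category multipliers come from the text:

```python
# Normalize historical storm damages to Category One equivalents using the
# multipliers from the post (Cat 2 = 6x Cat 1, Cat 3 = 18x Cat 1; tropical
# storm damages held constant).
CAT1_MULTIPLIER = {"TS": 1, 1: 1, 2: 6, 3: 18}

def cat1_equivalent(damage, category):
    """Damage the storm would have caused had it struck as a Category One."""
    return damage / CAT1_MULTIPLIER[category]

# Hypothetical storms: (name, landfall category, 2009-normalized damage, $B).
storms = [("A", 3, 9.0), ("B", 2, 3.0), ("C", "TS", 0.2)]
adjusted = [cat1_equivalent(d, c) for _, c, d in storms]
print(f"average Cat 1-equivalent damage: ${sum(adjusted) / len(adjusted):.2f}B")
```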

"As chairman of the Intergovernmental Panel on Climate Change (IPCC) I cannot take a position because we do not make recommendations," said Rajendra Pachauri when asked if he supported calls to keep atmospheric carbon dioxide concentrations below 350 parts per million (ppm). "But as a human being I am fully supportive of that goal. What is happening, and what is likely to happen, convinces me that the world must be really ambitious and very determined at moving toward a 350 target."

The U.S. Chamber of Commerce, trying to ward off potentially sweeping federal emissions regulations, is pushing the Environmental Protection Agency to hold a rare public hearing on the scientific evidence for man-made climate change.

Chamber officials say it would be "the Scopes monkey trial of the 21st century" -- complete with witnesses, cross-examinations and a judge who would rule, essentially, on whether humans are warming the planet to dangerous effect.

"It would be evolution versus creationism," said William Kovacs, the chamber's senior vice president for environment, technology and regulatory affairs. "It would be the science of climate change on trial."

20 August 2009

I've had several requests for a copy of my talk yesterday given at the ANZIF Claims Conference in Sydney. I've posted a copy here in PDF (warning ~1.0 mb). Comments welcomed. Papers on which it is based can be accessed here.

Recently, I pointed to a pending decision by the Obama Administration on whether to approve a pipeline to bring carbon-intensive petroleum to the U.S. from Canadian oil sands. President Obama has approved the pipeline:

Most of the oil shipped on the line will come from Canadian oil sands producers, which have come under fire from some U.S. environmental groups and legislators for boosting greenhouse gas emissions because of expanding production in the oil sands -- a Florida-sized region of northern Alberta that contains the largest oil reserves outside the Middle East.

The State Department said it took greenhouse gas emissions into account when deciding to issue the permit, saying that the issue is best addressed through the domestic policies of the United States and Canada and through international agreements.

I think what Michael is saying is that there should be consequences for publishing crap.

And he and others are also intimating that you and your Dad's long term reputation might suffer if you continue palming off bullshit that impresses nobody but Rush Limbaugh and a bunch of far right teabaggers.

Seriously, Pielke, what happens if the Republicans don't retake Congress next year? You and Lomborg will never be called as expert witnesses again. No dough, no meeting munchies. You are betting your academic standing on the fate of a party of red-necked greasebags from the ugliest part of the deep South.

D'you guys want to be the Climate Science version of Michael Behe?

I'm not sure where Mr. Murphy gets the idea that I am a Republican (I'm not), nor do I conduct a partisan political calculus before doing my research. This gets more and more amazing. [UPDATE #1: In the comments Michael Tobis explains that the "consequences" that he wishes to see are the diminishment of professional reputation, apparently in order to "dissuade" those who seek to present views that he disagrees with and does not find politically acceptable.]

The publication of Klotzbach et al. offers an opportunity for a sort of real-world sociological experiment in the climate science community. Our paper builds on a large scientific literature and explores a largely unrecognized mechanism that may explain a significant (>10%) amount of the global surface temperature trends. It might prove to be correct, or it might prove wrong. We think that the arguments have merit, and so too did our peer reviewers. Science normally works by advancing hypotheses, collecting data and reporting what you find. Results are published and considered, and this leads to new hypotheses, and the process repeats. Along the way people debate and discuss interesting results, at conferences, at water coolers, in blogs and even in legislatures.

But this is climate science, and the rules here are a little different. Our paper enters a context where many climate scientists are also strong political advocates. This means that they judge papers not simply by their substance but by who they perceive that the results will benefit in the political process. Our paper suggests that global temperature trends are overstated if the surface trends are used to represent trends in the lower atmosphere. If true, this would likely have some relevance for climate modeling, comparisons of surface and satellite temperatures, and understanding possible measurement biases in the surface record -- that is, many topics that people like to debate. However, it does not have a lot of significance for the overall political response to climate change, as I have noted multiple times.

The paper has been received as interesting: immediately upon its release we have been engaged in a public colloquy with several climate scientists, including Gavin Schmidt, James Annan and Michael Tobis, who have tried everything possible to discredit our paper and its authors, ranging from snark to quick-and-dirty analyses of tangential aspects of the paper to semantic argumentation. In both science and politics such behavior is of course business-as-usual, and over the long term science is largely self-correcting. So such behavior might be boorish, but it is nothing really to complain about, as it is fairly normal and expected.

But our colloquy with climate scientists has taken a disturbing turn, as one of these scientists (Tobis) has now elevated his various complaints to a thinly veiled threat against the authors based on what he sees as the political implications of our paper. He writes (emphasis in original):

If there were no policy issues at stake, if the modest and dubious results of the paper weren't being egregiously overvalued and misrepresented, the scientific community could proceed in the ordinary dignified fashion of ignoring nonsense and focusing on sense. But with the standard Pielke to Watts to Morano to Inhofe play and its like, we are forced to pay attention.

Regarding getting out of this very unfortunate flavor of time sink, one thing I can imagine is to have gradations of "peer review" more complex than "published" or "unpublished". Inhofe shouldn't be waving something like this around in the senate claiming it meets the highest standards.

Even harder, but more urgent, is to have a mechanism to prevent authors from promoting public misinterpretations of their publications. That sort of behavior should have consequences.

One wonders, what sort of "consequences" should I and my co-authors be subject to? After I patiently engaged Tobis on his web site and here in a sincere effort to discuss our work, he decides to issue a threat? Climate science is pathological indeed. [I was unaware that Inhofe was waving our paper around as Tobis claims; can someone provide evidence of this?]

One can fully understand that if a set of collaborators who collectively have probably thousands of peer reviewed publications are met with not just derision but a threat of consequences upon publishing and discussing a peer-reviewed paper building on years of work, this will create a poisonous atmosphere for just about everyone in climate science. This is probably the point.

Let me emphasize that the vast majority of climate scientists are decent, hard-working people, and I count very many as very good friends. However the actions of a few cast a dark shadow on the entire community, and this will continue until the community starts to self-police the bad behavior of its most publicly visible voices.

Threats have no place in public discussions involving climate scientists.

19 August 2009

Most developing countries stubbornly resist western admonitions on the need to cut carbon emissions. Until recently, that included China, but signs from what is now the world’s largest emitter suggest a cultural revolution is afoot in its attitude to climate change policy.

For the first time, two senior climate change officials, Yu Qingtai and Su Wei, have left open the possibility that China will plan for an eventual peak in emissions. “Emissions will not continue to rise beyond 2050,” said Mr Su.

The FT says with an apparent straight face that:

As a quantitative measure of China’s intention to help fight climate change, the statement fails to overwhelm.

China's emissions have been increasing at around 8% per year. If China can somehow cut this rate in half and maintain it until 2050, then China's emissions in 2050 will still exceed the total global emissions in 2009.
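That claim is easy to check with compound-growth arithmetic. The 2009 emissions levels below are rough approximations assumed for illustration (roughly 7 GtCO2 for China and 30 GtCO2 globally), not figures from the post:

```python
# Rough check: if China's ~8%/yr emissions growth is halved to 4%/yr and
# sustained to 2050, do its 2050 emissions exceed total global emissions
# in 2009? The 2009 levels are approximations assumed for illustration.
china_2009 = 7.0    # GtCO2, approximate
world_2009 = 30.0   # GtCO2, approximate
growth = 0.04       # half of ~8%/yr

china_2050 = china_2009 * (1 + growth) ** (2050 - 2009)
print(f"China 2050: {china_2050:.1f} GtCO2 vs world 2009: {world_2009:.1f} GtCO2")
```

Under these assumptions China's 2050 emissions come to roughly 35 GtCO2, above the assumed ~30 GtCO2 global total for 2009, consistent with the claim.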

Former US Senator Tim Wirth (D-CO) and one of the original legislators calling for action on climate change in the 1980s bravely states the obvious on the legislation now being considered in the Senate:

Cap-and-trade legislation to limit U.S. carbon dioxide emissions has “gotten out of control” and needs to be scaled back in Congress, said former Democratic Senator Timothy Wirth.

“The Republicans are right -- it’s a cap-and-tax bill,” Wirth, a climate-change negotiator during President Bill Clinton’s administration, said in an Aug. 14 interview. “That’s what it is because they are raising revenue to do all sorts of things, especially to take care of the coal industry, and it makes no sense.”

A system to cap carbon emissions and then create a market for the trading of pollution allowances is the centerpiece of President Barack Obama’s proposal to fight global warming. Wirth, who helped craft a successful emissions-trading market two decades ago that cut sulfur-dioxide pollution causing acid rain, is among Democrats questioning House-passed legislation set to be taken up next month in the Senate.

“I’m not critical of cap-and-trade,” said Wirth, head of the UN Foundation, a philanthropy established in 1998 with $1 billion from media mogul Ted Turner. “But it has to be used in a targeted and disciplined way, and what has happened is it’s gotten out of control.”

From climate modeler James Annan, "an interesting paper which, if correct, helps to align the satellite and surface temperature trends . . . all he seems to have shown is that his preferred metric is even less useful than it at first appeared."

Hmmm . . . I haven't heard anyone yet get to this stage. Perhaps someone will argue that the observational divergence that we document is "consistent with" the behavior of the models based on some enormous spread of heretofore unrecognized uncertainties. But where might that perspective come from? I wonder . . . ;-)

. . . the growing number of presidential disaster declarations is another warning sign that global warming is already harming our people and economy.

The Weiss and Goad article is just wrong. It perpetuates a myth that is not supported by any research, but is allowed to persist by the supine mainstream scientific community and an overly credulous media. Ultimately the joke is on CAP because making arguments that are demonstrably untrue is a gift on a silver platter for opponents of cap and trade legislation. Even if the legislation were implemented and actually reduced emissions, it would have no impact on presidential disaster declarations. Presidential disaster declarations have not increased due to global warming, but rather, because of policy change and political decisions.

What is a "presidential disaster declaration"? When a disaster occurs such as an earthquake, hurricane or even a terrorist attack, the governor of a state can ask the federal government for assistance. The president can approve or decline the request, and when approved he issues a "disaster declaration." In recent decades the number of presidential disaster declarations has increased. Weiss and Goad explain that this increase is due to global warming and suggest that we can stem the increase by passing cap and trade legislation.

The Senate must promptly follow the House’s leadership by passing a clean-energy and global warming pollution reduction bill. Inaction or inadequate pollution reductions by the government would allow natural disasters in the United States to amplify in scale and frequency.

In 2001 Mary Downton and I published a paper that looked at this exact question in the context of floods.

In that paper we looked at climate, damage and politics, and guess which one was responsible for the overall increase in declarations?

Disaster relief legislation since 1950 has consistently tended to expand the scope of disaster responses available to the president. . . . The [1988] change in FEMA’s mission may well have contributed to a spiraling increase in both requests and approvals of disaster declarations during the Clinton Administration. The risk-reduction mission creates an incentive for FEMA to recommend approval of declarations because, once a disaster is declared, FEMA can more easily influence local redevelopment planning and mitigation efforts. As more marginal events receive disaster designation, states are likely to apply for declarations in other marginal events, encouraged by seeing an increased likelihood of approval.

Interestingly we found that disaster declarations increased by 50% during years that the president was running for re-election. And no we did not identify a new atmospheric oscillation on 4-year timescales (except in the years without an incumbent on the ballot). So what about climate?

Although there is evidence of increasing precipitation in the United States, there is no evidence that this is the primary cause of the increase in disaster declarations. By invoking changes in weather, officials divert attention from the role of population growth, floodplain development, national policies, and presidential discretion in contributing to trends in federal disaster costs related to floods.

None of this should be a surprise. A recent CCSP report on extremes in the United States found no long-term trends in those phenomena that lead to most disaster declarations:

1. Over the long-term U.S. hurricane landfalls have been declining.

2. Nationwide there have been no long-term increases in drought.

3. Despite increases in some measures of precipitation, there have not been corresponding increases in peak streamflows (high flows above 90th percentile).

4. There have been no observed changes in the occurrence of tornadoes or thunderstorms

5. There have been no long-term increases in strong East Coast winter storms (ECWS), called Nor’easters.

6. There are no long-term trends in either heat waves or cold spells, though there are trends within shorter time periods in the overall record.

Scientists do indeed predict that there will be more extreme events in the future. However, to date there is no justification for attributing the increasing costs of disasters or the number of disaster declarations by the president to the emission of greenhouse gases. Convincing arguments for mitigation and adaptation policies can be made without having "to find ways to exaggerate the threat."

17 August 2009

Gavin Schmidt of NASA and RealClimate kindly sent us some comments about Klotzbach et al. having to do with one part of our analysis related to the tropospheric amplification factors that we use in the paper. Below you can find our exchange (I am having some trouble getting the figures to show, but will figure it out). Schmidt's comments are welcomed as they confirm the robustness of our findings to model uncertainties. We have invited Gavin to join us as a co-author in a new short analysis presenting these additional results.

Dr. Klotzbach, I read your new paper (in press at JGR-A) with some interest. In it you make use of the expected amplification of the MSU-LT data over surface temperature data by a factor of about 1.25. This number comes from global calculations across the AR4 models reported in CCSP and, as you know, is related mainly to the expected tropical amplification of surface warming over the oceans.

However, I am puzzled by your claim in the paper that the same amplification number holds for the metrics calculated over land only. The reference for that is a personal communication from Ross McKitrick, who is (surprisingly) a source for the behaviour of the GISS model (that I run). Prof. McKitrick is not one of our collaborators (as far as I am aware) and has no privileged access to the model output. Since MSU diagnostics were not part of the CMIP3 archive, I would be highly surprised if he were able to have calculated these diagnostics himself. (They are not complicated, but it does take some effort.)

It is possible that he is using some supplemental data I placed online (in relation to Schmidt, 2009; http://pubs.giss.nasa.gov/abstracts/inpress/Schmidt.html). However, this SAT and MSU data are a particular sub-sample of points from the GISS model and were very restricted in scope and purpose. In fact, I do not think that these data can be used to calculate the diagnostic you want.

In a transient simulation with land temperatures rising faster than the global mean, the moist adiabat in the tropics is tied mostly to the ocean temperatures. Noting also the fact that most of the land is not in the tropics, I would have expected the amplification to be substantially less over land than globally.

To test this, I took the GISS-ER results from 1979-2005 (20C3M runs, five ensemble members) and calculated the global, ocean and land averages (using the model's landmask) for the surface air temperature and the pseudo-MSU-LT diagnostics. As might be expected, the land temperatures rise faster than the global mean or ocean values (0.26 deg C/dec vs. 0.17 deg C/dec and 0.14 deg C/dec). For the annual values (as you use in your paper), I then calculated the expected amplification using a linear regression:

The global average amplification is indeed near 1.25, but the value over ocean is significantly higher, and the value over land significantly less. Indeed, there is no expected amplification at all!

I attach two figures - one showing the transient behaviour of these measures in a particular simulation, and the second emulating your figure 1, but using the model diagnostics.

Possible reasons for the discrepancy with Prof. McKitrick's communication might lie in what land mask is being used or some issue related to area weighting or the sampling. However, my calculation above is certainly more complete and I think more relevant.

Given the potential importance of this for your paper, I thought it best to notify you as soon as possible. If you would like to check these calculations on your own, please let me know and I will place the raw data on our ftp server. If you would prefer a calculation that might be more specifically tied to the land mask you are using for your averaging, please let me know what that is and I will update my calculation accordingly.

Regards,

Gavin

And here is the response of Klotzbach et al.

Dear Gavin,

Thank you very much for your thorough and informative note that you sent us on Friday. We appreciate the comments.

We first note that your comments relate specifically to the amplification factors currently present in several realizations of a version of the NASA GISS model. In your comments you do not dispute our main conclusion: that there is significant disagreement between the observational satellite and surface temperature datasets, especially over land areas (which obviously is independent of uncertainties within or across models), and that sampling the temperature near the ground, as a means to estimate temperature trends through a deeper layer of the atmosphere, introduces a bias in that context. The use of a global average surface temperature trend that includes that surface data therefore overstates the magnitude of climate system heat changes. Your comments provide a welcome confirmation that our analysis is robust to model uncertainties.

Thanks for giving us the newly-calculated amplification factors. We have repeated all of our calculations using the amplification factors that you provided. Although this changes the magnitude of the linear trends, the statistical significance of the differences in trends is only minimally altered. The difference in trends over land between the Hadley Centre and RSS is no longer statistically significant at the 95% level; however, all other differences between ocean and global trends are now significant using the amplification factors that you provided. The new numbers that you have given us provide additional evidence that there are issues remaining to be resolved in reconciling the surface ocean and satellite tropospheric ocean measurements. However, as your analysis helps show, this issue goes well beyond the scope of our paper, which focused on temperatures over land.

Table 1 provides the linear trends using the amplification factors that you provided on Friday along with the original amplification factors in our “in press” paper. Figures 1-6 summarize the temperature trends and compare them with the new amplification factors, in a similar manner to the way that we made the calculations in our “in press” paper.

Table 1. Global, land, and ocean per-decade temperature trends over the period from 1979-2008 for an assumed 1.25 amplification factor over the globe, a 0.95 amplification factor over land and a 1.47 amplification factor over the ocean. Included in parentheses are global, land, and ocean per-decade temperature trends over the period from 1979-2008 for an assumed 1.2 amplification factor as calculated in our “in press” paper. Differences are calculated for the NCDC surface analysis – UAH lower troposphere analysis, for the NCDC surface analysis – RSS lower troposphere analysis, for the Hadley Centre surface analysis – UAH lower troposphere analysis and for the Hadley Centre surface analysis – RSS lower troposphere analysis. Trends that are statistically significant at the 95% level are highlighted in bold face.

We have also spoken to Ross McKitrick with regards to the calculations he supplied us, using the model output that you had earlier provided. Specifically, he made his calculations from the five runs and ensemble mean that were released with your IJOC paper that you referenced in your email of Friday morning. He calculated these ratios over the 440 grid cells that were available from the period between 1979-2002 which match with the data available from the HadCRUT3 dataset (Figure 7). We accept your offer to put your raw data on an FTP site. In order to replicate your calculations we will need the monthly temperatures since 1979 for all grid cells at the surface and lower troposphere levels. Please send us the FTP address as soon as these have been posted. Thank you very much.

Figure 7: Grid cells for which data is available from CRU over the period from 1979-2002.

Also, have any other GCM groups computed similar land/ocean/entire globe amplification factors? I think a comparison of your result with that of other modeling groups would certainly be of interest to the climate community at large, especially to identify differences across models. The information that you provided reminds us that there remains a very wide range of possible observations that might be judged to be consistent with the very large range of outputs from even a single family of GCMs.

Because the analysis that you have provided represents a useful extension of our original analysis, and strongly shows that it is robust to large model uncertainties, we invite you to join us as a co-author on a short piece along the lines of this response that integrates your initial comments with the additional material presented here.

It sounds like the title of a Robert Ludlum novel or a chess strategy, but it may be a description of where US cap and trade legislation is headed. Quick on the heels of a failed vote last week on its proposed cap-and-trade legislation in the Australian Senate, the Labor government has split its renewable energy legislation from its cap and trade program:

Labor has backed down on its hard line on emissions trading and will split the legislation allowing a vote on its renewable energy target as early as this week.

The decision breaks the deadlock over the Government's 11 bills introducing an emissions trading scheme (ETS), which were defeated in Parliament last Thursday and are not due to be reintroduced until November.

With pressure for a split in the bills from the renewable energy industry, the Opposition, the Australian Greens and independent Senator Nick Xenophon have forced the Government's hand to give the industry certainty by setting the renewable energy target (RET).

The Opposition and the Government have already started negotiating amendments to the RET and industry has welcomed the backdown.

Clean Energy Council chief executive Matthew Warren said an expanded RET had overwhelming public support and, together with other energy efficiency measures, would "unleash" $28 billion of new investment and 28,000 new jobs over the next decade.

"We welcome this important step towards delivering the RET bill by the end of the week. We need the RET passed in three days, not three months," Mr Warren said.

As recently as Friday, the Government was adamant the two could not be separated because they share complementary compensation packages - even though the ETS will not begin until mid-2011 while the RET starts in January.

Climate Change Minister Penny Wong announced the change of mind today, saying it was "plan B" and a less than perfect way of dealing with the carbon pollution reduction scheme (CPRS), or ETS, and the RET.

Australia may yet again revisit its cap and trade program in the legislative process as the threat of double dissolution hangs over the heads of the opposition. However, splitting the bills likely guarantees that Australia has something positive to take to Copenhagen rather than showing up under the cloud of last week's Senate vote.

The U.S. Senate should abandon efforts to pass legislation curbing greenhouse-gas emissions this year and concentrate on a narrower bill to require use of renewable energy, four Democratic lawmakers say.

“The problem of doing both of them together is that it becomes too big of a lift,” Senator Blanche Lincoln of Arkansas said in an interview last week. “I see the cap-and-trade being a real problem.”

The resistance by Lincoln and her Senate colleagues undercuts President Barack Obama’s effort to win passage of legislation that would cap carbon dioxide emissions and establish a market for trading pollution allowances, said Peter Molinaro, the head of government affairs for Midland, Michigan- based Dow Chemical Co., which supports the measure.

“Doing these energy provisions by themselves might make it more difficult to move the cap-and-trade legislation,” said Molinaro, who is based in Washington. “In this town if you split two measures, usually the second thing never gets done.”

The House passed cap-and-trade legislation in June.

Leaders of the Democratic-controlled Senate say they are sticking with their plan to combine a version of that bill with a separate measure mandating energy efficiency and the use of renewable sources such as solar and wind power. The legislation also provides for an extension of offshore oil and gas drilling in certain areas, broadening its support.

Reid’s Comment

“I don’t think we are going to take to the Senate floor a bill stripped of climate provisions,” Senate Majority Leader Harry Reid, a Democrat from Nevada, told reporters in Las Vegas on Aug. 11.

The Senate Energy and Natural Resources Committee passed the renewable-energy legislation, 15-8, in June. Reid has set a deadline of Sept. 28 for committees to complete work on climate-change provisions.

“We should separate the energy bill from the climate bill,” Conrad told reporters this month. “It needs to be done as soon as we can get it done,” he said, referring to the energy legislation.

For cap and trade supporters in the US, the problem with splitting the bills will be that some of the renewable energy provisions are key to getting support for the overall bill. Splitting them could very well reduce support for cap and trade. And of course, there is no threat of double dissolution in the U.S., though one-and-a-third dissolution happens every two years like clockwork. The Australian gambit probably won't work for U.S. cap and trade supporters, and I'd bet those playing this card in the U.S. know that quite well.

The low damage was Donna in 1996 with $590M and the high was 1915 Galveston with $75.6B. And of course there is still the possibility that Ana could miss the US altogether. If you want to examine the historical storms in greater depth you can visit the site, where you can also export their tracks to Google Earth. You can also display the official NHC forecast cone as well as the tracks from individual models.

14 August 2009

The summer season still has a ways to go, but here in academia summer is just about over. That means that the pace of blog postings here will slow and it may take a bit longer for your comments to pass moderation. Thanks for understanding ;-)

UPDATE: Eric Steig writes to let me know that Prof. Huston McCulloch has withdrawn his accusation of plagiarism and has asked me to announce this here, which I am happy to do. Steig says that a test of my integrity is whether I will mention this prominently here. I offered to Steig to post his complete email to me under its own thread. Haven't heard back, but hopefully that will pass the test.

McCulloch is a bit incredulous, as he explains here. McCulloch writes that Steig indicates that "none of the 6 authors learned of the error from my post" at the time that they submitted the correction to Nature, despite the fact that McCulloch sent them an email at the time. And so there is little point in further debating this issue here, as it has left the realm of the empirical. You either believe Steig et al. or you don't, to which McCulloch says "I can only take him at his word." Accordingly, I'll close comments on this thread.

2. Their error was in the application of statistics, not basic statistical principles.

3. Publication on a blog does not count, and thus presumably, is ripe for appropriation. It is being first to the peer-reviewed literature that matters.

Here is the Real Climate defense in their words:

In this case, McCulloch’s comment on the paper were perfectly valid, but he chose to avoid the context of normal scientific exchange — instead posting his comments on ClimateAudit.org — and then playing a game of ‘gotcha’ by claiming plagiarism when he wasn’t cited.

McCulloch accuses Steig et al. of appropriating his ‘finding’ that Steig et al. did not account for autocorrelation when calculating the significance of trends. While the published version of the paper didn’t include such a correction, it is obvious that the authors were aware of the need to do so, since in the text of the paper it is stated that this correction was made. The corrected calculations were done using well-known methods, the details of which are available in myriad statistics textbooks and journal articles. There can therefore be no claim on Dr. McCulloch’s part of any originality either for the idea of making such a correction, nor for the methods for doing so, all of which were discussed in the original paper. Had Dr. McCulloch been the first person to make Steig et al. aware of the error in the paper, or had he written directly to Nature at any time prior to the submission of the Corrigendum, it would have been appropriate to acknowledge him and the authors would have been happy to do so. Lest there be any confusion about this, we note that, as discussed in the Corrigendum, the error has no impact on the main conclusions in the paper.

The reply to this is obvious.

1. In academia it is not who thinks of an idea first that matters. It is who publicizes it first, whether in a talk, at a conference, in a draft paper, in a published paper, or yes, on a blog. An approach based only on being first to the peer-reviewed literature makes a pretty sad statement about the morality of modern academia.

2. This is just a silly claim that adds words but no substance to their defense. McCulloch was not claiming originality in either statistics or the need to apply an autocorrelation adjustment, simply that the authors had done so incorrectly. This is a red herring.

3. Real Climate authors presumably wouldn't steal an idea presented at a departmental seminar, and ideas presented on popular blogs should be treated no differently. I completely reject the implication from Real Climate that stealing ideas from blogs is acceptable because they are not made in the "context of normal scientific exchange." Sorry guys, it's 2009, and blogs are part of the "context of normal scientific exchange."

Overall, a pretty poor set of excuses for not doing the right thing in the first place.

Richard Tol has written the analysis paper (PDF) on reducing carbon dioxide emissions for the Copenhagen Consensus exercise on climate change. Here is his abstract:

The impact of climate change is rather uncertain. Available estimates suggest that the welfare loss induced by climate change in the year 2100 is in the same order as losing a few percent of income. That is, a century worth of climate change is about as bad as losing one or two years of economic growth. The impact of climate policy is better understood. A clever and gradual abatement policy can substantially reduce emissions (e.g., to stabilise greenhouse gas emissions at 650 and 550 ppm CO2eq) at an acceptable cost (1 or 2 years of growth out of 100, respectively). Very stringent targets (e.g., the 2ºC of the EU) may be very costly, however, or even infeasible. Suboptimal policy design would substantially add to the costs of emission abatement.

For the Copenhagen Consensus on Climate 2009, this paper considers five alternative policies for carbon dioxide emission reduction. The alternatives differ in scope and intensity only. All five alternatives implement a uniform carbon tax, as that is the cheapest way to reduce emissions. The first policy spends $2.5 trillion on emission reduction in the OECD before 2020. This is rather silly. The benefit-cost ratio is less than 1/100. The second policy spends $2.5 trillion across the world before 2020. This is less silly because non-OECD emission reduction is a lot cheaper, but the benefit-cost ratio is still only 1/100. The third policy continues the same intensity of climate policy between 2020 and 2100. Most negative impacts of climate change are avoided by this policy, but the costs are so large that the benefit-cost ratio is only 1/50. In the fourth policy, $2.5 trillion is invested in a trust fund to finance emission reduction over the century. The benefit-cost ratio is 1/4. In the fifth policy, the trust fund is twenty times as small. The benefit-cost ratio is 3/2. In this policy, a tax of $2/tC is imposed in 2010 on all emissions from all sources in all countries; the tax rises with the rate of discount.

As the analysis ignores uncertainty and equity, one may argue for a more stringent climate policy. However, the analysis also ignores suboptimal implementation, which argues for a more lenient climate policy.

Onno Kuik wrote one perspective (response, PDF), and here is the abstract:

This paper discusses the estimated benefit-cost ratios on mitigation as a solution to climate change. We are in agreement with most of what is written by Richard Tol on the state of the art of economic research into the impacts of climate change and climate change policies, but we highlight a complementary approach that is based on a direct elicitation of (revealed or stated) preferences for climate change. With respect to the reported benefit-cost ratios, this paper argues that they are a bit low because, first, they do not reflect the substantial concerns about equity and uncertainty; and second, because a substantial part of the benefits (after the year 2100) is not accounted for.

Roberto Roson wrote the other response (PDF), and here is its abstract:

The purpose of this paper is to critically review Richard Tol’s Assessment Paper on Traditional Mitigation, prepared for the Copenhagen Consensus Centre.

The Assessment Paper is largely based on the FUND model and the results of a set of simulation exercises, where a number of policy options are explored and assessed. In this Perspective Paper, a series of limitations of the FUND model are pointed out, as well as some other points, which remain quite obscure and limit the interpretation of the results. However, when considering the simulation scenarios, it is possible to make some general remarks, which are confirmed by the model results and bring one to think that we could have got about the same findings with a different model. In other words, we can trust the results even if we do not (completely) trust the model. It is suggested that it is important to look beyond the simple assumptions used in a model like FUND, to consider more realistic settings, in which incentives may play a key role.

Not a day goes by without my reading something I cannot believe has been said in the debate over global warming. It makes blogging easy, but it surely does not help the case of climate policymaking. In an interview, Nobel Prize-winning economist Thomas Schelling explains to The Atlantic why politicians need to exaggerate the threat of global warming and why he hopes for massive disasters.

When asked how policies get put in place that mainly benefit people far into the future he explains that:

It's a tough sell. And probably you have to find ways to exaggerate the threat. And you can in fact find ways to make the threat serious. I think there's a significant likelihood of a kind of a runaway release of carbon and methane from permafrost, and from huge offshore deposits of methane all around the world. If you begin to get methane leaking on a large scale -- even though methane doesn't stay in the atmosphere very long -- it might warm things up fast enough that it will induce further methane release, which will warm things up more, which will release more. And that will create a huge multiplier effect, and it could become very serious.

Later Schelling is asked to clarify that comment and the reporter, Conor Clarke, expresses some sympathy with Schelling's views (bold is the reporter):

And when you say, "exaggerate the costs" do you mean, American politicians should exaggerate the costs to the American public, to get American support for a bill that will overwhelmingly benefit the developing world?

[Laughs] It's very hard to get honest people.

Well, part of me sympathizes with the case for disingenuousness! I mean, it seems to me that there is a strong moral case for helping unborn Bangladeshi citizens. But I don't know how you sell that. It's not in anyone's rational interest, at least in the US, to legislate on that basis.

That's a problem. The standard of living in the United States will almost certainly be higher in 80 years than it is now.

Schelling ends the interview expressing a wish for more disasters:

But I tend to be rather pessimistic. I sometimes wish that we could have, over the next five or ten years, a lot of horrid things happening -- you know, like tornadoes in the Midwest and so forth -- that would get people very concerned about climate change. But I don't think that's going to happen.

An analysis has been released by the University of California Berkeley's Center for the Study of Energy Markets by Christopher Knittel titled "The Implied Cost of Carbon Dioxide Under the Cash for Clunkers Program" (PDF). Here is its conclusion:

Cash for Clunkers remains an expensive way to reduce greenhouse gases even when we "credit" for criteria pollutants. Table 2 reports the results using the parameters from the base case. The implied cost of carbon is $516, $365 and $269 for three, four and five year scrappage time, respectively. If we increase the social costs of each pollutant by 50 percent, the implied cost of carbon remains above $237 per ton. This is the lower bound of the estimates in this note. . .

The Cash for Clunker program is both a stimulus and environmental program. In this note, I calculate the implied cost of greenhouse gas emission reductions and find that they exceed those estimates from the Waxman-Markey bill by nearly tenfold.

On that last point the paper says:

Another way to interpret the savings in greenhouse gases is to ask how much more fuel efficient would the new vehicles have to be for the program to be cost effective? The CBO recently projected the allowance prices under the Waxman-Markey cap and trade program would be $28 per ton. At any reasonable scrappage rate, the cost per carbon under the CfC program exceeds this tenfold. Indeed, even if the new cars were greenhouse gas free, the clunkers would have had to have been driven 12,000 miles per year for over 20 years if not for the CfC program. This is not surprising once the simple calculations are done. At 16.3 miles per gallon, driven 12,000 miles, the clunkers consume 736.20 gallons per year, thereby emitting 7.36 tons of carbon dioxide per year. At an average CfC rebate of $4,200, the program must save 150 tons per vehicle to have an implied carbon price of $28. Once the greenhouse gases of the new vehicles are considered, the clunkers would have had to have been on the road for nearly 60 years, even with no rebound or adverse selection.
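The arithmetic in the quoted passage is easy to verify. Here is a minimal sketch in Python; the emission factor of 0.01 tons of CO2 per gallon is inferred from the paper's own numbers (736.20 gallons to 7.36 tons) and should be treated as an assumption:

```python
# Back-of-the-envelope check of the Cash for Clunkers numbers quoted above.
# Assumption: an implied emission factor of ~0.01 tons CO2 per gallon,
# inferred from the paper's 736.20 gallons -> 7.36 tons figures.

MPG = 16.3                  # clunker fuel economy, miles per gallon
MILES_PER_YEAR = 12_000     # assumed annual mileage
TONS_CO2_PER_GALLON = 0.01  # inferred emission factor (assumption)
REBATE = 4_200              # average CfC rebate, dollars
ALLOWANCE_PRICE = 28        # projected Waxman-Markey price, $/ton

gallons_per_year = MILES_PER_YEAR / MPG                  # ~736.2 gallons
tons_per_year = gallons_per_year * TONS_CO2_PER_GALLON   # ~7.36 tons

# Tons that must be saved for the rebate to pencil out at $28/ton
required_tons = REBATE / ALLOWANCE_PRICE                 # 150 tons

# Years a clunker would have to stay on the road, ignoring the
# replacement vehicle's own emissions
years_needed = required_tons / tons_per_year             # ~20.4 years

print(round(gallons_per_year, 1), round(tons_per_year, 2),
      round(required_tons), round(years_needed, 1))
```

This reproduces the "over 20 years" figure in the quote; accounting for the new vehicle's emissions stretches the required lifetime further, toward the paper's nearly 60 years.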

Politicians say we should invest in helping vulnerable people adapt to climate change. But how should we spend the money?

According to two UK researchers we can best help people in the developing world by funding climate modelers in the rich world:

The British Prime Minister Gordon Brown recently proposed establishing a fund of $100 billion, contributed by the wealthiest nations, to help the most vulnerable countries adapt to climate change. . . Wisely planning how the funds generated by the Prime Minister's recent proposal should be invested therefore needs good scientific guidance. In our view, this can be best achieved by climate models providing highly accurate localised predictions. As a result of the significant scientific effort to date, aided by public concern, models simulating climate change have gained considerable skill. . . There will be many scientific and technical challenges along the way, but the hope is that simulations of the global environment will be able to maximise the number of people around the world who can adapt to, and be protected from the worst impacts of, global warming.

Given the deep uncertainties involved in climate prediction (and even more so in the prediction of climate impacts) and given that climate is usually only one factor in decisions aimed at climate adaptation, we conclude that the ‘predict and provide’ approach to science in support of climate change adaptation is significantly flawed. Other areas of public policy have come up with similar conclusions (for example, earthquake risk, national security, public health). We therefore argue that the epistemological limits to climate prediction should not be interpreted as a limit to adaptation, despite the widespread belief that it is. By avoiding an approach that places climate prediction (and consequent risk assessment) at its heart, successful adaptation strategies can be developed in the face of this deep uncertainty. We suggest that decision-makers systematically examine the performance of their adaptation strategies/policies/activities over a wide range of plausible futures driven by uncertainty about the future state of climate and many other economic, political and cultural factors. They should choose a strategy that they find sufficiently robust across these alternative futures. Such an approach can identify successful adaptation strategies without accurate and precise predictions of future climate.

These findings have significant implications for science policies as well. At a time when government expects decisions to be based on the best possible science (evidence-based policy-making), we have shown that the science of climate prediction is unlikely to fulfil the expectations of decision-makers. Overprecise climate predictions can potentially lead to bad decisions if misinterpreted or used incorrectly. From a science policy perspective it is worth reflecting on where science funding agencies should focus their efforts if one of the goals is to maximize the societal benefit of science in society. The recent World Modelling Summit for Climate Prediction called for a substantial increase in computing power (an increase by a factor of 1000) in order to provide better information at the local level. We believe, however, that society will benefit much more from a greater understanding of the vulnerability of climate-influenced decisions to large irreducible uncertainties than in seeking to increase the accuracy and precision of the next generation of climate models.
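The "sufficiently robust across alternative futures" idea quoted above has a standard decision-analytic form: minimax regret. Here is a toy sketch; all strategy names, scenario names, and cost figures are invented for illustration only, not drawn from any cited study:

```python
# Toy illustration of choosing an adaptation strategy that is robust
# across plausible futures, rather than optimal for one predicted future.
# All names and numbers below are hypothetical.

# costs[strategy][scenario]: total cost under each plausible future
costs = {
    "sea_wall":   {"dry": 9, "moderate": 5, "extreme": 4},
    "relocation": {"dry": 7, "moderate": 6, "extreme": 6},
    "do_nothing": {"dry": 1, "moderate": 8, "extreme": 20},
}
scenarios = ["dry", "moderate", "extreme"]

# Regret of a strategy in a scenario: its cost minus the best
# achievable cost in that scenario.
best = {s: min(costs[k][s] for k in costs) for s in scenarios}
max_regret = {k: max(costs[k][s] - best[s] for s in scenarios)
              for k in costs}

# Minimax regret picks the strategy whose worst-case regret is smallest,
# without needing to know which scenario will actually occur.
robust_choice = min(max_regret, key=max_regret.get)
print(max_regret, robust_choice)  # -> "relocation" (worst-case regret 6)
```

Note that "do_nothing" is cheapest in the dry future but catastrophic in the extreme one, while the robust choice is never far from the best outcome in any scenario; no precise climate prediction was required to rank the strategies.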

13 August 2009

UPDATE: I still find this hard to believe; is it possible that Mann has mislabeled his data files such that the smoothed data appears in the annual predictions column in his data file, rather than the raw counts? I find it hard to believe that it is otherwise the case.

I was curious how the curve shown in Mann et al. discussed earlier today would look using adjusted data, and thanks to Michael Mann the data is up online allowing a comparison with data adjusted according to work in 2007 by Landsea (i.e., it doesn't include the analysis from Landsea et al. released this week).

I graphed (above) the adjusted data (red curve) along with Mann et al.'s "predicted" historical data (blue curve, based on the Landsea data) both unsmoothed, just to see what it looks like -- using information from these files at Mann's directory:

I now see why Mann claims that the Landsea adjustment does not matter. And he is right, it does not matter.

The Mann et al. historical predictions range from a minimum of 9 to a maximum of 14 storms in any given year (rounding to nearest integer), with an average of 11.6 storms and a standard deviation of 1.0 storms (!). The Landsea observational record has a minimum of 4 storms and a maximum of 28 with an average of 11.7 and a standard deviation of 3.75. I suspected that a random number generator for hurricane counts since 1870 would result in the same bottom-line results, and when I appended a series of random numbers constrained between 9 and 14 from 1870-2006 to the "predicted" values, lo and behold --- 20th century values exceed every other point except about 1,000 years ago.

Mann et al.'s bottom-line results say nothing about climate or hurricanes, but rather reflect what happens when you connect two time series with dramatically different statistical properties. If Michael Mann did not exist, the skeptics would have to invent him.
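The random-number exercise described above can be reproduced in a few lines. This is a minimal sketch using only the bounds and summary statistics quoted in the post; the uniform random series is just a stand-in for the reconstruction, not a claim about its actual distribution:

```python
import random

random.seed(42)  # for reproducibility

# Stand-in for the low-variance "predicted" series: integer storm counts
# bounded between 9 and 14, one per year from 1870-2006, as in the post.
predicted = [random.randint(9, 14) for _ in range(137)]

# Quoted properties of the Landsea observational record: min 4, max 28,
# standard deviation ~3.75. Its extremes dwarf anything the bounded
# random series can produce.
OBSERVED_MAX = 28

assert all(9 <= x <= 14 for x in predicted)
assert max(predicted) < OBSERVED_MAX  # the bounded segment can never
                                      # compete with the observed peaks
print(min(predicted), max(predicted), OBSERVED_MAX)
```

Because the stand-in series can never exceed 14 while the observed record reaches 28, splicing the two together guarantees that recent observed values "tower over" the reconstruction, which is exactly the visual effect the post describes.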

In 2007 Michael Mann and colleagues published a paper (PDF) critical of work suggesting an undercount in storms from historical records, claiming that it was “perilous” to assume that there is a “fixed” relationship between landfalling and total hurricanes in the Atlantic basin:

Of course, estimation of undercount based on the assumption of a fixed relationship between total TC counts and the number of landfalling storms is perilous. Such an approach assumes, in particular, that the large-scale atmospheric steering which determines the trajectories of TCs once they’ve formed is constant, when there is in fact strong evidence that it is highly variable over time . . .

Now Mann and another set of colleagues (PDF) make what appears to be the exact opposite assumption in a paper just out in Nature, that landfalls are “in rough proportion” to overall basin activity:

We compared the sediment-based record against the above statistical estimate of basin-wide tropical cyclone activity (Fig. 3), guided by a working assumption that an appropriately weighted composite of regional landfalling hurricane activity varies, at multidecadal and longer timescales, in rough proportion to basin-wide tropical cyclone activity.

What is troubling is that the analysis in the second paper depends to some degree upon the first (that is, if there is a significant undercount, which Mann dismisses, then the nature of the relationships used in the second paper changes). I note that the recent paper shows a dramatic uptick in storm activity that has been convincingly refuted by "strong evidence that there has been no systematic change in the number of north Atlantic tropical cyclones during the 20th century." It would be interesting to see Mann's analysis run with observational data properly adjusted for undercount and short-duration storms, or at a minimum considering these factors as part of the uncertainties in the analysis.

At the minimum, the two Mann et al. studies rely on highly inconsistent assumptions, yet one analysis depends upon the other. Not good.

Over at Seed Magazine I get to be the lone voice critical of the use of offsets as a means of reducing emissions in a give and take involving five perspectives. My bottom line?

So what should the role of offsets be in international and national climate policies? The answer is as small as possible. Offsetting schemes will delay decarbonization and thus should be limited as much as possible.

Head over to Seed and see if my arguments hold up against the other four experts. I don't see a way to comment there, so feel free to come back and comment here.

I've looked into how the UNFCCC (United Nations Framework Convention on Climate Change), in response to Articles 4.1(g) and 5, has addressed the issue of climate-data exchange to date, and it turns out there's more to it than I thought. Notably, there appears to be an ongoing dialogue between the Global Climate Observing System (GCOS) and the UNFCCC Conference of the Parties (COP). GCOS has released a series of reports that consider the issues of data exchange, management and stewardship, and make clear recommendations. There has been some follow-up by the COP, but more can be done.

Following the publication in 1998 of the GCOS report "Report on the Adequacy of the Global Climate Observing System" (PDF), COP-4, in decision 14/CP.4 urged Parties to the UNFCCC to undertake free and unrestricted exchange of data to meet the needs of the Convention.

In 2003 GCOS published its "Second Report on the Adequacy of the Global Observing Systems for Climate in Support of the UNFCCC" (PDF), which states in reference to decision 14/CP.4 that "the record of many Parties in providing full access to their data is poor. Indeed, most Parties appear to be unaware of their performance in this respect." The report contains a section on data management and stewardship, which states that "the preservation of the data for future use requires facilities and infrastructure to ensure the long-term storage of the data". One of the findings of this section is "The rapidly-increasing volume of raw observations that must be saved and stored in an archive is such that the data are too often inaccessible to many users."

In response to the second report, COP-9, in decision 11/CP.9, requested Parties to review the report and to consider what actions they can take to address the findings, noting, among other things, "the importance of adhering to applicable adopted principles of free and unrestricted exchange of data and products, especially with respect to the set of Essential Climate Variables as defined in the second adequacy report."

A year later, in 2004, GCOS submitted its "Implementation Plan for the Global Observing System for Climate in Support of the UNFCCC" to the UNFCCC Subsidiary Body for Scientific and Technological Advice (SBSTA) (PDF). It also has a section on data management and stewardship, building on the second adequacy report of 2003. A draft "Progress Report on the Implementation of the Global Observing System for Climate in Support of the UNFCCC 2004-2008" is available for comment on the GCOS website (PDF). There is no section on data management and stewardship in this report.

In 2005 GCOS submitted the report "Analysis of Data Exchange in Global Atmospheric and Hydrological Networks" (PDF). The "reluctance of some countries to exchange data", and "data and metadata standardisation and data stewardship" were among the major problems and challenges identified in the report. In response, SBSTA-23 in 2005 urged Parties and invited intergovernmental organisations and international bodies "to provide active support to international data centres in their efforts to obtain permission from countries for the release of the data and the rescue of historical climate records."

The exchange and management of climate data was also discussed at an expert meeting held in the context of the Nairobi Work Programme on Impacts, Vulnerability and Adaptation to Climate Change. The report of the expert meeting (PDF) states,

"A key barrier identified in exchanging data and information, besides the fact that some data are privately held, is that the mandates of institutions holding data are not necessarily aligned with the needs of users for impacts, vulnerability and adaptation work. In this regard, WMO Resolution 40, which urges members to strengthen their commitment to the free and unrestricted exchange of meteorological and related data and products, was noted."

It also says,

"Regarding data exchange, data increase their value with use and should therefore be openly disseminated, tested, validated, documented and supported by metadata; arrangements such as the GNU General Public License (a free ‘copyleft’ licence for software and other works), which would require users to provide information on their use or modification of the data, could be explored."

Is all or any of this relevant to CRU? Yes, I think it is, in particular the response of SBSTA to the 2005 GCOS report. Presuming that CRU qualifies as an international data centre, its functioning is dependent on receiving adequate support from the UK and other Parties. However, none of the above is put in particularly strong language, and while Article 4.1(g) is an international legal commitment, the COP decisions and SBSTA report can safely be ignored by Parties. But with the publicity this debate is generating and the generally perceived increased need for climate data (not only for adaptation but also mitigation), my guess is that there will be more pressure on Parties to take Articles 4.1(g) and 5 seriously.