31 March 2011

Harold Lasswell, one of the founders of the policy movement in academia of the mid-twentieth century, once wrote that "the whole aim of the scientific student of society is to make the obvious unescapable." So it is high praise indeed to read Mark Sagoff's fine review of The Climate Fix, just out in Issues in Science and Technology, where he writes:

The great achievement of The Climate Fix is to make the obvious obvious. No small feat in these confused times.

30 March 2011

In an earlier post I made the case that one needs to know only two things about the science of climate change to begin asking whether accelerating decarbonization of the economy might be worth doing:

Carbon dioxide has an influence on the climate system.

This influence might well be negative for things many people care about.

That is it. An actual decision to accelerate decarbonization, and at what rate, will depend on many other things, such as the costs and benefits of particular actions, considerations unrelated to climate, and technological alternatives. In this post I am going to further explain my views, based on an interesting question posed in that earlier thread: what would my position be if it were shown, hypothetically, that the global average surface temperature was not warming at all, or was in fact even cooling (over any relevant time period)? Would I then change my views on the importance of decarbonizing the global energy system?

And the answer is ... no!

My concern about the potential effects of human influences on the climate system is not a function of global average warming over a long period of time, or of predictions of continued warming into the future. A point that my father often makes, and I think that he is absolutely right, is that what matters are the effects of human influences on the climate system at human and ecological scales, not at the global scale. No one experiences global average temperature, and it is very poorly correlated with the things that we do care about in specific places at specific times.

Consider the following thought experiment. Divide the world up into 1,000 grid boxes of equal area. Now imagine that the temperature in each of 500 of those boxes goes up by 20 degrees while the temperature in the other 500 goes down by 20 degrees. The net global change is exactly zero (because I made it so). However, the impacts would be enormous. Let's further say that the changes prescribed in my thought experiment are the direct consequence of human activity. Would we want to address those changes? Or would we say, ho hum, it all averages out globally, so no problem? The answer is obvious and is not a function of what happens at some global average scale, but what happens at human and ecological scales.
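The arithmetic of the thought experiment is trivial to check. A minimal sketch, using the hypothetical values from the thought experiment itself (1,000 equal-area boxes, half warming and half cooling by 20 degrees):

```python
# Thought experiment: 500 grid boxes warm by 20 degrees, 500 cool by 20.
changes = [20.0] * 500 + [-20.0] * 500

# The global average change is exactly zero by construction...
global_mean_change = sum(changes) / len(changes)

# ...but the average magnitude of local change is enormous.
mean_local_change = sum(abs(c) for c in changes) / len(changes)

print(global_mean_change)  # 0.0  -- the global average hides everything
print(mean_local_change)   # 20.0 -- huge change in every single box
```

The global mean is silent on impacts precisely because it averages away everything that happens at human and ecological scales.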

In the real world, the effects of increasing carbon dioxide on human and ecological scales are well established, and they include a biogeochemical effect on land ecosystems with subsequent effects on water and climate, as well as changes to the chemistry of the oceans. Is it possible that these effects are benign? Sure. Is it also possible that these effects have some negatives? Sure. These two factors alone would be sufficient for one to begin to ask questions about the worth of decarbonizing the global energy system. But greenhouse gas emissions also have a radiative effect that, in the real world, is thought to be a net warming, all else equal and at the global scale. However, if this effect were instead a net cooling, or even no net effect at the global scale, it would not change my views about the need to consider decarbonizing the energy system one bit. There is an effect -- or effects, to be more accurate -- and these effects could be negative.

Of course, not mentioned yet is that action to improve adaptation to climate doesn't depend at all on a human influence on the climate system, warming or cooling or whatever. Adaptation makes good sense regardless. So clearly my policy views on adaptation are largely insensitive to any issues related to global average temperature change.

The debate over climate change has many people on both sides of the issue wrapped up in discussing global average temperature trends. I understand this as it is an icon with great political symbolism. It has proved a convenient political battleground, but the reality is that it should matter little to the policy case for decarbonization. What matters is that there is a human effect on the climate system and it could be negative with respect to things people care about. That is enough to begin asking whether we want to think about accelerating decarbonization of the global economy.

Fully assessing whether accelerated decarbonization makes sense would also require asking whether there are other good reasons to pursue it. It turns out that there are many.

29 March 2011

Over the weekend, Germany's state of Baden-Württemberg saw historic election results with the long-time ruling Christian Democratic Union party being dumped by voters after 58 years in power in favor of the newly ascendant Greens.

Conventional wisdom holds that the election's dramatic results were a consequence of the Japanese nuclear crisis and Chancellor Angela Merkel's clumsy efforts in announcing a moratorium on the nuclear plant life extension that she had previously championed. I find this line of argument convincing, as is the argument about the role played by Stuttgart 21, the controversial train station project. However, not all agree.

One point is clear: with the political leadership of Baden-Württemberg, the Greens have inherited a difficult, some might say impossible, set of conflicting political realities. They promise a focus on continued economic growth and jobs (image above) and a shutdown of the state's nuclear power reactors.

The state is 45 percent owner of Energie Baden-Württemberg, or EnBW, which generates about half of its electricity from nuclear power plants. In its election platform, the Green party promised to shut down one plant immediately and the other in 2012. Both have been shut down temporarily because of a moratorium declared by Mrs. Merkel after the disaster in Japan.

It is unclear where the replacement power will come from, said Georg Zachmann, an energy specialist at Bruegel, a research organization in Brussels.

“In Baden-Württemberg there will be some very tough choices to be made,” Mr. Zachmann said. “The Greens now own assets that they do not want. It’s kind of a poison pill.”

"Getting rid of old nuclear plants means plants will run more coal and gas. That means around 70 million tonnes of extra carbon dioxide will be emitted and carbon is up on this prospect for now. The question is how much is this worth in terms of additional carbon price?" said Emmanuel Fages, analyst at Societe Generale/orbeo.

European Energy Commissioner Guenther Oettinger, who before taking his position in Brussels led the state government of Baden-Württemberg, also supported the view that coal will act as a substitute for nuclear in Germany.

28 March 2011

The heart of The Next American Economy is nine impressively thorough case studies of “clusters” of successful US businesses, mostly making innovative products such as new types of batteries and adhesives. A veteran reporter, Holstein expended a lot of shoe leather in his researches, from Massachusetts to California, and he does an excellent job of describing what he sees and letting his subjects speak for themselves.

The point that crops up with startling regularity in their stories is the importance of government, both national and local, in helping these businesses to grow. A North Carolina technology company called Protochips, for example, pays warm tribute to the efforts of state and federal government agencies in helping it to export, including “excellent” Japanese translation.

Often, the positive contribution of government comes from the Pentagon, sometimes through its lavish spending on contracts with high-tech companies, and sometimes through its own research. As he and several of the other writers point out, the Defense Advanced Research Projects Agency laid the foundations of what became the internet.

The lesson Holstein draws is that, “Whether we like it or not, the federal government is involved in the economy, and must be.” In other words, for good or ill the government has a huge influence over the economy and it would do better to use that influence effectively in pursuit of thought-out strategic goals.

This is very well stated. The essential and unavoidable nature of government involvement in processes of innovation is a point that is often missed in policy debates. A better debate starts with the acknowledgement -- like it or not -- that government has an important role in innovation. The more interesting policy questions are what the role might be in particular contexts to help steer innovation in desirable directions.

27 March 2011

Roger Bilham, a professor of geology here at Colorado and a world expert on earthquakes, is just back from Japan and has put up a report on the earthquake. The image above, from his report, shows the planet's largest earthquakes since 1900. Of the notable four-decade gap beginning in the mid-1960s, Roger writes:

The Honshu earthquake is one of 5 earthquakes in the world to have exceeded Mw=8.4 since 2004. Initial estimates of its magnitude (Mw=8.9) have now been superceded by its Mw=9.0 status. The recent 5 mega-quakes were preceded by a four decade gap that followed a cluster of megaquakes between 1950 and 1964. No significance to the gap has been established although there appears to have been a reduction in global energy release after the 1960 cluster

Of the effects of the earthquake and tsunami on the Fukushima nuclear power facility he writes:

The Fukushima Nuclear reactor successfully shut down in the Mw=9 earthquake. This must be considered a success, because shaking intensities were apparently close to Mercalli Intensity VII. Thirty minutes later a tsunami flooded the reactor buildings. In hindsight it appears impossible to believe that nuclear power stations were located on a shoreline without recognizing the engineering difficulties attending prolonged immersion by a large tsunami. In 1896 a 33 m high tsunami drowned the Sanriku coastline 200 km to the north of Fukushima. A 23 m wave surged on the same coast in 1933, and in 1993 a 30 m wave swept over Okushira Island. The Fukushima plant was protected by a 5.7 m tsunami barrier but the wave height here apparently exceeded 10 m, flooding the generators and electrical wiring in the basement and lower levels of the power plant. Nuclear power plants are simply not designed to be immersed in sea water.

A 33 m tsunami? Wow.

In his travelogue, Roger reminds us that seismologists are not like the rest of us:

I chose the 14th floor rather than the 3rd floor because being a seismologist, I wanted to really experience large aftershocks at first hand. Only the following day did I realize there was no 13th floor so I guess the 13th floor had been labeled 14. I slept through two M=5.5 events but awoke with delight to a Mw6.2 about 80 km away. Like thunder and lightning, if you count the time between the first jolt and the rolling surface waves you can gauge the distance quite well.

During the 2 am aftershock, the building heaved mightily and erratically at first, and then in the next ten seconds settled to a long swaying motion with gentle creaks of approval from the furniture. There was that uneasy feeling in the middle about whether it was going to get bigger. But no, it stopped eventually. Pretty lame sort of event in fact - no sirens, no screaming from nearby rooms. I found out later I was the sole occupant of the 13/14th floor.
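The "thunder and lightning" timing Roger describes works because P waves outrun the slower S waves, so the delay between the first jolt and the rolling motion scales with distance. A minimal sketch, using textbook crustal wave speeds (the velocities are my assumption, not figures from his report):

```python
# Rough epicentral distance from the S-minus-P arrival delay.
# Typical crustal values (an assumption): P waves ~6 km/s, S waves ~3.5 km/s.
vp, vs = 6.0, 3.5  # km/s

def distance_km(sp_delay_s: float) -> float:
    """Distance implied by a given S-P delay, like timing thunder after lightning."""
    return sp_delay_s / (1.0 / vs - 1.0 / vp)

# Roughly 10 s between the first jolt and the rolling surface waves
# works out to ~84 km, consistent with an event "about 80 km away".
print(round(distance_km(10.0)))
```

With these speeds, each second of delay corresponds to a bit over 8 km of distance, which is the usual seismologist's rule of thumb.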

Writing in Regulation (here in PDF) Jonathan Adler has a review of The Climate Fix. If I have one quibble with an otherwise fine review, it is that he ignores my discussion of obliquity in policy design when at the end he asks why we should expect policy makers to focus on what he calls "climate/energy policies." The answer of course is that they won't. Perhaps my own discussion was a bit too oblique;-)

It is a fair and thoughtful review, and I appreciate Jon taking the time to engage the analysis. Here is how Adler concludes the review:

[H]is clear-headed and non-ideological analysis is welcome in a field dominated by wild-eyed partisans and fear-mongers of various stripes. If one accepts climate change as a real threat, it is essential to acknowledge the lack of clean and easy answers. However urgent global warming may seem, policies to address it cannot be pursued to the exclusion of other concerns, including economic development and access to affordable energy sources. Understanding the depth of the challenge is not only a good place to start, it is essential for there to be any hope of success.

16 March 2011

In the face of opinion polls showing a lack of support for her proposed carbon tax, Julia Gillard today has delivered a speech that indicates that she is willing to wager her future on this issue (The speech is here in PDF). In the speech the word "carbon" appears 36 times, also appearing 36 times are the words "jobs" and "economy."

She makes clear that there is no going back:

The important thing to know is that from 1 July 2012, carbon will be priced in the Australian economy.

The journey of transformation will begin.

Friends, I chose action over inaction because of this simple truth:

If Australia does not adopt a carbon price in 2011, we probably never will.

This is the year of decision.

Action versus inaction.

Acceptance versus denial.

Setting Australia on the path to a high skill, low carbon future.

Or leaving our economy to decay into a rusting industrial museum.

That is the choice we face.

Action will protect jobs.

It is here where I think that Gillard has made a bad bet. Carbon pricing is supposed to create jobs by making fossil fuels appreciably more expensive, thereby creating a market signal that disfavors carbon-intensive industry and stimulates less carbon-intensive economic activity. The economic parts of the theory seem sound enough.

However, it is the political realities that the theory does not account for. Australia's economy is very carbon intensive (PDF). Thus, if carbon pricing were to work exactly as the Prime Minister describes, it would necessarily lead to a great deal of economic dislocation and change. Consider that meeting the 5% emissions reduction target (from 2000 levels), without relying on offsets or other tricks, implies that Australia's economy would need to become as carbon efficient as Japan's by the end of this decade. How such a profoundly disruptive transitional period would be managed is the one issue that advocates of a high carbon price have never really dealt with -- the market's invisible hand will take care of it, I guess.
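The scale of the implied change is easy to see with a back-of-envelope calculation. The GDP growth rate below is an illustrative assumption of mine, not a figure from the speech; the point is how growth compounds against a fixed emissions target:

```python
# Back-of-envelope: what a 5% cut from 2000 levels implies for carbon
# intensity (emissions per unit of GDP), assuming the economy keeps growing.
gdp_growth = 0.03          # assumed annual real GDP growth (illustrative)
years = 20                 # 2000 -> 2020
emissions_target = 0.95    # emissions at 95% of the 2000 level

gdp_factor = (1 + gdp_growth) ** years
required_intensity = emissions_target / gdp_factor  # relative to 2000 intensity

print(f"GDP grows {gdp_factor:.2f}x; carbon intensity must fall to "
      f"{required_intensity:.0%} of its 2000 level "
      f"(a {1 - required_intensity:.0%} cut)")
```

Even at a modest assumed growth rate, the economy roughly doubles over the period, so emissions per unit of GDP must fall by nearly half. That is the magnitude of transformation hiding behind a "5% cut".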

Gillard also skips over that part:

We cannot afford to be stranded with an outdated high-emissions economy.

We can’t freeze our economy in time, any more than we could lock ourselves behind tariff walls while the world changed outside.

I don’t want us to wake up in ten years time lumbered with a high carbon economy when the rest of the world has moved on and then scramble to catch up.

Our nation is well equipped to make the transition.

We have an abundance of natural resources like wind, natural gas, solar and geothermal.

For example, the highest average solar radiation per square metre of any continent in the world.

We have an agile, innovative business sector tempered by three decades of exposure to global competition.

We have a talented workforce ready to embrace the jobs of tomorrow. . .

Inaction will also cost jobs because emission-intensive economies will become uncompetitive in a low carbon world.

In the quest for comparative advantage, investment will flow towards those countries that can offer more output for fewer emissions.

Inaction will cost jobs.

Action will support jobs.

Friends, action on climate change means creating new jobs for the future.

It means saving and transforming existing jobs.

It means re-skilling workers for the future.

We will see new job opportunities in clean energy generation.

Electric and hybrid cars.

Manufacturing clean energy equipment.

Energy efficient construction and retro-fitting existing buildings.

Carbon capture and storage.

Today’s workers will find themselves in different industries and different settings.

Welders and steel workers will build and maintain large-scale solar power plants.

Plumbers and electricians will be reskilled to install solar hot water systems and solar panels.

How does one become "reskilled"? Without an explanation, many people will translate "reskilled" to mean "unemployed". The oft-stated idea that the proceeds of a carbon tax will be used to compensate those who face higher costs does not address the issue of dislocation in the economy. There is an element of "magical thinking" in the idea that transforming a national economy starts with a simple decision:

. . . clean energy will open up opportunities we are only just beginning to imagine.

Those opportunities begin with that simple but momentous decision: Putting a price on carbon.

Friends, a price on carbon is the cheapest way to drive investment and jobs.

There are only two realistic outcomes here. One is that the carbon tax proposal is scrapped. With this speech it seems highly unlikely that Gillard will be the one doing any scrapping, so it would probably come via an election or a change in leadership, such as if Kevin Rudd becomes captain of the Brisbane Broncos. The second possible outcome is that carbon pricing is watered down so far that its enactment allows Labor to claim success while limiting any actual impact of the tax on the economy. Of course, that would undercut its stated purpose -- to transform the economy.

Either way, I do not see a good outcome here for Gillard or for carbon pricing. A better strategy is the one proposed in The Climate Fix -- start with a very low carbon tax, one that is politically acceptable, and use the proceeds to invest in innovation. The carbon price would rise over time as the fruits of innovation make it politically acceptable to raise that price. I expect that Australia will soon provide (yet another) lesson in how not to try to put a price on carbon.

For all the emotive force of events in Japan, though, this is one issue where there is a pressing need to listen to what our heads say about the needs of the future, as opposed to subjecting ourselves to jittery whims of the heart. One of the few solid lessons to emerge from the aged Fukushima plant is that the tendency in Britain and elsewhere to postpone politically painful choices about building new nuclear stations by extending the life-spans of existing ones is dangerous. Beyond that, with or without Fukushima, the undisputed nastiness of nuclear – the costs, the risks and the waste – still need to be carefully weighed in the balance against the different poisons pumped out by coal, which remains the chief economic alternative.

Most of the easy third ways are illusions. Energy efficiency has been improving for over 200 years, but it has worked to increase not curb demand. Off-shore wind remains so costly that market forces would simply push pollution overseas if it were taken up in a big way. A massive expansion of shale gas may yet pave the way to a plausible non-nuclear future, and it certainly warrants close examination. The fundamentals of the difficult decisions ahead, however, have not moved with the Earth.

One of the news sites in my blogroll is EurActiv.com, which I have come to trust as a reliable source of information about goings on in Brussels and Europe. Yesterday, I relied on a news story from EurActiv related to the IPCC. It turns out that just about everything in that news story was incorrect, and that news story remains posted with incorrect information.

While it is true that bloggers are often at the mercy of conventional news outlets, we can also help to quickly identify errors and help to set them straight. So I have updated the original post (leaving it live in case there are forwarding links to it). This second post is to encourage EurActiv to correct its egregiously false news story, which I am sure will be read by far more people than visit this blog.

Carbon dioxide emissions in Germany may increase by 4 percent annually in response to a moratorium on seven of the country's oldest nuclear power plants, as power generation is shifted from nuclear power, a zero carbon source, to the other carbon-intensive energy sources that currently make up the country's energy supply.

The German government announced today that it will shut down seven of the country's seventeen nuclear power plants for an indefinite period, a decision taken in response to widespread protests and a German public increasingly fearful of nuclear power after a nuclear emergency in Japan. The decision places a moratorium on a law that would extend the lifespan of these plants, and is uncharacteristic of Angela Merkel, whose government previously overturned its predecessor's decision to phase nuclear out of Germany's energy supply.

The seven plants, each built before 1980, represent 30% of Germany's nuclear electricity generation and 24% of its gross installed nuclear capacity. Shutting down these plants, or even just placing an indefinite hold on their operation, would be a major loss of zero-emissions generation capacity for Germany. The country currently relies on nuclear power from its seventeen nuclear power plants for about a quarter of its electricity supply.
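A rough sketch of what idling the seven plants could mean for emissions follows. The nuclear shares come from the figures above; the total generation, the emissions of the replacement fossil mix, and the national emissions total are my round-number assumptions for illustration only:

```python
# Rough check of the emissions impact of idling the seven oldest plants.
nuclear_share = 0.25        # nuclear's share of German electricity (from the post)
shut_fraction = 0.30        # shut plants' share of nuclear generation (from the post)
total_twh = 600             # assumed annual German electricity generation, TWh
fossil_gco2_per_kwh = 600   # assumed emissions of the replacement fossil mix
national_mt_co2 = 800       # assumed total German CO2 emissions, Mt

replaced_twh = total_twh * nuclear_share * shut_fraction
extra_mt = replaced_twh * 1e9 * fossil_gco2_per_kwh / 1e12  # grams -> megatonnes

print(f"~{replaced_twh:.0f} TWh shifted from nuclear to fossil fuels")
print(f"~{extra_mt:.0f} Mt extra CO2, roughly {extra_mt / national_mt_co2:.0%} "
      f"of assumed national emissions")
```

Under these assumptions, a few percent of total national emissions is the right order of magnitude, broadly consistent with the 4 percent figure cited above.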

Despite Japan’s crisis, India and China and some other energy-ravenous countries say they plan to keep using their nuclear power plants and building new ones.

The Japanese disaster has led some energy officials in the United States and in industrialized European nations to think twice about nuclear expansion. And if a huge release of radiation worsens the crisis, even big developing nations might reconsider their ambitious plans. But for now, while acknowledging the need for safety, they say their unmet energy needs give them little choice but to continue investing in nuclear power.

“Ours is a very power-hungry country,“ Srikumar Banerjee, the chairman of India’s Atomic Energy Commission, said during a news conference Monday in Mumbai. Nearly 40 percent of India’s 1.2 billion people do not have regular access to electricity, Mr. Banerjee said. “It is essential for us to have further electricity generation.“

And in China, which has the world’s most ambitious nuclear expansion plans, a vice minister of environment, Zhang Lijun, said on Saturday that Japan’s difficulties would not deter his nation’s nuclear rollout.

With those two countries driving the expansion — and countries from elsewhere in Asia, Eastern Europe and the Middle East also embracing nuclear power in response to high fossil fuel prices and concerns about global warming — the world’s stock of 443 nuclear reactors could more than double in the next 15 years, according to the World Nuclear Association, an industry trade group.

UPDATE: I AM INFORMED THAT THE MATERIAL REPORTED BY EURACTIV AND REPRODUCED BELOW IS COMPREHENSIVELY WRONG. APPARENTLY MR. SAWYER IS NOT A CONTRIBUTOR TO THE IPCC AND THE REPORT DOES NOT DISCUSS NUCLEAR POWER. I HAVE UPDATED THIS POST ACCORDINGLY. THE EURACTIV NEWS STORY POSTED UP YESTERDAY REMAINS IN ERROR.

. . . Steve Sawyer, who contributed a chapter to an upcoming Intergovernmental Panel on Climate Change (IPCC) special report on managing climate disasters, which will be published in May. . .

According to Sawyer, the forthcoming IPCC report will reveal that carbon emissions from nuclear power facilities clock up between 100 and 200 grams of carbon emissions per kilowatt hour (kWh). 'Clean' gas emits around 350 grams of carbon per kilowatt hour.

But wind turbines emit no carbon when producing electricity.

One life-cycle assessment of the Vestas V90-3.0MW onshore turbine – which includes the manufacture of components – found that even here, only 4.64 grams of CO2 per kWh were created.

"Nuclear power is generally the most expensive, complicated and dangerous means ever devised by human beings to boil water," Sawyer said, summing up the anti-nuclear argument.

"Why anyone would want to use it to generate electricity is beyond me, unless they were interested - as most European states were in the early days of nuclear history - in what comes out the other end, which is fissionable material for nuclear weapons," he added.

14 March 2011

The Financial Times reports that China has reassumed its position as the top manufacturing country, measured as a proportion of global output.

China has become the world’s top manufacturing country by output, returning the country to the position it occupied in the early 19th century and ending the US’s 110-year run as the largest goods producer.

The change is revealed in a study released on Monday by IHS Global Insight, a US-based economics consultancy, which estimates that China last year accounted for 19.8 per cent of world manufacturing output, fractionally ahead of the US with 19.4 per cent.

That said the US remains far more productive:

Mark Killion, IHS’s head of world industry services, said, however, that the findings from the latest data were far from bleak for US manufacturing. “The US has a huge productivity advantage in that it produced only slightly less than China’s manufacturing output in 2010 but with 11.5m workers compared to the 100m employed in the same sector in China.”

Also, Mr Killion pointed out that much of China’s manufacturing output was driven by the Chinese subsidiaries of US companies and was based around US-derived technologies, especially in fields such as electronics.

Robert Engel, former IAEA inspector and Swiss nuclear engineer told Reuters Sunday that a partial meltdown of a reactor “is not a disaster” and that he doubted a complete meltdown is possible. And the details of the current Japanese reactor crisis bear little similarity to the Soviet-era meltdown at Chernobyl, which came about through design flaws and human error before it spread a radioactive cloud across much of Europe and Asia 25 years ago.

Experts at the IAEA “aren’t planning for the next Chernobyl” says a mid-level Western diplomat familiar with how the organization works. “But nor do [they] think we are out of the woods yet. The reactors are still hot. But this situation has no relation to Chernobyl, even though I realize that in the popular lore, if you say ‘Chernobyl,’ it means 'catastrophic meltdown.' ”

Key differences

The Chernobyl Soviet RBMK-1000 reactor exploded on April 26, 1986 after inexperienced handlers took the power down and then tried to power it up too quickly in an effort to discover whether a 40-second power gap in the cooling system could be bridged.

The Chernobyl reactor was new, it was undergoing tests, and it had very little structural containment measures to ward off a meltdown.

The Japanese reactors are a completely different design known as Boiling Water Reactors, which are old and tested, and have three quite elaborate systems of containment designed to constrain radioactive leakage, points out Josef Oehmen, a research scientist at the Massachusetts Institute of Technology (MIT) in Cambridge, Mass. “The third containment is designed, built, and tested for one single purpose: To contain, indefinitely, a complete core meltdown,” he writes.

Robin Grimes, director of the Centre for Nuclear Engineering at Imperial College London, told Reuters that the core of the Japanese reactors may be still intact.

"After it's all cooled down, it may well still be possible to simply remove the fuel and dispose of it in a relatively normal procedure," said Mr. Grimes. "What's clear, because of the incidental radiation being released at the moment, which is significant but not overwhelming, is that the structure of the core is probably still intact. So it's not as bad as Three Mile Island."

11 March 2011

I have an op-ed in today's NYT on the misperception, by some, that government has no business in setting technological standards in the marketplace -- It has and will, and to positive effect. Have a look and please feel free to come back here and discuss and debate.

Some of my NOAA colleagues here in Boulder have a new paper forthcoming in Geophysical Research Letters (PDF) looking at last summer's heat wave in Russia. They conclude:

Our analysis points to a primarily natural cause for the Russian heat wave. This event appears to be mainly due to internal atmospheric dynamical processes that produced and maintained an intense and long-lived blocking event. Results from prior studies suggest that it is likely that the intensity of the heat wave was further increased by regional land surface feedbacks. The absence of long-term trends in regional mean temperatures and variability together with the model results indicate that it is very unlikely that warming attributable to increasing greenhouse gas concentrations contributed substantially to the magnitude of this heat wave.

Such attribution, they warn, may only be a matter of time:

To assess this possibility for the region of western Russia, we have used the same IPCC model simulations to estimate the probability of exceeding various July temperature thresholds over the period 1880-2100 (Figure 4). The results suggest that we may be on the cusp of a period in which the probability of such events increases rapidly, due primarily to the influence of projected increases in greenhouse gas concentrations.

BOULDER—A detailed computer modeling study released today indicates that oil from the massive spill in the Gulf of Mexico might soon extend along thousands of miles of the Atlantic coast and open ocean as early as this summer. The modeling results are captured in a series of dramatic animations produced by the National Center for Atmospheric Research (NCAR) and collaborators.

The research was supported in part by the National Science Foundation, NCAR’s sponsor. The results were reviewed by scientists at NCAR and elsewhere, although not yet submitted for peer-review publication.

“I’ve had a lot of people ask me, ‘Will the oil reach Florida?’” says NCAR scientist Synte Peacock, who worked on the study. “Actually, our best knowledge says the scope of this environmental disaster is likely to reach far beyond Florida, with impacts that have yet to be understood.”

The computer simulations indicate that, once the oil in the uppermost ocean has become entrained in the Gulf of Mexico’s fast-moving Loop Current, it is likely to reach Florida's Atlantic coast within weeks. It can then move north as far as about Cape Hatteras, North Carolina, with the Gulf Stream, before turning east. Whether the oil will be a thin film on the surface or mostly subsurface due to mixing in the uppermost region of the ocean is not known.

During last year’s crisis involving the massive release of oil into the Gulf of Mexico, NCAR issued a much-watched animation projecting that the oil could reach the Atlantic Ocean. But detectable amounts of oil never made it to the Atlantic, at least not in an easily visible form on the ocean surface. Not surprisingly, we’ve heard from a few people asking whether NCAR got it wrong.

These events serve as a healthy reminder of a couple of things:

* the difference between a projection and an actual forecast
* the challenges of making short-term projections of natural processes that can act chaotically, such as ocean currents

What then went wrong?

First, the projection. Scientists from NCAR, the Department of Energy’s Los Alamos National Laboratory, and IFM-GEOMAR in Germany did not make a forecast of where the oil would go. Instead, they issued a projection. While there’s not always a clear distinction between the two, forecasts generally look only days or hours into the future and are built mostly on known elements (such as the current amount of humidity in the atmosphere). Projections tend to look further into the future and deal with a higher number of uncertainties (such as the rate at which oil degrades in open waters and the often chaotic movements of ocean currents).

Aware of the uncertainties, the scientific team projected the likely path of the spill with a computer model of a liquid dye. They used dye rather than actual oil, which undergoes bacterial breakdown, because a reliable method to simulate that breakdown was not available. As it turned out, the oil in the Gulf broke down quickly due to exceptionally strong bacterial action and, to some extent, the use of chemical dispersants.

Second, the challenges of short-term behavior. The Gulf's Loop Current acts as a conveyor belt, moving from the Yucatan through the Florida Straits into the Atlantic. Usually, the current curves northward near the Louisiana and Mississippi coasts—a configuration that would have put it on track to pick up the oil and transport it into open ocean. However, the current’s short-term movements over a few weeks or even months are chaotic and impossible to predict. Sometimes small eddies, or mini-currents, peel off, shifting the position and strength of the main current.

To determine the threat to the Atlantic, the research team studied averages of the Loop Current’s past behavior in order to simulate its likely course after the spill and ran several dozen computer simulations under various scenarios. Fortunately for the East Coast, the Loop Current did not behave in its usual fashion but instead remained farther south than usual, which kept it far from the Louisiana and Mississippi coast during the crucial few months before the oil degraded and/or was dispersed with chemical treatments.
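The "several dozen computer simulations" approach can be illustrated with a toy Monte Carlo sketch. This is emphatically not the NCAR model; the distance and current speeds below are invented for illustration. The point is that each run perturbs an uncertain parameter, and the projection is the resulting spread of outcomes, not any single trajectory.

```python
# Toy ensemble: why run dozens of simulations instead of one forecast?
# Each run draws a different (hypothetical) current speed; the projection
# is the spread of arrival times, not any single run.
import random

random.seed(1)                    # reproducible toy run
distance_km = 800.0               # invented spill-to-coast distance
runs = 50
arrival_days = []
for _ in range(runs):
    speed_kmday = random.gauss(40.0, 10.0)   # assumed mean and spread of current speed
    if speed_kmday > 0:                      # discard unphysical draws
        arrival_days.append(distance_km / speed_kmday)

print(f"arrival spread: {min(arrival_days):.0f} to {max(arrival_days):.0f} days "
      f"over {len(arrival_days)} runs")
```

A wide spread like this is exactly why the team described their output as a suite of possible trajectories rather than a forecast.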

The Loop Current typically goes into a southern configuration about every 6 to 19 months, although it rarely remains there for very long. NCAR scientist Synte Peacock, who worked on the projection, explains that part of the reason the current is unpredictable is that "no two cycles of the Loop Current are ever exactly the same." She adds that the cycles are influenced by such variables as how large the eddy is, where the current detaches and moves south, and how long it takes for the current to reform.

Computer models can simulate the currents realistically, she adds. But they cannot predict when the currents will change over to a new cycle.

The scientists were careful to explain that their simulations were a suite of possible trajectories demonstrating what was likely to happen, but not a definitive forecast of what would happen. They reiterated that point in a peer-reviewed study on the simulations that appeared last August in Environmental Research Letters.

So who was at fault? According to NCAR's David Hosansky, it was those dummies in the media:

These caveats, however, got lost in much of the resulting media coverage.

Another perspective is that having some of these caveats in the press release might have been a good idea.

09 March 2011

KGNU, Boulder's public radio station, is airing an extended interview with me tomorrow morning at 9 AM (Mountain time) discussing The Climate Fix. We spoke yesterday for more than 30 minutes, which they will distill down to 21 minutes. You can listen to the broadcast tomorrow here. I'll post the archive link when available.

08 March 2011

Barcelona deserved to go through. The second yellow on Van Persie was a joke indeed, but irrelevant to the outcome. Hard to believe that Nicklas Bendtner could have sent Arsenal through but mishandled Wilshere's fine effort. Here is a summary of the domination from Dirty Tackle:

Barcelona beat Arsenal 20-0 in shots, 76%-24% in possession (according to Opta) and 3-1 in actual goals over their Champions League second leg match to advance on a 4-3 aggregate score. Cesc Fàbregas launched half-joking theories about his allegiances when he set up Lionel Messi's superb first goal (above). Sergio Busquets showed Arsenal how to score against his own team with an own goal in the 53rd minute.

My father is testifying before the House Energy & Commerce Committee today in what will inevitably be a show hearing using climate scientists as props. I don't expect much new or interesting to result from the hearing -- climate policy will remain unaltered and climate science will remain excessively politicized.

Having read the testimony of the various witnesses, I noted a stark contrast between how Richard Somerville presents the role of science in policy and how my father presents it. Here is what Somerville says (PDF):

[T]he need to drastically reduce global greenhouse gas emissions is urgent, and the urgency is scientific, not political.

Mother Nature herself thus imposes a timescale on when emissions need to peak and then begin to decline rapidly. This urgency is therefore not ideological at all, but rather is due to the physics and biogeochemistry of the climate system itself. Diplomats and legislators, as well as heads of state worldwide, are powerless to alter the laws of nature and must face scientific facts and the hard evidence of scientific findings.

By contrast, here is how my father characterizes the role of science in his testimony:

Decisions about government regulation are ultimately legal, administrative, legislative, and political decisions. As such they can be informed by scientific considerations, but they are not determined by them. In my testimony, I seek to share my perspectives on the science of climate based on my work in this field over the past four decades.

The differences among the witnesses on climate science or climate policy may be less important than how they view the role of science, advocacy and democracy.

07 March 2011

[UPDATE 3/9: ClimateWire, having still not contacted me directly, has appended a correction to their article, addressing one of the three mistakes and leaving two unaddressed. Still not good. Here is the correction:

Correction: An earlier version of this article included a quote from Pielke about "egregious errors" in reference to the IPCC's findings on the melting rate of Himalayan glaciers; however, Pielke was referring instead to the IPCC's assertion of a relationship between climate change and rising costs of natural disasters.]

[UPDATE 3/8: More than 24 hours pass and not even an acknowledgement of my email to the reporter and two editors. Not good.]

I talk to people in the media a lot, and occasionally I am quoted, almost always correctly. ClimateWire has a story today from a reporter who I did not talk to and whose reporting is not so good. Here is the letter I just sent to the ClimateWire journalist:

Dear Debra-

Your article today contains several major errors in its reporting of the WSJ conference last week.

1. I did not say that the IPCC Himalayan glacier error was "egregious". I used that term to refer to the IPCC inclusion of a graph on disaster costs and climate change.

2. I did not say or imply (nor do I believe) that the glacier error or UEA emails "cast a shadow on the entire body of research showing evidence of anthropogenic climate change." I did say that the institutions of climate science were poorly prepared for dealing with the allegations of error.

3. Chris Field and I are not "frequent sparring partners." We have discussed climate issues together publicly only once before.

I spent the bulk of the time on the panel discussing the IPCC's treatment of the science of disasters and climate change and the institutional maturity of the climate science community. I find it remarkable that you ignored those issues.

That said, I am requesting that you correct the two serious misquotations of my remarks and the mischaracterization of my relationship with Chris Field. If you choose to contest this I am sure that the WSJ tape from the event can set the record straight.

Have you ever wondered how NASA estimates the costs of space flight programs? Courtesy of Glenn Butts and Kent Linton (PDF), here is how it was done for the Apollo program:

[T]he original cost [estimate] was 1.5 billion with completion targeted in 1965. The "actual" historical events went something like this. The NASA cost estimating gurus in 1961 projected an amount close to $7 Billion to do the entire program. This figure was apparently padded to $10-$12 Billion by management prior to giving that estimate to James Webb, the NASA Administrator. Mr. Webb (within hours of receiving the $10-$12 Billion figure) placed an "administrator's discount" on NASA's ability to predict costs with due precision and, by the stroke of his own pen, changed the estimate to $20 billion and submitted it to Vice President Lyndon B. Johnson. In the words of Robert Seamans Jr. (the Associate Administrator at the time), "We were aghast!" This cavalier beginning describes how Apollo's original fiscal requirements arrived at the steps of the Capitol and were subsequently blessed by Congress.

Ironically, the $20 billion amount submitted by Mr. Webb to the Vice President appeared to be a completely arbitrary and highly irregular move. In anyone's book it was a radical cost estimating maneuver to be sure. But in the end, Mr. Webb's innate business sense and the courage to follow what that sense told him validated his action. It turned out to be a leadership demonstration of profound foresight. In the end the "real cost" of Apollo ultimately surpassed Mr. Webb's $20 billion estimate with a price tag of $25.4 billion as was reported to Congress in 1973. The final program cost varies depending on what we include or exclude in the calculations, but in all instances exceeds $20 billion.
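The overrun arithmetic implied by the quoted figures is easy to check. A quick sketch, using only the numbers in the excerpt above:

```python
# Overrun ratios implied by the quoted Apollo figures (in $ billions).
engineers_estimate = 7.0    # 1961 internal NASA estimate
webb_estimate = 20.0        # Webb's figure submitted to Congress
actual_cost = 25.4          # cost reported to Congress in 1973

# Against the engineers' number, Apollo overran ~3.6x; against
# Webb's padded figure, only ~1.27x.
print(f"vs engineers: {actual_cost / engineers_estimate:.1f}x")
print(f"vs Webb:      {actual_cost / webb_estimate:.2f}x")
```

Seen this way, Webb's "administrator's discount" turned what would have been roughly a 260% overrun into a 27% one.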

Today we are submitting a very short analysis of the total costs of the Space Shuttle program, 1971-2011 -- stay tuned.

03 March 2011

When certain information proves challenging to entrenched political or ideological commitments, it can be easy for policy makers to ignore, downplay or even dismiss that information. It is a common dynamic, and one that knows no political boundaries. Global Dashboard catches the Obama Administration selectively explaining the causes for increasing world food prices:

[O]n one aspect of US food policy, there’s a deafening silence: the government’s support for corn-based ethanol. Here’s Robert Hormats, Under-Secretary of State for Economic, Energy and Agricultural Affairs, on USAID’s blog:

World food prices have been increasing over the past six months, due to weather-related production losses and strong global demand. The growing demand is fueled by rapid expansion of middle-class households in emerging markets.

Er, hello? Let’s stop by at FAO’s Food Price Index (February data out today – you guessed it, another record high). What do they think is driving cereal prices upwards?

The increase in February mostly reflected further gains in international maize prices, driven by strong demand amid tightening supplies, while prices rose marginally in the case of wheat and fell slightly in the case of rice.

In other words, this is mainly about corn. And who’s the biggest corn exporter in the world? The United States.

And where is 40% of US corn production going this year? Ethanol, for use in US car engines.

And will USAID acknowledge that this has anything at all to do with spiking food prices? Don’t hold your breath.

I am at the WSJ ECO:nomics conference this week, where tomorrow I'll be on a panel with Chris Field of the IPCC. During the opening panel last night Louis Chênevert, CEO of United Technologies, provided another telling anecdote about the massive scale of China's growth. The world elevator market is about 500,000 units per year, of which, 280,000 are installed in China and only 16,000 in the United States (this article has some details).

On the panel, Chênevert, Mark Pinto (of Applied Materials) and Zhenrong Shi (of Suntech Power Holdings) explained and agreed that, in their business areas, companies are relocating to China not because of labor costs, but rather because of scale and subsidies. Given this, my sense is that in many contexts the US cannot and should not try to compete. Think of elevators as a good example.

02 March 2011

In a report Tuesday, economists at the Pew Center on the States and the Nelson A. Rockefeller Institute of Government conclude that states increasingly overestimate their tax revenue during tough economic times. The report's authors calculate that during the depth of the latest recession in 2009, income forecasts by all 50 states overshot reality by a total $49 billion.

There is a pretty obvious fix here -- the forecasters should continue doing what they have been doing, and decision makers need only apply a bias correction based on the historical performance of the forecasts. In this case the bias correction could be a function of relevant economic variables that are correlated with past forecast performance. Such a bias correction would improve the skill of the revenue forecasts.
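A minimal sketch of that idea, with hypothetical numbers and using a constant mean bias rather than the economically conditioned correction suggested above:

```python
# Minimal bias correction sketch: estimate the average historical forecast
# error, then subtract it from new forecasts. Numbers are hypothetical.
past_forecasts = [100.0, 105.0, 98.0, 110.0]   # hypothetical revenue forecasts
past_actuals   = [96.0, 101.0, 95.0, 104.0]    # hypothetical realized revenue

# Mean bias: how much forecasts have historically overshot reality.
errors = [f - a for f, a in zip(past_forecasts, past_actuals)]
mean_bias = sum(errors) / len(errors)          # here, a 4.25 overshoot

def corrected(raw_forecast):
    """Shift a raw forecast down by the historical average overshoot."""
    return raw_forecast - mean_bias

print(corrected(120.0))  # new raw forecast, bias-corrected
```

A fuller version would regress past errors on economic indicators (say, unemployment or GDP growth) and subtract the predicted rather than the average bias, as the post suggests.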

Roger Pielke of the University of Colorado, Boulder and Michael Levi from the Council on Foreign Relations have been questioning the sufficiency of China’s efforts for years. Analysis of EIA data lends quantitative credence to their position.

Check out the article by a climate survivalist from the February 27, 2011 Washington Post. (I’m going to go out on a limb and treat the article as if it’s not a satire or hoax, but maybe the joke’s on me.) The author describes how he’s buying solar panels and generators and laying in food and supplies and putting extra locks on his doors and windows in anticipation of the coming climate apocalypse, much in the way that in the 1960s certain nuts were digging shelters in their backyard to provide protection against hydrogen bombs, and in the ‘80s (and probably to this day) right-wing crazies were building up small arsenals to protect themselves against the time when the government tried to take away their right to be bigots.

Anyway, fear of the coming apocalypse seems to be an honorable tradition among some factions of the human race, and besides in this case it’s probably good for the beleaguered economy that this guy is spending what must be lots of money on hardware, both high-tech and low. But there are some elements of climate survivalism that are truly troubling. The fact that the Washington Post chose to put this article on the front page of its Sunday opinion section is an editorial judgment that the author, who is executive director of the Chesapeake Climate Action Network, is someone whose perspective deserves to be taken seriously.

Sarewitz notices an obvious irony:

One can hardly fail to note the contrast between the standard, communitarian rhetoric of climate change advocacy on behalf of getting rid of fossil fuels – we all need to act together to save the Earth! – and the nihilistic isolationism of climate survivalism – I need to put bars on my windows to save my butt! After all, one of the big arguments that environmentalists have used about the need to stop climate change is that those who will suffer most are the little brown poor people in far-off lands who will, for instance, experience increased incidence of malaria and exposure to floods and other disasters. (Of course the fact that they are already burdened by such things in huge disproportion to the privileged minority doesn’t seem to enter into the argument). Why this hasn’t been a justification for aggressive adaptation I fail to understand (after all, the reason why the privileged minority are relatively insulated from such suffering is precisely that our societies are better adapted to many types of stresses). But I raise this point because when it comes to climate survivalism, the little brown folks are nowhere to be seen, and apparently it’s every relatively affluent white guy (and his nuclear family, of course) for himself.

While China gains accolades for its targets and results (Seligsohn and Levin 2010; Houser 2010), data analysis clearly demonstrates that a 45% reduction in carbon intensity by 2020 will be insufficient to tackle the rate at which total CO2 emissions are currently increasing in China.
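The arithmetic behind that claim is simple: emissions equal carbon intensity times GDP, so an intensity cut is overwhelmed when GDP grows fast enough. A back-of-envelope sketch (the ~9% annual growth rate is an assumed, illustrative figure for China over the 2005-2020 pledge window):

```python
# Back-of-envelope: a carbon-intensity target does not cap absolute emissions.
gdp_growth = 1.09          # assumed annual real GDP growth factor (illustrative)
years = 15                 # 2005 -> 2020, the window for the intensity pledge
intensity_cut = 0.45       # pledged reduction in CO2 per unit of GDP

gdp_factor = gdp_growth ** years                  # economy grows ~3.6x
emissions_factor = gdp_factor * (1 - intensity_cut)

# Emissions roughly double despite the 45% intensity reduction.
print(f"GDP grows {gdp_factor:.1f}x; emissions still grow {emissions_factor:.1f}x")
```

Under these assumptions, absolute emissions roughly double even with the pledge fully met, which is the point the quoted analysis is making.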

01 March 2011

The German defense minister, Karl-Theodor zu Guttenberg, has resigned following the exposure of plagiarism on a massive scale in his PhD dissertation. The figure above shows the results of a page-by-page Wiki effort to "audit" his dissertation. The black and red colors indicate text that was directly (black) or partially (red) copied from other sources. The white parts were judged OK and the blue represents the front and back matter.

Guttenberg's defense of his actions, which were supported by Chancellor Angela Merkel, sought to focus attention on those critiquing him in an effort to downplay the significance of the academic misconduct:

[Guttenberg's] first line of defense, however, appeared to be his repeated insistence that the problems with his dissertation had no bearing on his position at the head of the Defense Ministry -- an effort that Merkel herself had supported, saying last week that she had chosen Guttenberg to head the Defense Ministry and not "as a research assistant."

In his brief statement on Tuesday, Guttenberg once again seemed to blame his critics and the German media for focusing so intently on his dissertation. "If, as has been the case in recent weeks," Guttenberg said in his Tuesday statement, "the attention of the public and the media is almost exclusively focused on the person of Guttenberg and his dissertation instead of, for example, the death and injury of 13 soldiers (eds. note: three German soldiers were killed in Afghanistan last week), then it ... harms the institution I have been tasked with leading."

But on Monday it became clear that academia is furious with the way the chancellor has handled the affair. In an open letter to the chancellor, some 20,000 academics from Germany and around Europe said Merkel's support of Guttenberg was a "mockery" of all those who "contribute to scientific advancement in an honest manner."

"If the protection of ideas is no longer an important value in our society, then we are gambling away our future," the statement reads. "We do not expect gratitude for our scientific work, but we do demand ... respect. The scientific community is suffering as a result of the treatment of the Guttenberg case as a trivial offense. As is Germany's credibility."

Even so, I expect that we will again see Karl-Theodor zu Guttenberg in German politics, and Germany will then re-engage a debate over science, politics, trust and legitimacy.