This blog provides updated forecasts and comments on current weather or other topics

Tuesday, May 15, 2012

U.S. Climate Versus Weather Computers: Climate Wins

Why does the U.S. government provide hugely more computer resources for climate prediction than for weather prediction? And why is far more emphasis given to climate prediction research than to weather prediction research and development?

In some past blogs I have talked about the unfortunate lack of computer resources available to the National Weather Service (NWS), which leaves the U.S. trailing behind many international numerical weather prediction centers. This lack of computer resources undermines the ability of the NWS to run high-resolution weather models, to move effectively into probabilistic weather prediction, to enhance hurricane prediction, and much, much more. Far smaller nations with much more benign weather, such as the United Kingdom and South Korea, possess weather supercomputing facilities that dwarf ours.
The cost to the American people of inferior weather computing resources is substantial, both in dollars and in lives.

While U.S. operational weather prediction is given inadequate computer resources, climate prediction, including studies of potential human-forced global warming, enjoys huge supercomputers with capacities hundreds of times larger than those available for weather prediction.

It is more than a little ironic that great emphasis is placed on acquiring state-of-the-art petaflop supercomputers for climate change research, while in a year with a dozen billion-dollar weather disasters, the NWS is denied the critical tools needed to protect the American people. Someone has their priorities wrong.

Let me give you a few examples. As noted in an earlier blog, the National Weather Service operational computer system has 4992 processors for a total computational capacity of 0.07 petaflops (a petaflop is one quadrillion floating-point operations per second). Keep this 0.07-petaflop number in mind.

The new 1.1 petaflop GAIA computer just acquired by NOAA

Now consider a few of the climate computers purchased by the U.S. government and associated agencies.

Let's begin with the 1.1-petaflop GAIA computer recently acquired by NOAA, a machine roughly sixteen times more capable than the NWS weather prediction computer. This machine will be dedicated to climate research (see an article here on this machine).

The National Center for Atmospheric Research, an entity mainly funded by the National Science Foundation, is now completing a 70 million dollar facility that includes the new Yellowstone computer, capable of 1.5 petaflops. This machine will be used mainly for climate research (article here). A great irony is that this machine uses HUGE amounts of power, power that will come from coal-fired power plants that emit lots of CO2.

NASA Pleiades supercomputer

Or consider the new "Discover" supercomputer acquired by NASA's Center for Climate Simulation (NCCS), with a 0.32-petaflop capacity, nearly 5 times that of the NWS forecast machine. The Discover machine is itself dwarfed by NASA's new "Pleiades" supercomputer (1.3 petaflops), which is being used for climate simulations and other NASA needs.
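To put the disparity in one place, here is a quick sanity check of the ratios implied by the capacities quoted above (the machine names and petaflop figures are those cited in this post; the script just does the arithmetic):

```python
# Peak capacities cited in the post, in petaflops (PF).
NWS_OPERATIONAL_PF = 0.07  # the NWS operational weather prediction system

climate_machines_pf = {
    "GAIA (NOAA)": 1.1,
    "Yellowstone (NCAR)": 1.5,
    "Discover (NASA NCCS)": 0.32,
    "Pleiades (NASA)": 1.3,
}

for name, pf in climate_machines_pf.items():
    ratio = pf / NWS_OPERATIONAL_PF
    print(f"{name}: {pf} PF, roughly {ratio:.0f}x the NWS machine")
```

Rounding those ratios reproduces the "sixteen times" and "nearly 5 times" figures quoted in the text.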

I could go on and on naming other supercomputers owned by the U.S. Department of Energy and others that are used for climate research--it would be a very long list. The bottom line is that the computational power available for climate simulations, for understanding and predicting climate change over the next few decades to a century, absolutely dwarfs what is available for predicting the weather and for understanding how weather systems work.

This makes no sense.

Imagine if one of these petaflop machines were made available for weather prediction. The forecast skill of U.S. global weather models could be substantially increased, providing skillful forecasts further out in time. We could run models with enough resolution to capture the fine-scale structures of hurricanes and other storms. A new age of probabilistic weather prediction could begin, with high-resolution ensembles providing uncertainty information for local weather features. The impact would be huge: hundreds of millions or billions of dollars in severe-weather losses avoided, lives protected, and better functioning of our air traffic control and highway systems. Real and profound benefits.

Now don't get me wrong. Understanding and modeling climate change is important. But there are dozens of supercomputers in the U.S. that are quite capable of this task--and remember that climate simulations don't have to run on a fixed schedule the way operational weather forecasts do. And there are many groups around the world doing the same type of global climate simulations--and quite frankly, all the better models get essentially the same results. There is vast overkill in pushing computer resources toward climate prediction while weather prediction remains a very poor cousin. And consider the fact that with all the supercomputers available for climate prediction, the uncertainty of the projections for the next century has remained essentially unchanged. The reason: adding more physics and interactions makes the models more and more complex, which adds uncertainty.

Why is this gross imbalance happening? That is something I will leave to the comment section of this blog. But it is clear that leadership in NOAA, the Department of Commerce, and other Federal agencies has let this go on too long, to the detriment of the American people. Our congressional representatives and others need to intervene. The U.S. Office of Management and Budget (OMB) needs to evaluate this situation more carefully.

And don't forget that improved weather prediction is critical for dealing with climate change.

Mankind is doing very little to stop anthropogenic global warming--we are going to do the experiment. We are ALREADY doing the experiment. Thus, adaptation will be critical and what is more important for adaptation than improved weather forecasts? If climate extremes will increase under global warming we need to be able to predict them in the short-term to protect people and property. Furthermore, there is no better way to improve climate models than to improve weather models, since essentially they are the same. You learn about model weaknesses from daily forecast errors in a way you can't do in climate predictions.

Climate Computer Support

Weather Computer Support

We can do BOTH climate and weather prediction, but more balance in resource allocation is needed.

PS: Probcast, the UW high-tech probabilistic prediction system (www.probcast.com), is back up! The software is now on my department servers, so it should be far more dependable.

PPS: My lost dog was spotted in Mountlake Terrace (the person was pretty sure about this)--near the intersection of 238th Place SW and 52nd Ave. If you live nearby, let me know if you see her! (see right panel for more information)

22 comments:

Our group at GFDL is indeed working on using our high-resolution HiRAM model, originally intended for climate simulation, for weather forecasting purposes. In particular we have been receiving funding from the Hurricane Forecast Improvement Project to use HiRAM as a next-generation hurricane model:

http://journals.ametsoc.org/doi/abs/10.1175/WAF-D-10-05015.1

We are currently doing experimental real-time ten-day global forecasts as well as seasonal predictions. It is a small fraction of Gaea's computing power, but we are doing research (hopefully!) useful for weather forecasting.

Yes, Climate Science has access to a huge amount of research resources. This is possibly because it is the most critical challenge to face humanity in its history.

Perhaps if there wasn't so much idiotic "debate" about whether it's 'real' or not, then there wouldn't be such a perceived need to predict what the climate will do in the future. It *should* be enough that we know temperatures will rise, as will sea levels and acidity levels in the ocean... but it's not. Why? Well, I would rather not get into that.

The defeatist "we are going to do the experiment" is equally disturbing, especially from someone who claims to be concerned about saving lives.

Yes, Weather prediction should have access to and would benefit immensely from some big upgrades in super-computers. But there is no need to make this an 'either/or'.

Repurpose one of those multi-petaflop supercomputers for weather prediction and we can have our cake and eat it too.

Chrisale, We can do both...but we need to rebalance where we are putting resources. We can save lives now. We can enhance economic productivity now. We know how to use more computer power to improve weather prediction today. And better forecasting enhances our ability to adapt to climate change. We are doing the climate change experiment, and quite frankly we don't need a single additional global climate run to know we have a problem. Mankind is simply not taking the actions needed to stop significant GW...this is not being defeatist, it is being a realist...cliff

My apologies Cliff if my post came off sounding more accusatory than it should have. I admit I am easily frustrated these days by this topic.

I agree that humanity is, by its inaction, doing part of the experiment.

It just sounded in the original post like you were resigning yourself and humanity to that experiment rather than at least attempting to stop or limit it before catastrophic loss of life and economy becomes inevitable.

I think we're on the same page. I was just a little taken aback by the 'us or them' tone.

I think perhaps this is a symptom of a larger problem the science community battles with... as a layman it seems to me like we spend a ton of money on research and development and other 'new' things, but we forget that the science of *now* is just as important, if not more so to the everyday person.

I am thinking specifically of the loss over the past many decades of dozens of small-city weather sensors at Environment Canada. Downsizing of local forecasting. Even the elimination of lighthouse keepers on the West and East Coast of Canada in 'favour' of automated beacons and sensors. These have all contributed to a general decrease in the breadth and quality of data available.

Well, I always predicted this moment would happen; when the debate against limiting CO2 emissions shifted from "climate change isn't real, so we shouldn't do anything about it" to "it's too late, so we shouldn't do anything about it."

Great question! Even more so when you discover that the climate models turn out to have a simple functional response to the forcings programmed into them. Willis Eschenbach made an interesting analysis of the GISS Model E last year. http://wattsupwiththat.com/2011/05/14/life-is-like-a-black-box-of-chocolates/

I say give the climate modelers desk top PCs and let the weather modelers have the supercomputers!

The point isn't how much computer power climate studies (or any other particular project) has.

It is how much would adequate facilities cost, versus how much we would save and what debacles we would avoid through more accurate forecasts.

Perhaps the point is the small increment to the expenses we are already investing in weather infrastructure. Perhaps other costs in weather forecasting have lost value, and those resources should be shifted to big computers.

The comparison of weather and climate computer costs is an unnecessarily indirect way to make your argument. Research and operational science are simply funded differently. Look at the price tag compared to the economic benefits of the big telescopes and the particle accelerators.

Do you have access to rewrite or request features of the software used to run the models? If you do, would it be possible to use scientific-grade GPUs instead of general-purpose CPU cores for all or part of the computations? Processing power per machine could be increased by 10-100x with far less equipment purchased than a cluster based on CPUs alone. Existing equipment could potentially be extended with GPU accelerators, though that may or may not be an option.

I think we can do better for both realms without making this an "us vs. them" type comparison (or for me "us vs. us" since I have my feet in both).

I think it is also unfair to assign all of NASA's, NOAA's, and NCAR's computing power to climate. That computing power is shared amongst many, many groups in both science and engineering. Climate modeling is not insignificant, but it is not the sole purpose of those supercomputers.

I also agree with a previous commenter that things like investing in better observing systems or maintaining existing networks would help both realms immensely, since assimilation of data into both weather and climate models seems to improve the quality of the forecasts.

There is some sharing amongst the weather and climate communities, but I think we need to expand on those collaborations, so that we can make advances in both communities while minimizing duplicate effort.

SKM, We are surely on the same side...and I do both climate and weather simulation. But the balance is clearly wrong...with weather prediction starved for resources and U.S. NWP dropping back to third or fourth place. Lives can be saved and the economic health of the U.S. can be aided by better numerical weather prediction...cliff

Sorry Cliff, but this is the only way to ask you this question. Could you recommend places to go on Sunday to maximize chances of seeing the solar eclipse? The Olympia forecast is mostly cloudy, 40% chance of rain. Thanks.

I agree strongly with devoting more data-crunching resources to weather modeling, if only because understanding "weather change" will give us a more complete picture on what climate change (natural or human-caused) might look like.

What is climatology? It's a way of measuring the prevailing weather conditions of a region in numerical (statistical) form. What is climate change? Basically a change in the climatological numbers. And what changes the numbers? Changes in the weather mechanics that a particular region experiences.
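That statistical view can be made concrete with a toy sketch (the temperatures below are invented purely for illustration): a climatological "normal" is just the mean over many years of weather, and a climate shift is a change in that statistic:

```python
import statistics

# Hypothetical July mean temperatures (deg F) for some region over ten years.
july_means = [64.2, 65.1, 63.8, 66.0, 64.9, 65.5, 63.9, 66.3, 65.0, 64.7]

normal = statistics.mean(july_means)   # the climatological "normal"
spread = statistics.stdev(july_means)  # year-to-year variability of the weather

# A changed climate shows up as a shift in these statistics;
# a single warm month is just an anomaly relative to the normal.
anomaly = 67.4 - normal
print(f"normal: {normal:.1f} F, spread: {spread:.1f} F, one warm July: +{anomaly:.1f} F")
```

The point of the sketch: the "climate" numbers are nothing more than summaries of the underlying weather, which is why changes in weather mechanics are what ultimately move them.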

Thus, to fully understand how CO2 and other gases change the climate, we need to understand how their infrared-absorption properties affect the thermodynamics and fluid dynamics of the atmosphere, and the kind of changes in weather regimes brought on by the extra heat absorption. You can't just say "Oh, same old weather patterns, just add 3 to 8 degrees to everything and we're good!" Different regions will see their weather patterns change in dramatic ways - ways that numbers on a climatology chart probably cannot reflect.

Great example here of how the present austerity fad reduces us to fighting like starving animals locked in a cage over a pool of dollars that is unnecessarily small. Add in a dash of irony by remembering the $4 billion+ in tax exemptions accorded US petroleum companies even as they enjoy record profits while changing our weather -and- climate.

As a respected, reasonable and widely followed meteorological researcher Cliff's every word will be seized upon to make a case for defunding climate research, but don't look for 1:1 movement of dollars into weather modelling; it's -all- "big government" and needs to be killed, as far as the tax-dodgers currently controlling public finance are concerned.

See these documents for facts on the ground w/regard to NOAA's budgets for weather and climate research:

Interesting post Cliff, most of which I agree with. If your numbers are right, it is worth pushing a little further. I feel I need to follow up on some of the comments.

To Chrisale: I hear and understand your frustration. However, I suspect that no matter how much computer resource is thrown at climate research, it won't change public policy in an appreciable way. The fundamental reason for inaction is uncertainty and how it is misunderstood (by the majority) or manipulated (by the minority on BOTH sides). Uncertainty will always exist in climate models, just like in their close cousins, weather models. No amount of computer resource is going to change that - it is a policy problem. Most decision makers get it; they just don't want to do anything because the value proposition isn't obvious to them and/or because those that lose have huge amounts of influence.

This leads me to agree with Cliff that priorities are wrong. Let me give you a poster child example...renewable energy. The RE industry is going through a huge contraction mainly because of unstable policy and because natural gas prices are obscenely low due to abundance of gas from fracking. But another important reason is that wind penetration has reached a level where the variability and uncertainty of the wind (and soon solar) matter and affect the overall economics and viability of the industry aside from making system operators very nervous about reliability. In some cases the cost is being allocated back to wind generators making it even more difficult to compete.

But capital costs of developing renewables have dropped a huge amount. If it were only possible to reduce uncertainty and predict variability, wind would look extremely attractive even against current natural gas prices. But the NWS budget has been hacked again this year, the profiler network is being retired with no replacement, our numerical models are way below par with the rest of the world, etc., etc. I even heard of the possibility of reducing the frequency of radiosonde launches, which is frankly pathetic. So accurate forecasting is getting harder, not easier.

Yet studies suggest (e.g. WFIP - Google it if interested) that with proper investment in volume observations and rapid refresh modeling, forecasts can be significantly improved. No hope of that because there is no money. So the irony is that the industry that really can do a lot to reduce our carbon footprint is running into a wall, in part because we won't invest in the cheapest way to make it work. Meanwhile an order of magnitude more resource is being channeled into proving a point which we already know to be true while not actually convincing any of those that matter.

"As a respected…Cliff's every word will be seized upon to make a case for defunding climate research…” Sadly, I think this commenter is right. Great care is needed in forums like this, especially from influential people like Cliff, to make sure nothing can be bundled into a sound bite and used out of context.

Priorities ARE wrong in my mind, and if there is a finite fund it SHOULD be spent on weather forecasting, because weather forecasting can DIRECTLY help with mitigation. Of course we should continue climate research, and some of that will involve modeling. But climate modeling is as much a part of the problem as of the solution: it inherently exposes uncertainty, and until the models themselves are improved that uncertainty can't really be reduced, so it seems ridiculous to devote such huge resources to climate modeling. The right place to improve models is in day-to-day weather prediction, because it is (a) directly useful and (b) verifiable in the short term.

If logic prevails, resources are indeed perversely allocated. But the reality is that if you take the money away from computational resources for climate research, it is very unlikely it will end up being used for weather forecasting, and in that reality I must confess I'd rather somebody got it than nobody.

Still having trouble accessing the probcast site. Hope it goes back up soon - it is by far my favorite source for a quick weather check! (especially when my day starts out with the classic Seattle question: is today a rain jacket day, or will I take my chances?)