Tapping the Earth Simulator

NCAR modelers are using the world’s fastest computer, conducting experiments at an extraordinarily fine resolution

The Earth Simulator consists of 640 processor nodes, all
connected by a 12-gigabyte-per-second network. (Photo courtesy
Earth Simulator.)

Like other scientists who model Earth's climate,
CGD's Frank Bryan has often wished for more powerful computers.
When he designs a multi-century simulation of the impacts of climate
change on the world’s oceans, he has to wait for weeks to get results
even though NCAR's supercomputers are among the fastest anywhere.

But Frank and his fellow modelers at CGD now have an even faster option.
A unique agreement between NCAR and the Japanese government has opened
the door to running NCAR’s Community Climate System Model (CCSM)
on the world’s most powerful supercomputer: Japan’s Earth
Simulator.

"We’re using the Earth Simulator to build larger ensembles
of model runs and to conduct simulations of longer periods than we could
do here," says Frank, who began using Japan’s flagship computer
for ocean simulations last year.

By running the CCSM ocean component model on the Earth Simulator,
researchers can simulate such small-scale features as the
formation of eddies and rings. In the simulation in the inset,
run at a resolution of about
110 kilometers (68 miles), the model estimates the broad-scale
movement of water around South Africa's Cape
of Good Hope. In the simulation below, run at a resolution
of about 11 kilometers (7 miles), the model captures much more
detail and indicates how heat and salt can move from the Indian
Ocean to the South Atlantic in isolated coherent structures
called Agulhas rings. (Courtesy Frank Bryan.)

The $360 million Earth Simulator, based in Yokohama, was developed jointly
by several Japanese agencies to investigate global environmental problems.
Since it began operations in 2002, it has achieved speeds of more than
35 trillion calculations per second. In a single day, it can simulate
20 to 40 years of global climate using CCSM-3, which is the latest version
of NCAR’s highly regarded global climate model.

That’s several times faster than any other existing machine. In
contrast, NCAR’s Blue Sky supercomputer—one of the world’s
fastest supercomputers with a peak speed of 8.3 trillion calculations
per second—can simulate four to five years of global climate in
a day.

This year, NCAR will use its own computers as well as the Earth Simulator
and computers at several Department of Energy centers to run CCSM-3.
By relying on several computer centers, scientists will be able
to conduct a large number of experiments. The simulations will be used
for the much-anticipated 2007 Intergovernmental Panel on Climate Change
(IPCC) report, which will provide an update on the climatic impacts of
rising greenhouse gas levels in the atmosphere.

Once the IPCC experiments are wrapped up later this year, CGD researchers
will use the Earth Simulator through 2006 to run experiments with the
atmosphere and ocean components of CCSM at an extraordinarily fine-scale
resolution: about 50 kilometers for the atmosphere and around 10 kilometers
for the ocean. This is equivalent to tracking climate at points as close
to each other as the distance between Boulder and Denver for the atmosphere,
or from the Mesa Lab to Foothills Lab for the ocean.

Researchers will be able to incorporate small-scale features, such as
mountain ranges and ocean eddies, into climate experiments of several
decades to a century long.

"The Earth Simulator is an incredible resource," says CGD
director Maurice Blackmon, who has been working with Japanese officials
on the agreement to use the Earth Simulator. "This is a chance
to take a giant step forward."

A technical challenge

Few scientists outside Japan have been able to tap the Earth Simulator
for their climate experiments. But NCAR has collaborated for years with
Japan’s nonprofit Central Research Institute of Electric Power Industry.
The collaborative work has helped Japanese utilities place
power plants where there is minimal risk of storm damage, and it has
provided input to energy policy decision makers.

This long-term collaboration evolved into a successful proposal submitted in
2002 to run CCSM simulations on the Earth Simulator. The project, funded by
the Japanese government, will benefit both NCAR scientists, who are attempting
to better predict future climate, and Japanese officials, who are trying to
determine whether climate change will alter typhoon patterns in the Pacific
Ocean or spur rising sea levels that could affect the nation’s coastal
communities.

The marriage of CCSM with the Earth Simulator, however, has come with a major
technical challenge. The Earth Simulator uses vector-based computers—a
technology that NCAR hasn’t used since its CRAY supercomputers, which
operated from the 1970s to the mid-1990s. NCAR has since moved to scalar architecture.

CGD staffers have worked with colleagues at Japan’s Central Research
Institute of Electric Power Industry, several Department of Energy labs,
and with NEC and Fujitsu software engineers in the United States and Japan
for several months to translate CCSM code from scalar to vector (a process
known as porting). Key staffers involved in that effort include Brian Eaton,
Brian Kauffman, Nancy Norton, Julie Schramm, and Mariana Vertenstein, all of
CGD. Other CGD staffers have also played a role by evaluating results on the
Earth Simulator.

Researchers are also contending with the logistics of working with a supercomputer
on another continent. Much of the data will be shipped back on tapes, which
means scientists have to tightly focus their experiments because of the limited
storage capacity of the tapes.

"It’s a little daunting," explains CGD’s Byron Boville,
who specializes in atmospheric modeling. "You have to be pretty strict
about what you’re outputting."

Twice as many experiments

Global climate models place significant demands on supercomputers because
of the trillions of calculations needed to reproduce the complexities
of Earth’s climate. CCSM, one of the world’s premier climate
models, incorporates the interactions of the atmosphere, oceans,
sea ice, and land cover. It takes into account
the effects of clouds, evaporation, and precipitation; changes in atmospheric
chemistry, volcanic eruptions, and solar output; impacts of snow cover
and plants; and much more information.

Typically, a single experiment consists of an ensemble of several CCSM
runs. If Frank, for example, wanted to detail the impact on the oceans
of an annual 1% increase of atmospheric carbon dioxide, he might create
five CCSM runs, each with different initial conditions. One run could
begin with El Niño conditions (warmer surface waters in certain
parts of the Pacific Ocean), another with La Niña conditions (cooler
surface waters), and so on. Such an ensemble would provide him with a
more complete picture of climate change than a single run.
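The ensemble approach Frank describes can be sketched in a few lines of Python. This is purely illustrative: the initial-condition labels and the `run_model` stub are hypothetical stand-ins, not CCSM’s actual interface.

```python
# Hypothetical sketch of an ensemble of climate-model runs, each
# started from different initial conditions. A real CCSM run would
# integrate the full climate state forward; run_model is only a stub.

INITIAL_CONDITIONS = [
    "el_nino",     # warmer surface waters in parts of the Pacific
    "la_nina",     # cooler surface waters
    "neutral",
    "warm_start",  # illustrative extra perturbations
    "cold_start",
]

def run_model(initial_condition, co2_growth_per_year=0.01):
    # Stub standing in for a multi-decade integration under a
    # 1% annual CO2 increase; returns a label instead of model output.
    return f"run({initial_condition}, +{co2_growth_per_year:.0%} CO2/yr)"

# Five runs with identical forcing but different starting states;
# their spread gives a fuller picture than any single run.
ensemble = [run_model(ic) for ic in INITIAL_CONDITIONS]
print(len(ensemble))  # 5
```

Averaging across the members, or examining their spread, helps separate the forced climate signal from the natural variability tied to any one starting state.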

With access to the Earth Simulator, scientists will be able to conduct
at least twice as many experiments as they could if they were limited
to NCAR’s computers.

The experiments also will be more far-reaching. Because of limited computer
capabilities, most of CGD’s IPCC model runs simulate climate change
only until 2100, when atmospheric levels of carbon dioxide may be more than
twice as high as they are now. Some experiments extend to 2200, but these
are extremely time consuming to run. Using the Earth Simulator, however,
scientists will be able to simulate climate until 2350. This will enable
them to probe “overshoot scenarios,” studying what will happen
to global climate if carbon dioxide levels start to decline.

While conducting scenarios for the IPCC report, scientists will use the
same resolution for Earth Simulator runs as for those on NCAR computers.
That resolution is 1 degree (100 kilometers, or 62 miles) for the ocean
and sea ice, and approximately 2 degrees (200 kilometers, or 124 miles)
for the atmosphere and land surface.
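As a back-of-the-envelope illustration (the arithmetic here is not from the article), a short Python sketch shows why finer grids are so much more expensive: on a latitude-longitude grid, refining the spacing tenfold in both horizontal directions multiplies the number of grid columns by a hundred, before even counting the shorter time steps a finer grid requires.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius

def km_per_degree_lat():
    # Length of one degree of latitude along a meridian.
    return math.pi * EARTH_RADIUS_KM / 180.0

def horizontal_columns(resolution_deg):
    # Crude column count on a global lat-lon grid:
    # 360 degrees of longitude by 180 degrees of latitude.
    return round(360 / resolution_deg) * round(180 / resolution_deg)

coarse = horizontal_columns(1.0)  # ~100 km grid used for the IPCC runs
fine = horizontal_columns(0.1)    # ~10 km eddy-resolving ocean grid

print(round(km_per_degree_lat()))  # 111 km per degree of latitude
print(fine // coarse)              # 100x more columns at 0.1 degree
```

The hundredfold jump in grid columns, on top of finer vertical layers and time steps, is why eddy-resolving runs were out of reach without a machine like the Earth Simulator.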

But after finalizing the IPCC scenarios, scientists next year will begin
using the Earth Simulator to produce runs with as much as 10 times the
current resolution. This will enable them to track eddies in the ocean
that, while just a few tens of kilometers across, transport energy and
salinity in ways that profoundly impact climate. The resolution of current
experiments, for example, fails to capture the water that flows through the
12-kilometer-wide (7-mile) Strait of Gibraltar and provides salinity
to much of the North Atlantic.

Researchers also are looking forward to incorporating more detailed information
about topographic features such as mountain ranges and valleys to gain
insights into regional climate. At present, the CCSM rounds off the height
of Washington’s Olympic Mountains to about 5,000 feet. At fine-scale
resolution, the model would more closely incorporate the height of the
mountain range, which reaches nearly 8,000 feet.

This would enable researchers to more correctly capture the amount of
precipitation that falls as snow as opposed to rain—thereby allowing
them to estimate spring and summer water supply from melting snow.

"The Earth Simulator will allow us to run a coupled climate model
at a higher resolution than any current computer in the United States,"
Maurice says.

In time, researchers also hope to couple NCAR’s Whole Atmosphere
Community Climate Model (WACCM) with CCSM and run them together on the
Earth Simulator. Since WACCM simulates higher levels of the atmosphere,
this would enable scientists to gain new insights into the impacts of
upper-atmospheric chemistry and solar variations on global climate.

Frank is already starting to see results from the Earth Simulator because
the ocean part of CCSM proved relatively easy to port from a scalar-based to a
vector-based system. He’s conducted preliminary runs at fine-scale
resolution, comparing them with actual observations. He’s satisfied
with the model’s performance so far.

"We do seem to do a good job representing the open ocean eddy fields,"
he says. "Where we still have problems is fast-moving currents
along the coast. On balance, it’s quite a good simulation."
•David Hosansky