When the Earth Moves

Mexico City, 1985 -- 300 collapsed buildings, up to 10,000 people dead. San Francisco, 1989 -- $5 billion in property loss, 4,000 injured, 62 dead. Calamities like these have led scientists to reevaluate what they know about building structures to withstand major quakes. Research in the wake of these earthquakes and others has shown that local geophysical factors such as basin shape and soil type play a bigger role than previously thought. Because of these factors, the same seismic wave that flattens one block can leave the next unscathed.

"Ground motion can vary significantly over short distances," says Jacobo Bielak, professor of civil and environmental engineering at Carnegie Mellon University. During the 1989 Loma Prieta quake, for instance, earth movement differed significantly at places only 30 meters apart. How is it possible to account for these factors in building design and construction? The problem of understanding ground motion in large basins is a National Science Foundation "grand challenge," and a team of CMU engineers and computer scientists, led by Bielak, in collaboration with seismologists from the University of Southern California and the National University of Mexico, is addressing it. They are developing three-dimensional computer models that realistically depict how earthquakes shake the Earth.

"Our goal," says Bielak, "is to develop the capability to predict, by computer simulation, the ground motion of large basins during strong earthquakes, and to use this capability to study the seismic response of the Greater Los Angeles Basin. If a large earthquake strikes Los Angeles, which regions will be worst stricken? Which seismic frequencies will most amplify the surface ground motion? Answers will guide development of more rational seismic provisions for building codes, leading to safer, more efficient, economical structures." Using the CRAY T3D, Bielak's team generated the most detailed simulations yet developed on seismic response in the San Fernando Valley.

Archimedes

The Los Angeles Basin is an important problem not only because of its size and large population but also because of its computational complexity. The model must incorporate information about an earthquake's source, its propagation path and the site conditions of a large sedimentary basin. "Resolving excitation frequencies of engineering interest," says Bielak's colleague Omar Ghattas, "pushes the limits of scientific computing. A big part of our work has been developing numerical algorithms that can harness the power of scalable, parallel systems like the T3D and T3E."

CMU computer scientist David R. O'Hallaron and colleague Jonathan Shewchuk developed a set of software tools, called Archimedes, that helps to automate the modeling process on parallel systems. From an input model of basin geology, Archimedes subdivides the region into computational cells. It partitions the computations for a parallel computer by assigning groups of cells to different processors in a way that minimizes communication.
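The partitioning goal can be illustrated with a toy example (this is an illustration of the idea, not the Archimedes code): when neighboring cells land on different processors, those processors must exchange data every time step, so a good partition keeps neighbors together.

```python
# Toy illustration (not Archimedes itself): assigning mesh cells to
# processors.  Communication cost grows with the number of mesh edges
# whose two cells sit on different processors ("cut" edges).

def cut_edges(edges, assignment):
    """Count edges whose endpoint cells are on different processors."""
    return sum(1 for a, b in edges if assignment[a] != assignment[b])

# A 4x4 grid of cells, numbered 0..15 row by row; edges connect
# horizontal and vertical neighbors.
edges = []
for r in range(4):
    for c in range(4):
        i = 4 * r + c
        if c < 3:
            edges.append((i, i + 1))
        if r < 3:
            edges.append((i, i + 4))

# Round-robin assignment scatters neighboring cells across processors...
round_robin = {i: i % 2 for i in range(16)}
# ...while splitting the grid into left and right halves keeps them together.
blocked = {i: 0 if i % 4 < 2 else 1 for i in range(16)}

print(cut_edges(edges, round_robin))  # 12 cut edges
print(cut_edges(edges, blocked))      # 4 cut edges
```

The blocked layout cuts only the edges along the dividing line, which is why locality-aware partitioners dramatically reduce interprocessor communication.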

Archimedes subdivided the San Fernando Valley into an unstructured mesh, with smaller cells in softer soil, where seismic wavelengths are shorter. The mesh is partitioned into 64 subdomains, indicated by color.

For its initial effort, the team focused on the San Fernando Valley, a smaller problem than the LA Basin. With input from material models developed at San Diego State University, Archimedes subdivided a region 54 by 33 kilometers in extent and 15 kilometers deep into an unstructured 3-D mesh of 77 million cells. Archimedes tailors the grid to the local wavelength, 20 times shorter in the softest soil than in the rock that surrounds the valley, so that grid spacing varies from 340 meters in the rock to 21 meters in the softest soil. Even with this wavelength-adaptive methodology, the large scale of the problem requires 13.4 million grid points, resulting in 40 million equations -- probably the largest unstructured mesh problem ever solved, says Ghattas. This adaptive approach will facilitate building models for other earthquake-prone regions, such as Mexico City, Salt Lake City and Tokyo.
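The wavelength-adaptive idea amounts to a simple rule: slow waves mean short wavelengths, which demand a fine grid. A back-of-the-envelope sketch (the wave speeds, target frequency, and points-per-wavelength below are illustrative assumptions, not the team's actual parameters):

```python
# Sketch of wavelength-adaptive grid spacing.  All parameter values are
# illustrative assumptions, not the values used in the actual model.

def grid_spacing(shear_velocity_mps, max_frequency_hz, points_per_wavelength):
    """The shortest wavelength to resolve is v/f; the grid must sample
    it with several points, so spacing = v / (f * points)."""
    wavelength = shear_velocity_mps / max_frequency_hz
    return wavelength / points_per_wavelength

soft_soil = grid_spacing(220, 1.0, 10)   # slow soil -> short wavelength -> fine grid
hard_rock = grid_spacing(3500, 1.0, 10)  # fast rock -> long wavelength -> coarse grid
print(f"soft soil: {soft_soil:.0f} m, rock: {hard_rock:.0f} m")
```

A uniform grid would have to use the soft-soil spacing everywhere, multiplying the cell count many times over; tailoring the grid to the local wavelength is what makes the problem tractable at all.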

A Simulated Aftershock

With the San Fernando Valley model, the team simulated an aftershock of the 1994 Northridge earthquake. The model propagates seismic waves through the basin, simulating motion within the basin as well as on its surface. Each simulation requires about seven hours of wall-clock time on 256 T3D processors, using 16 gigabytes of memory, to simulate 40 seconds of shaking.
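The basic pattern of such a simulation is to advance the wavefield one small time step at a time. A drastically simplified 1-D finite-difference analogue (the real model uses 3-D finite elements on an unstructured mesh; all numbers here are illustrative):

```python
# Drastically simplified 1-D analogue of the simulation loop.  The real
# model solves the elastic wave equation with 3-D finite elements; this
# sketch uses finite differences on the scalar wave equation instead.

def step_wave(u_prev, u_curr, c, dx, dt):
    """One explicit time step of the 1-D wave equation u_tt = c^2 u_xx."""
    r2 = (c * dt / dx) ** 2
    u_next = u_curr[:]  # endpoints stay fixed (zero-motion boundaries)
    for i in range(1, len(u_curr) - 1):
        u_next[i] = (2 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2 * u_curr[i] + u_curr[i - 1]))
    return u_next

n = 101
u_prev = [0.0] * n
u_prev[n // 2] = 1.0           # initial displacement pulse mid-domain
u_curr = u_prev[:]             # zero initial velocity
for _ in range(200):           # 200 time steps of simulated shaking
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, c=1.0, dx=1.0, dt=0.5)
```

Each step only touches a cell and its immediate neighbors, which is why the computation parallelizes well: a processor owning a subdomain needs data only from the cells along its borders.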

Their results reveal the tendency of the basin to respond in a complex pattern, with ground motion nine times greater at some sites than others. "Predicting damage is more complicated than just looking at peak displacements or even the complete history of the ground motion," says Bielak. "If frequencies at a point on the surface correspond to the fundamental modes of a structure, the potential for damage is greater. Knowing how simple structures with different natural frequencies would respond to the ground motion throughout the valley, one can get a much better idea of the anticipated damage." The researchers are processing and analyzing these results.
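The resonance effect Bielak describes can be sketched with the simplest possible "structure," a damped single-degree-of-freedom oscillator driven by ground acceleration (the parameter values below are illustrative assumptions, not drawn from the study):

```python
import math

# Hedged sketch of the "simple structure" idea: a damped single-degree-
# of-freedom oscillator driven by sinusoidal ground acceleration.  When
# the shaking frequency matches the structure's natural frequency, the
# response grows.  All parameter values are illustrative.

def peak_response(natural_freq_hz, ground_freq_hz, damping=0.05,
                  dt=0.001, duration=20.0):
    """Peak relative displacement of the oscillator, integrated with a
    simple explicit (semi-implicit Euler) scheme."""
    wn = 2 * math.pi * natural_freq_hz
    u, v, peak = 0.0, 0.0, 0.0
    for n in range(int(duration / dt)):
        ag = math.sin(2 * math.pi * ground_freq_hz * n * dt)
        a = -ag - 2 * damping * wn * v - wn * wn * u
        v += a * dt
        u += v * dt
        peak = max(peak, abs(u))
    return peak

resonant = peak_response(1.0, 1.0)      # shaking at the natural frequency
off_resonant = peak_response(1.0, 3.0)  # same structure, faster shaking
```

The resonant case produces a far larger peak displacement than the off-resonant one, which is why matching the simulated ground-motion frequencies against structures' natural frequencies says more about damage potential than peak displacement alone.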

"The CRAY T3D has been unique in providing the level of performance needed for the San Fernando Valley," adds Bielak. "It allows us for the first time to get to the level of detail we need to realistically simulate the frequencies that do the most damage. Our target application, the entire Greater Los Angeles Basin, is an order of magnitude larger and must be run for multiple scenarios to assess the seismic hazard. The expanded capability of the CRAY T3E gives us hope that this problem is within reach."

Animation of a simulated earthquake in the San Fernando Valley. Color depicts the peak magnitude of ground displacement. The simulation covered a 54 x 33 kilometer area, superimposed here on a satellite view of topography, to a depth of 15 km. Los Angeles can be seen in the southeast corner.