Lessons from the 1994 Northridge Quake

Current Earthquake Research at Caltech

Since the magnitude 6.7 Northridge earthquake 20 years ago (January 17, 1994), researchers at the California Institute of Technology (Caltech) have learned much more about where earthquakes are likely to happen, and how danger to human life and damage to property might be mitigated when they do occur.

"The Northridge quake really heralded the beginning of a new era in earthquake research, not only in southern California, but worldwide," says Michael Gurnis, John E. and Hazel S. Smits Professor of Geophysics, and director of the Seismological Laboratory at Caltech.

In the years just prior to the Northridge earthquake, Caltech had launched a program called TERRAscope, supported by the Whittier Foundation, which placed high-quality seismic sensors near where earthquakes occur. The Northridge earthquake was, in effect, the first test of TERRAscope: Caltech scientists could infer the distribution of an earthquake rupture on subsurface faults and directly measure the associated ground motion with greater accuracy than ever before. "With a modern digital seismic network, the potential of measuring ground shaking in real time presented itself," says Gurnis. "The real time view also gave first responders detailed maps of ground shaking so that they could respond to those in need immediately after a quake," adds Egill Hauksson, senior research associate at Caltech.

To give us this new view of earthquakes, Caltech collaborated with the U.S. Geological Survey (USGS) and the California Geological Survey to form TriNet, through which a vastly expanded network of instrumentation was put in place across southern California. Concurrently, a new network of continuously operating GPS stations was permanently deployed by a group of geophysicists under the auspices of the Southern California Earthquake Center, funded by the USGS, NASA, NSF, and the Keck Foundation. GPS data are used to measure relative motions as small as 1 millimeter per year between any two stations, making it possible to track motions during, between, and after earthquakes. Similar and even larger networks of seismometers and GPS sensors have since been deployed across the United States, most notably EarthScope, supported by the NSF, and around the world by national agencies, such as the networks deployed by the Japanese government.
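To see how a millimeter-per-year motion is extracted from GPS records, consider the following sketch. It is an illustration, not the actual processing chain used by these networks: a synthetic five-year daily position series for a station assumed to move at 3 mm/yr is generated with 1 mm of measurement noise, and a linear least-squares fit recovers the rate.

```python
import numpy as np

# Hypothetical example: estimate a GPS station's secular velocity from a
# synthetic daily position time series (values and noise level are assumptions
# for illustration only).
rng = np.random.default_rng(0)
t_years = np.arange(0, 5, 1 / 365.25)            # five years of daily solutions
true_rate_mm_per_yr = 3.0
positions_mm = true_rate_mm_per_yr * t_years + rng.normal(0.0, 1.0, t_years.size)

# Fit position = rate * t + offset by linear least squares.
rate_mm_per_yr, offset_mm = np.polyfit(t_years, positions_mm, 1)
print(f"estimated rate: {rate_mm_per_yr:.2f} mm/yr")
```

Averaging over thousands of daily solutions is what lets a long-running station resolve rates far smaller than the scatter of any single day's measurement.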

Initially, says Gurnis, there were not many large earthquakes to track with the new dense network of broadband seismic instruments and GPS devices. That all changed in December 2004 with the magnitude 9.3 earthquake and resulting tsunami that struck the Indian Ocean off the west coast of Sumatra, Indonesia. Quite abruptly, Caltech scientists had an enormous amount of information coming in from instrumentation previously deployed in Indonesia by the Caltech Tectonics Observatory with support from the Gordon and Betty Moore Foundation. By the time the magnitude 9.0 Tohoku-Oki earthquake struck northern Japan in 2011, the Seismological Laboratory at Caltech had developed greatly expanded computing power capable of ingesting massive amounts of seismic and geodetic data. Within weeks of the disaster, a team led by Mark Simons, Caltech professor of geophysics, had used data from GPS networks installed by the Japanese government to produce extensive measurements of ground motion, as well as earthquake models constrained by these data, that provided new insight into the mechanics of plate tectonics and fault ruptures.

The Tohoku-Oki earthquake was unprecedented: scientists estimate that over 50 meters of slip on the subsurface fault occurred during the devastating earthquake. Currently, scientists at Caltech and the Jet Propulsion Laboratory are prototyping new automated systems for exploiting the wealth of GPS and satellite imaging data to rapidly provide disaster assessment and situational awareness as events occur around the globe. "We are now at a juncture in time where new observational capabilities and available computational power will allow us to provide critical information with unprecedented speed and resolution," says Simons.

Earthquakes are notable—and, for many, particularly upsetting—because they have always come without warning. Earthquakes do in fact happen quickly and unpredictably, but not so quickly or unpredictably that early warning is impossible. In a Moore Foundation-supported collaboration with UC Berkeley, the University of Washington, and the USGS, Caltech is developing a prototype early-warning system that may provide seconds to tens of seconds of warning to people in areas about to experience ground shaking, and minutes of warning to people potentially in the path of a tsunami. Japan invested heavily in an earthquake early-warning system after the magnitude 6.9 Kobe earthquake, which occurred on January 17, 1995, the one-year anniversary of the Northridge earthquake, and that system performed well during the Tohoku-Oki earthquake. "It was a major scientific and technological accomplishment," says Gurnis. "High-speed rail trains slowed and stopped as earthquake warnings came in, and there were no derailments as a result of the quake."
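Where those seconds to tens of seconds come from can be seen with simple arithmetic: fast but weak P waves outrun the slower, more damaging S waves, and an alert sent at network speed when the P wave is detected arrives well before the strong shaking. The sketch below is a back-of-the-envelope illustration, not the actual warning system; the wave speeds are typical crustal values assumed for the example.

```python
# Assumed typical crustal wave speeds (illustrative values, not the
# parameters of any real early-warning system).
VP_KM_S = 6.0   # P-wave speed
VS_KM_S = 3.5   # S-wave speed

def s_minus_p_seconds(distance_km: float) -> float:
    """Time gap between P-wave and S-wave arrival at a given distance."""
    return distance_km / VS_KM_S - distance_km / VP_KM_S

for d in (20, 100, 300):
    print(f"{d:3d} km from the epicenter: ~{s_minus_p_seconds(d):.0f} s of potential warning")
```

The gap grows with distance, which is why warning times range from a few seconds near the epicenter to tens of seconds farther away, and why sites directly above the rupture can get little or no warning at all.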

Closer to home, Caltech professor of geophysics Robert Clayton has aided local earthquake detection by distributing wallet-sized seismometers to residents of the greater Pasadena area to keep in their homes. The seismometers connect via USB to each resident's computer, which must remain on at all times. The data from these seismometers serve two functions: they record seismic activity on a detailed block-by-block scale, and, in the event of a large earthquake, they can help identify the areas that are hardest hit. One lesson of the Northridge earthquake was that serious damage can occur far from an earthquake's epicenter. A dense web of seismometers could help first responders find the worst-affected areas more quickly after an earthquake strikes.
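One classic way a small, low-cost sensor can flag the onset of shaking is the short-term-average over long-term-average (STA/LTA) trigger: when the amplitude in a short recent window jumps well above the long-running background level, an event is declared. The sketch below is a generic illustration of that standard technique, not the actual detection logic of the community sensors described above.

```python
import numpy as np

def sta_lta(signal: np.ndarray, n_sta: int, n_lta: int) -> np.ndarray:
    """Causal ratio of short-window to long-window mean absolute amplitude."""
    abs_sig = np.abs(signal)
    csum = np.concatenate(([0.0], np.cumsum(abs_sig)))
    ratio = np.zeros_like(abs_sig)
    for i in range(n_lta, len(abs_sig) + 1):
        sta = (csum[i] - csum[i - n_sta]) / n_sta   # recent short window
        lta = (csum[i] - csum[i - n_lta]) / n_lta   # longer background window
        ratio[i - 1] = sta / max(lta, 1e-12)
    return ratio

# Synthetic trace: quiet background noise with a burst of shaking in the middle.
rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 2000)
trace[1000:1200] += rng.normal(0.0, 1.0, 200)

ratio = sta_lta(trace, n_sta=20, n_lta=500)
print("triggered:", bool(ratio.max() > 3.0))
```

Because the ratio compares each sensor against its own background, the same trigger works in a quiet basement and next to a busy street, which matters when instruments sit in ordinary homes.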

Caltech scientists have also been playing a leading role in the large multi-institutional Salton Seismic Imaging Project. The project is mapping the San Andreas fault and discovering additional faults by setting off underground explosions and underwater bursts of compressed air and then measuring the transmission of the resulting sound waves and vibrations through sediment. According to Joann Stock, professor of geology and geophysics at Caltech, knowing the geometry of faults and the composition of nearby sediments informs our understanding of the types of earthquakes that will occur in the future, and the reaction of the local sediment to ground shaking.

In addition, Caltech scientists have learned much through simulating—via both computer modeling and physical modeling techniques—how earthquakes occur and what they leave in their aftermath.

Computer simulations of how buildings respond during earthquakes recently allowed Thomas Heaton, professor of engineering seismology, and John Hall, professor of civil engineering, both at Caltech, to estimate the decrease in building safety caused by defective welds in steel-frame structures, a problem identified after the Northridge earthquake. The researchers simulated the behavior of 6- and 20-story building models in a variety of potential earthquake scenarios created by the Southern California Earthquake Center for the Los Angeles and San Francisco areas. The study showed that defective welds make a building significantly more susceptible to collapse and irreparable damage, and also found that stiffer, higher-strength buildings perform better than more flexible, lower-strength designs.

Caltech professor of mechanical engineering and geophysics Nadia Lapusta recently used computer simulations of numerous earthquakes to determine what role "creeping" fault slip might play in earthquake events. It has been known for some time that, in addition to the rapid displacements that constitute earthquakes, land also slips very slowly along fault lines, a process that was thought to arrest an incoming earthquake rupture. Instead, Lapusta's models show that these supposedly stable segments may become seismically active in an earthquake, accelerating and even amplifying its motion. Lapusta hypothesizes that this was one factor behind the severity of the 2011 Tohoku-Oki earthquake. Taking advantage of advances in computer modeling, Lapusta and her colleague Jean-Philippe Avouac, Earle C. Anthony Professor of Geology at Caltech, have created a comprehensive model of a fault zone, encompassing both its earthquake activity and its behavior in seismically quiet times.

Physical modeling of earthquakes is carried out at Caltech via collaborative efforts between the Divisions of Geological and Planetary Sciences and of Engineering and Applied Science. A series of experiments conducted by Ares Rosakis, the Theodore von Kármán Professor of Aeronautics and Mechanical Engineering, and collaborators including Lapusta and Hiroo Kanamori, the John E. and Hazel S. Smits Professor of Geophysics, Emeritus, used polymer plates to simulate land masses. Stresses were then applied at various angles to the fault lines between the plates to set off earthquake-like activity. The motion in the polymer plates was measured by laser vibrometers while a high-speed camera recorded the movements in detail, yielding unprecedented data on the propagation of seismic waves. The researchers learned that strike-slip faults like the San Andreas may rupture in more than one direction (it was previously believed that these faults had a preferred direction), and that in addition to sliding along a fault, ruptures may occur in a "self-healing" pulselike manner in which a seismic wave "crawls" down a fault line. A third study showed that whether a fault fails in a classic cracklike sliding rupture or in a pulselike rupture depends on the angle at which compressive forces strike the fault.

"Northridge was a devastating earthquake for Los Angeles, and there was a massive amount of damage," Gurnis says. "But in some sense, we stepped up to the plate after Northridge to determine what we could do better. And as a result we have ushered in an era of dense, high-fidelity geophysical networks on top of hazardous faults. We've exploited these networks to better understand how earthquakes occur, and we've pushed the limits such that we are now at the dawn of a new era of earthquake early warning in the United States. That's because of Northridge."