Abstract

The amount of seismic energy radiated from an earthquake is a key macroscopic parameter for understanding the physics of earthquakes. To resolve the important details of the earthquake process, we require more accurate estimates of energy than are currently available. In this study, we determine the energy radiated from the Mw 7.1 16 October 1999 Hector Mine, California, earthquake using regional data from 67 TriNet stations. Earlier estimates of radiated energy from regional data used empirical path and station attenuation corrections. Here, we remove the path and station attenuation effects by using an empirical Green's function deconvolution. We use one foreshock and four aftershocks as empirical Green's functions and determine the source spectra. The radiated energy at each of the regional stations is computed from these source spectra. The energy estimates from the regional data are tightly clustered, with a mean estimate of 3.0 × 10^15 J and a standard deviation of 0.9 × 10^15 J. To calibrate the teleseismic methods currently used, we compare the energy estimates obtained above with the energy computed using two different teleseismic methods. The first method is based on the conventional National Earthquake Information Center (NEIC) method, in which the energy flux at each station is computed by squaring and integrating the corrected velocity spectrum. In the second method, we compute Green's functions for the appropriate source structure and deconvolve these from the mainshock data to obtain the source spectrum at each station. The energy is then calculated from the source spectra. The teleseismic energy estimates have a mean of 1.8 × 10^15 J and 2.0 × 10^15 J for the two methods, respectively. The average estimate of radiated energy from teleseismic data is nearly the same as that obtained from the regional data (3 × 10^15 J).
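The NEIC-style flux computation described above amounts to squaring the corrected velocity spectrum and integrating over frequency. A minimal numerical sketch of that step in Python, using a synthetic omega-squared source spectrum with hypothetical corner frequency and amplitude (the scaling by density, wave speed, and radiation-pattern corrections used in real energy estimates is omitted):

```python
import numpy as np

# Synthetic Brune-type (omega-squared) source spectrum; fc and omega0
# are hypothetical values, not parameters from the Hector Mine study.
f = np.linspace(0.01, 10.0, 5000)            # frequency, Hz
fc = 0.1                                     # corner frequency, Hz (assumed)
omega0 = 1.0                                 # low-frequency spectral level (assumed)
disp_spec = omega0 / (1.0 + (f / fc) ** 2)   # displacement spectrum
vel_spec = 2.0 * np.pi * f * disp_spec       # velocity spectrum |v(f)|

# Energy flux is proportional to the integral of the squared velocity
# spectrum over frequency (trapezoidal rule here).
flux = np.trapz(vel_spec ** 2, f)
print(flux)
```

The same integral is applied in both teleseismic methods; they differ only in how the source spectrum is obtained (empirical attenuation corrections versus deconvolution of computed Green's functions).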
From the mean radiated energy (3 × 10^15 J) and moment (6 × 10^19 N m) estimates, the energy-to-moment ratio for the Hector Mine earthquake is 5 × 10^-5.
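The quoted ratio follows directly from the two mean estimates; a one-line check:

```python
# Energy-to-moment ratio from the abstract's mean estimates.
energy = 3.0e15   # radiated energy, J
moment = 6.0e19   # seismic moment, N m
ratio = energy / moment
print(ratio)      # 5e-05
```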