Earthquake simulations: can uncertainties be quantified more accurately?

09.14.2017

How should the sensitivity and variability of seismograms be addressed in terms of uncertainty? In other words, how can uncertainties be quantified more accurately in earthquake prediction models? The BRGM is making groundbreaking contributions to an international collaborative project at the intersection of four disciplines: seismology, high-performance computing, uncertainty quantification and Big Data.

With the advent of high-performance computing, numerical seismogram predictions have developed rapidly, improving seismic risk assessments and reducing the risks posed by future earthquakes. Long hampered by the difficulty of quantifying uncertainties in modelling work that demands very high computing power to process huge volumes of data (up to 140 terabytes), scientists are now opening up new possibilities.

When supercomputers team up with Big Data...

Software tools were already available for predicting the intensity of subsoil movements. EFISPEC3D, for example, used by the BRGM since 2008, is an earthquake simulator based on the spectral finite element method that calculates the intensity of ground movements in complex geological environments (data on velocity and acceleration over time).
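By way of illustration only (this is not EFISPEC3D code, and the damped-sinusoid signal below is entirely hypothetical), intensity measures such as peak ground acceleration (PGA) and peak ground velocity (PGV) can be extracted from a simulated acceleration time series in a few lines:

```python
import numpy as np

def intensity_measures(acc, dt):
    """Peak ground acceleration and velocity from an acceleration trace.

    acc : acceleration samples in m/s^2
    dt  : time step in seconds
    """
    pga = np.max(np.abs(acc))
    vel = np.cumsum(acc) * dt          # crude numerical integration to velocity
    pgv = np.max(np.abs(vel))
    return pga, pgv

# Hypothetical synthetic seismogram: a damped sinusoid standing in for
# a simulator output trace.
dt = 0.01                              # 100 Hz sampling
t = np.arange(0.0, 20.0, dt)
acc = 2.0 * np.exp(-0.3 * t) * np.sin(2 * np.pi * 1.5 * t)

pga, pgv = intensity_measures(acc, dt)
print(f"PGA = {pga:.3f} m/s^2, PGV = {pgv:.3f} m/s")
```

In practice a real post-processing chain would filter the trace and use a more careful integration scheme; the sketch only shows where intensity measures come from.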

But a major difficulty remained: how to process huge masses of data, such as a detailed 3D geological model covering a 50 x 50 x 30 km volume, when ground movements have to be predicted at a resolution of just 10 km? This complexity hampered the quantification of uncertainties and hence the accuracy of predictions. Within this complexity, could the treatment of the uncertainties inherent in modelling be better understood and improved through high-performance numerical simulations?

Thanks to our work on teaming up supercomputers with Big Data, this difficulty has now been overcome. Big Data is already widely used to digitally process very large volumes of unstructured data, while supercomputers are fundamental to scientific research. The approach combining the two in predictive geoscience is highly innovative.

These studies should improve the quantification of uncertainties in ground-movement predictions, thanks to the fast processing and analysis of simulation results from EFISPEC3D.
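The underlying idea can be sketched with a Monte Carlo experiment: sample the uncertain inputs many times, run the model on each sample, and summarise the spread of the outputs. The toy attenuation relation and all its coefficients below are invented for illustration, standing in for a full EFISPEC3D run:

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_ground_motion(magnitude, distance_km, site_vs30):
    """Toy attenuation relation standing in for a full 3D simulation.

    Returns a PGA-like value in m/s^2; the coefficients are illustrative
    only, not a published ground-motion model.
    """
    return np.exp(0.9 * magnitude - 1.2 * np.log(distance_km + 10.0)
                  - 0.3 * np.log(site_vs30 / 760.0)) / 10.0

# Uncertain inputs: site shear-wave velocity and source-site distance
# (hypothetical distributions).
n = 10_000
vs30 = rng.normal(400.0, 60.0, n)      # m/s
dist = rng.uniform(15.0, 25.0, n)      # km

pga = toy_ground_motion(6.0, dist, vs30)

mean = pga.mean()
lo, hi = np.percentile(pga, [5, 95])   # 90% uncertainty band
print(f"mean PGA ~ {mean:.3f} m/s^2, 90% band [{lo:.3f}, {hi:.3f}]")
```

At the scale of the project, each "model run" is a full spectral-element simulation rather than a one-line formula, which is why 200 000 cores and Big Data tooling become necessary.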

This 3-year project, led by the BRGM, began in 2014. The BRGM, Intel France, the King Abdullah University of Science and Technology (KAUST) and the French Earth Sciences Institute (ISTerre) are conducting a collaborative programme to improve the quantification of uncertainties. This involves several disciplines: seismology, computer science and uncertainty studies.

200 000 processor cores

Research engineers from the four organisations pooled their know-how to enrich the methodologies used to predict earthquake risks through closer control of the sources of uncertainty in their models.

The Shaheen II supercomputer at the KAUST Extreme Computing Research Center has 200 000 processor cores that can handle a huge number of simulations. Intel specialists are fine-tuning the BRGM's algorithms in the open-source EFISPEC3D software, while ISTerre is designing detailed, realistic seismological models to feed the simulations.

Numerous results are expected: better quantification of seismic movements, more reliable predictions for use by the authorities, and a clearer picture of which uncertain parameters the results are most sensitive to. The first ground-movement maps are already available.
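One simple way to rank uncertain parameters by influence, sketched here with the same kind of toy model (all coefficients and input distributions are hypothetical, not the project's actual method), is to sample the inputs jointly and correlate each one with the simulated output:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(magnitude, distance_km, vs30):
    """Toy PGA-like model (illustrative coefficients only)."""
    return np.exp(0.9 * magnitude - 1.2 * np.log(distance_km + 10.0)
                  - 0.3 * np.log(vs30 / 760.0)) / 10.0

# Jointly sample the uncertain inputs (hypothetical distributions).
n = 5_000
inputs = {
    "magnitude": rng.normal(6.0, 0.2, n),
    "distance_km": rng.uniform(10.0, 40.0, n),
    "vs30": rng.normal(400.0, 80.0, n),
}
pga = toy_model(inputs["magnitude"], inputs["distance_km"], inputs["vs30"])

def ranks(x):
    """Rank-transform a sample (double argsort)."""
    return np.argsort(np.argsort(x))

# Rank-correlation (Spearman) strength of each input with the output.
ranking = {
    name: abs(np.corrcoef(ranks(x), ranks(pga))[0, 1])
    for name, x in inputs.items()
}
for name, score in sorted(ranking.items(), key=lambda kv: -kv[1]):
    print(f"{name:12s} sensitivity score {score:.2f}")
```

With these toy distributions, source-site distance dominates the output variance; variance-based indices (e.g. Sobol indices) would be the heavier-duty alternative for a real study.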

This project holds out considerable promise, as the combination of Big Data and supercomputers opens up prospects for many other applications, including for coastal, climate and water resource issues.