Researchers cannot determine exactly when and where an earthquake will occur. But a new study that harnesses supercomputer power accounts for the unique characteristics of the region's faults, allowing seismologists to better understand the threats Southern California might face.
Large earthquakes are infrequent, and we simply haven't seen such quakes on most California faults, says Kevin Milner, a computer scientist at the Southern California Earthquake Center and lead author of the new study.
The fact that most faults in California have not hosted a large damaging earthquake since modern records have been kept, says Milner, leaves researchers "to infer what types of earthquakes we think are possible on those faults." This uncertainty creates challenges for hazard assessment and planning.
Hazard Assessment
Standard hazard assessment, Milner says, is empirically based: what scientists know about future earthquakes is extrapolated from observations of past events. But those empirical models depend on data pooled from seismically active areas across the globe.
Because they are not specific to the region, they may overestimate or underestimate an area's hazard, overlooking factors unique to its faults and geology.
The researchers note that several previous studies have used hybrids of empirical and physics-based models, which are built on an interpretation of physical processes and take into account both region-specific and general data. Milner and his collaborators took a new tack, he says: their model uses a purely physics-based approach.
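To make the contrast concrete, here is a minimal, purely illustrative Python sketch of the empirical side: a toy ground-motion attenuation relation whose coefficients would, in a real model, be regression-fitted to recordings pooled from seismically active regions worldwide. The function name and coefficient values are hypothetical and are not drawn from the study.

```python
import math

# Toy empirical ground-motion relation (hypothetical coefficients).
# Real empirical models are regression fits to thousands of recordings
# pooled from seismically active regions around the world.
def toy_empirical_pga(magnitude: float, distance_km: float) -> float:
    """Rough peak ground acceleration estimate in g (illustrative only)."""
    log_pga = -2.5 + 0.5 * magnitude - 1.3 * math.log10(distance_km + 10.0)
    return 10.0 ** log_pga

# Example: estimated shaking 20 km from a magnitude 7.0 earthquake.
print(f"Estimated PGA: {toy_empirical_pga(7.0, 20.0):.2f} g")
```

A physics-based approach, by contrast, simulates rupture on the region's actual fault geometry and propagates the resulting seismic waves through a model of the local crust, which is what demands the supercomputing resources described below.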
Computing Prowess
The calculations required enormous computational power, and the team turned to two of the world's largest supercomputers to complete them.
The first step, generating 714,516 years of simulated earthquakes, ran for about four days on more than 3,500 processors of the Frontera supercomputer at the Texas Advanced Computing Center, Milner says.
The second step, simulating the ground motions resulting from all those earthquakes, ran on Summit at the Department of Energy's Oak Ridge National Laboratory and took about the same amount of time, Milner says.
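As a rough, hypothetical illustration of that two-step pipeline (not the team's actual codes), the sketch below generates a small synthetic earthquake catalog with a Gutenberg-Richter magnitude distribution and then estimates shaking at a single site for every event. In the real study, the catalog comes from physics-based rupture simulations on Frontera and the shaking from full wave-propagation simulations on Summit; the simple attenuation formula here is only a stand-in.

```python
import math
import random

# Minimal stand-in for a ground-motion estimate (hypothetical coefficients;
# the study uses full physics-based wave-propagation simulations instead).
def rough_pga(magnitude: float, distance_km: float) -> float:
    return 10.0 ** (-2.5 + 0.5 * magnitude - 1.3 * math.log10(distance_km + 10.0))

# Step 1 (illustrative): a synthetic catalog whose magnitudes follow a
# Gutenberg-Richter distribution (b-value ~ 1.0), standing in for the
# 714,516-year physics-based rupture catalog.
def synthetic_catalog(n_events: int, m_min: float = 5.0, b_value: float = 1.0,
                      seed: int = 42):
    rng = random.Random(seed)
    return [
        {
            "magnitude": m_min - math.log10(1.0 - rng.random()) / b_value,
            "distance_km": rng.uniform(5.0, 200.0),  # hypothetical site distance
        }
        for _ in range(n_events)
    ]

# Step 2 (illustrative): estimate shaking at one site for every simulated
# event, then count how often a damaging level (here 0.2 g) is exceeded.
catalog = synthetic_catalog(100_000)
exceedances = sum(rough_pga(e["magnitude"], e["distance_km"]) > 0.2 for e in catalog)
print(f"{exceedances} of {len(catalog)} simulated events exceed 0.2 g at the site")
```

Counting how often a given shaking level is exceeded across a long simulated catalog is the kind of statistic from which hazard estimates are built.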
Conclusion
The researchers did not draw firm conclusions about how hazard plans should change, Milner says, citing the need for further studies.
The study indicates that researchers can not only generate simulated quakes using a physics-based approach but also use those quakes to model the ground motions that guide hazard planning. The results are consistent with those from empirical approaches, Milner notes, indicating that the current paradigm is producing reliable results.
Huge Leaps
As a computational seismology team, the group is "doing something really on the furthest edge that not only they but we can go to," says Marta Pienkowska, a postdoctoral researcher in the Department of Earth Sciences at ETH Zurich, who was not involved in the study.
The research team agrees that much more analysis is required before the study can inform or change hazard assessments.
"This was a significant development, a proof of concept demonstrating that this sort of model can operate [and that it] can create ground movements that are compatible with our best scientific models," Milner says, "and now it is time to really dig in and vet it and build in some of our uncertainties." Such uncertainties include geometries of faults that are not well defined far below the surface, Milner says."
For more news about natural calamities, don't forget to follow Nature World News!