By Paul Johnson
The only thing we know for sure about earthquakes is that one will happen again, somewhere on the planet, very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes--and when.
Millions of earthquakes shake the globe every year, the ground suddenly lurching in response to movements of the tectonic plates that form Earth's crust. These plates jostle over, under, and against each other as they shift around the globe. All that shoving and grinding builds up stress along faults--fractures or breaks in the rock of the crust--until something has to give: an earthquake.
On January 12, 2010, an earthquake and tsunami struck near Port-au-Prince, Haiti. According to official estimates, 222,570 people were killed, 300,000 injured, and 1.3 million displaced, while 97,294 houses were destroyed and 188,383 damaged in the Port-au-Prince area and across much of southern Haiti.
The science of seismology seeks to understand what causes earthquakes by tracking their occurrence, measuring their force, and using sophisticated imaging technology to probe the subsurface geology where they happen. Today we can locate faults, characterize them, and explain many of the stresses building toward their failure. We still don't fully understand the details inside faults or how those details might control the location and timing of earthquakes, but geophysicists and computer scientists at Los Alamos National Laboratory and our colleagues are wielding an array of new tools to study the interactions among earthquakes, precursor quakes (often very small earth movements), and faults. These tools include aquarium-sized experiments in the laboratory that replicate quakes, more sensitive and more densely deployed seismology instruments worldwide producing vast data streams, and supercomputers that can make sense out of this massive data set.
Because it's so hard to observe geologic-scale interactions underground, the Los Alamos team, with collaborators at Penn State, the U.S. Geological Survey, ETH in Zurich, and the Institute of Physics of the Globe and the Ecole Normale in Paris, France, has developed laboratory experiments to figure out when faults might fail. Using an "earthquake machine" built by Chris Marone at Penn State, the team is investigating the role that "fault gouge"--the loose material created by the constant grinding at a fault--may play in triggering and influencing the size of earthquakes. The laboratory machine creates conditions similar to a fault with gouge, then subjects it to sound waves as surrogate seismic waves.
These experiments have produced strange and startling effects. The team observed that as stress built up on a fault and it approached failure as a miniature earthquake, a series of small precursor quakes rippled through at rates that followed specific patterns. Comparing these results with actual seismic data reveals similar rate patterns when precursors are observed before real earthquakes. The team also found that the applied sound waves played a key role in triggering laboratory-scale earthquakes by making the gouge more fluid. Amazingly, the seismic waves from one earthquake can trigger quakes thousands of miles from their origin, and often months later.
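The rate patterns described above can be illustrated with a toy calculation (every event time below is invented for illustration, not taken from any real catalog): counting catalogued events inside a sliding time window yields a rate that climbs as failure approaches.

```python
from bisect import bisect_left, bisect_right

def event_rate(times, t, width=10.0):
    """Events per unit time in the window [t - width, t],
    computed from a sorted list of event times."""
    lo = bisect_left(times, t - width)
    hi = bisect_right(times, t)
    return (hi - lo) / width

# Hypothetical precursor catalog: events bunch up as failure nears.
precursors = sorted([2.0, 15.0, 26.0, 33.0, 38.0, 41.5, 43.0, 44.0, 44.6])
for t in (10.0, 25.0, 45.0):
    print(t, event_rate(precursors, t))
# The rate rises from 0.1 events per unit time to 0.5 as t nears 45.
```

A detection scheme built on this idea would flag a fault segment when its local precursor rate accelerates past a threshold learned from past data.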
Frequently no precursors are observed before an earthquake, but that might be because extremely small precursors elude detection. To test that hypothesis, we're bringing the supercomputing horsepower at Los Alamos to bear on the subject, combing through catalogs of historical seismic data to see whether small-magnitude events acted as precursors to past earthquakes. Starting with the laboratory data on simulated precursor quakes and using a technique called machine learning, Los Alamos is "training" a computer program to sift through this limited data set and spot precursors.
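The training step can be sketched in miniature. The snippet below is a hypothetical stand-in, not the Laboratory's actual code: it generates synthetic signal windows, extracts two simple statistical features (variance and peak amplitude), and trains a nearest-centroid classifier to label new windows as quiet or precursor-like.

```python
import random
import statistics

random.seed(0)

def make_window(noisy, n=256):
    """Synthetic signal window: precursor-like windows are noisier."""
    scale = 3.0 if noisy else 1.0
    return [random.gauss(0.0, scale) for _ in range(n)]

def features(window):
    """Two simple statistical features of a window."""
    return (statistics.pvariance(window), max(abs(x) for x in window))

def train_centroids(labeled):
    """'Training': average the feature vectors for each class label."""
    centroids = {}
    for label in ("quiet", "precursor"):
        feats = [features(w) for w, y in labeled if y == label]
        centroids[label] = tuple(statistics.mean(f[i] for f in feats)
                                 for i in range(len(feats[0])))
    return centroids

def classify(window, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(window)
    return min(centroids, key=lambda lab: sum((a - b) ** 2
               for a, b in zip(f, centroids[lab])))

# Build a small labeled training set from the synthetic generator.
training = ([(make_window(False), "quiet") for _ in range(20)] +
            [(make_window(True), "precursor") for _ in range(20)])
centroids = train_centroids(training)

print(classify(make_window(True), centroids))   # precursor
print(classify(make_window(False), centroids))  # quiet
```

Real seismic and laboratory signals are far messier than this synthetic data, which is why the team leans on supercomputers and richer machine-learning models rather than two hand-picked features.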
After the computer program has taught itself to recognize them, the team will run the program against actual seismic data. We'll then compare the accuracy of those results to more traditional interpretations of the same data. Other data sets from actual seismic monitoring will be added to the experiment in a process called "ground truthing," intended to verify the computer program's predictive accuracy. The goal is to develop a computer program that reviews new data in almost real time and spots the typical seismic signals of precursors heralding an upcoming major earthquake.
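Ground truthing can likewise be sketched as a scoring function (the tolerance window and all event times below are invented): each alarm counts as a hit if a catalogued quake follows within the tolerance, and the catalog tells us what fraction of real quakes the program anticipated versus how often it cried wolf.

```python
def ground_truth_score(predicted_times, actual_times, tolerance=10.0):
    """Score precursor alarms against a catalog of actual events.
    An alarm is a 'hit' if a real quake occurs within `tolerance`
    time units after it. Returns (hit_rate, false_alarm_rate)."""
    hits = sum(any(0 <= a - p <= tolerance for a in actual_times)
               for p in predicted_times)
    false_alarms = len(predicted_times) - hits
    detected = sum(any(0 <= a - p <= tolerance for p in predicted_times)
                   for a in actual_times)
    hit_rate = detected / len(actual_times) if actual_times else 0.0
    false_alarm_rate = (false_alarms / len(predicted_times)
                        if predicted_times else 0.0)
    return hit_rate, false_alarm_rate

# Hypothetical alarm times vs. catalogued quake times (arbitrary units).
alarms = [5.0, 40.0, 90.0]
quakes = [12.0, 95.0]
print(ground_truth_score(alarms, quakes))  # both quakes anticipated,
                                           # one alarm in three was false
```

Tightening the tolerance makes the score stricter; in practice the team would tune such a threshold against held-out seismic data rather than fix it in advance.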
Within the next year or so, the team plans to begin using the newest computers at Los Alamos, some of the most powerful in the world, to crunch the numbers from larger and larger data sets--first from mining areas, then tectonic regions like the 800-mile-long San Andreas fault, and finally worldwide--to reveal previously hidden patterns of seismic signals.
Dreaming big, the team dares to pursue the Holy Grail of seismology: forecasting major earthquakes. That won't happen any time soon. The first level of forecasting will be characterizing the likelihood that an earthquake will occur within a given time span. But as supercomputing power continues to grow, giving researchers unprecedented capabilities for sifting through big data generated by ever-more-sensitive seismic sensors, unforeseen advances in our understanding will certainly drive us closer and closer to accurately forecasting the massive earthquakes that too often wreak havoc on millions of unprepared people around the world.
While Los Alamos maintains technical expertise in seismology and the geodynamics of Earth's crust as a means of monitoring underground nuclear testing worldwide, that expertise could one day alleviate suffering from unexpected earthquakes on a similarly global scale.
Paul Johnson is a geophysicist in the Laboratory's Geophysics group, a Los Alamos National Laboratory Fellow, a Fellow of the Acoustical Society of America, and a Fellow of the American Geophysical Union.