6 ways supercomputers help prevent natural hazards from becoming natural disasters

A computer visualization of Hurricane Ike shows the storm developing in the Gulf of Mexico before making landfall at the Texas coast. In times of emergency, the Texas Advanced Computing Center (TACC) serves as a hub for hazard forecasting, response and recovery.

Texas Advanced Computing Center

Most people know that hurricane forecasts are the product of massive number-crunching machines. But how else do supercomputers assist in times of emergency?

Whether it’s a natural disaster like Hurricane Harvey or a man-made disaster like the Deepwater Horizon oil spill, in times of crisis supercomputers, and the centers that run them, play an important role in emergency preparations, disaster recovery and prevention.

For over a decade, the Texas Advanced Computing Center (known as TACC) at The University of Texas at Austin has served as a hub for natural hazard forecasting and response. The center helps scientists, decision-makers and first responders get the information they need in ways that can’t be accomplished without powerful computers and computer-savvy experts.

“We can’t change the course of a storm, but when we know it’s coming, we can help minimize its damage to lives and property,” said Dan Stanzione, TACC Executive Director.

By enabling high-speed simulations, real-time data assimilation, geospatial analysis, visualization, machine learning and infrastructure resiliency, supercomputers have an outsized, real-world impact when natural hazards occur.

1. Forecasting storms before they strike

The idea of using mathematics and physics to forecast storms, known as numerical weather prediction, was first proposed at the beginning of the 20th century, but it wasn’t until 1950 that the first successful numerical weather forecast was produced, using ENIAC, one of the earliest digital computers.

Staff programming ENIAC at University of Pennsylvania, USA, circa 1946

US Army Photo

In the 1970s, scientists developed the first effective hurricane tracking and storm surge models, and these tools have been a fixture of weather forecasting ever since.

During Hurricanes Rita (2005), Gustav (2008), Ike (2008), and most recently Harvey and Irma (2017), TACC played an active role in storm forecasting, disaster preparations and recovery.

With tropical storms bearing down on the Gulf Coast, TACC’s experts sprang into action, offering technical assistance and the use of TACC’s supercomputers (including Stampede2, one of the fastest systems on the planet) to teams of atmospheric scientists working to determine where the storms would make landfall, how high the storm surge would rise, and how flooding would affect the communities in their path.

Forecast of Hurricane Harvey’s storm surge based on National Hurricane Center forecast winds Advisory 26 at 8/26/17 16:00 CDT. This forecast was generated using the ADCIRC+SWAN Surge Guidance System (ASGS) on TACC’s Lonestar supercomputer.

Computational Hydraulics Group, The University of Texas at Austin

With access to massive supercomputers (specifically Lonestar), Clint Dawson, a storm surge expert at The University of Texas at Austin (UT Austin), and collaborators at LSU and the University of North Carolina were able to accurately predict where and when Hurricanes Harvey and Irma would hit the Texas and Florida coasts and how damaging they would be.

The forecasts influenced the course of the evacuations and helped get people out of harm’s way.

“This is the first time that we've been able to really test the system on a real hurricane for Texas at high resolution, and it performed incredibly well,” said Dawson.

The researchers modeled Hurricane Harvey at a resolution of 25 meters, or about the size of a city block, over a very large region — a problem that could only be solved on a handful of supercomputers in the world. With each new storm, the researchers assess and improve their models, while continuing advances in computing lead to even higher-resolution forecasts.

“Our biggest computers today have a hundred times more computational power than they did in 2005. That’s made a real difference in our ability to produce quality, long-range forecasts,” he said.

(Scientists use similar methods on TACC systems to forecast damaging hailstorms and extreme nighttime thunderstorms.)

Next-Gen Models

In addition to helping with the official forecasts, dozens of teams of university researchers from across the nation routinely use TACC’s systems to develop and test the next generation of hurricane forecasting models, working out the kinks before they’re widely adopted.

A snapshot from the Penn State University next-generation, real-time hurricane analysis and forecast system, enabled by TACC systems.

Fuqing Zhang and Yonghui Weng, Meteorology, Penn State University

“NOAA’s [National Oceanic and Atmospheric Administration] machines are typically busy doing the production forecasts, so researchers use TACC machines to do some evaluation of future architectures,” Stanzione said. “We’re running the next-gen models whenever there’s a storm in the Atlantic to see if the new models get better results.”

It’s this type of parallel, concurrent effort that keeps the field of hazard forecasting moving forward.

2. Data drives better models

Another way that TACC’s systems help improve storm forecasts is through data assimilation, which incorporates observations from weather stations, radar systems and sensors into the numerical model of a storm in real time.

“If you incorporate real-time data, you can correct your models with what’s happening and get a more accurate forecast,” Stanzione explained.
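To make the idea concrete, here is a minimal one-variable sketch of that correction step. It assumes a simple inverse-variance weighting rather than the full ensemble schemes operational forecasters actually run, and every number and name in it is invented for illustration.

```python
def assimilate(forecast, observation, forecast_var, obs_var):
    """Blend a model forecast with an observation, trusting each in
    proportion to the inverse of its error variance (a one-variable
    analogue of the update step used in data assimilation)."""
    gain = forecast_var / (forecast_var + obs_var)   # how much to trust the observation
    analysis = forecast + gain * (observation - forecast)
    analysis_var = (1.0 - gain) * forecast_var
    return analysis, analysis_var

# Invented numbers: the model says surface pressure is 985 hPa (error variance 4),
# while a weather station reports 978 hPa (error variance 1).
analysis, var = assimilate(forecast=985.0, observation=978.0,
                           forecast_var=4.0, obs_var=1.0)
print(f"corrected estimate: {analysis:.1f} hPa (variance {var:.1f})")
# Prints 979.4 hPa: the model state is pulled toward the more trustworthy
# observation, and the next forecast cycle starts from this corrected state.
```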

Depth of inundation in the City of Galveston estimated from debris line elevations and tide gauge records using elevation data collected by the university's aerial LiDAR system.

UT Austin, Center for Space Research

In the case of Hurricane Harvey, TACC helped Gordon Wells, a researcher at UT Austin’s Center for Space Research and a member of the state task force that advises Texas on its emergency response, use satellite, UAV and aircraft imagery to pin down where the storm would make landfall and to build products that guide first responders.

“Accurate forecasting is important for many reasons,” Stanzione said. “Safety is the biggest concern, but there’s the economic impact too. Evacuations can cost a million dollars per mile of coastline per day of evacuation.”

In the case of Hurricane Ike, instead of evacuating 200 miles of coast three days before a storm, emergency managers were able to evacuate a 30-mile section of the coast and do it one day before the storm.

“We had satellite data so we could actually tell that the models were correct,” Wells said.

“The economic impact of that is huge,” Stanzione continued. “The more you can target the better off you are.”

3. Supercomputers + geographic information systems = faster hazard forecasts

Geographic information systems (GIS) encompass a range of technologies that can capture, manipulate, analyze and present all types of geographical data. Since the 1960s, scientists have applied GIS to problems in environmental science, defense/intelligence, and – importantly – public safety and disaster risk reduction.

After the earthquakes in Haiti in 2010 and in Myanmar in 2013, researchers used TACC systems to combine satellite data with ground-based GPS readings, enabling first responders to distribute emergency supplies even though many roads were unusable.

More recently, GIS data has been used to address an even more common natural hazard: floods.

Flooding is the most common, and the most destructive, type of natural disaster in the U.S., affecting hundreds of thousands of homes and businesses annually.

There are roughly 2.7 million river basins in the U.S. that flow into and interact with each other. Determining how a given basin will react to a major rain event is tricky. However, by combining GIS data with topographic surveys, real-time weather forecasts, sensor data and other sources of information, it’s possible to make actionable flood predictions.

The National Weather Service has 13 River Forecast Centers covering about 6,600 basins, as shown in the map below. National river conditions are published through the NWS Advanced Hydrologic Prediction Service.

National Weather Service

David Maidment, a civil engineering researcher at UT Austin, helped develop a new state-of-the-art national flood forecasting system, known as the National Water Model, which launched in August 2016 and relies on advanced computing resources from NOAA.

As a precursor to the model, Maidment used Stampede to calculate the flows of all the rivers into all the basins in the nation. Through a combination of improved computing power and improved software, he was able to do so… in just 10 minutes.

“The fact that we achieved it and can model the whole country is astounding,” Maidment said. “It’s lifted the sense of expectations for the whole country.”
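The sketch below illustrates the basic bookkeeping behind that kind of nationwide calculation, using a made-up five-reach network rather than anything from the National Water Model: each reach’s flow is its own local runoff plus everything arriving from upstream.

```python
from collections import defaultdict

# A tiny, made-up river network: reach -> downstream reach (None = outlet).
downstream = {"A": "C", "B": "C", "C": "E", "D": "E", "E": None}

# Local runoff generated in each reach's own basin (arbitrary units),
# e.g. derived from rainfall forecasts and terrain data.
local_runoff = {"A": 10.0, "B": 4.0, "C": 2.0, "D": 7.0, "E": 1.0}

def accumulate_flow(downstream, local_runoff):
    """Sum local runoff down the network so each reach's flow includes
    everything draining into it from upstream."""
    upstream = defaultdict(list)
    for reach, dst in downstream.items():
        if dst is not None:
            upstream[dst].append(reach)

    flow = {}
    def total(reach):
        if reach not in flow:
            flow[reach] = local_runoff[reach] + sum(total(u) for u in upstream[reach])
        return flow[reach]

    for reach in downstream:
        total(reach)
    return flow

print(accumulate_flow(downstream, local_runoff))
# {'A': 10.0, 'B': 4.0, 'C': 16.0, 'D': 7.0, 'E': 24.0}
```

Scaling the same idea to 2.7 million interacting reaches, updated continuously with fresh forecasts and sensor data, is what pushes the problem onto supercomputers.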

The National Water Model runs short-range forecasts hourly and medium- and long-range forecasts daily. The model provides streamflow predictions for 2.7 million river reaches and informs water-related decisions. NOAA’s previous model was only able to forecast streamflow for 4,000 locations every few hours.

Model forecast output from the National Water Model streamflow guidance for Dec. 12.

NOAA, Office of Water Prediction

During Hurricane Harvey, with help from the CyberGIS Center at the University of Illinois at Urbana-Champaign, researchers built inundation maps for the affected counties in Texas and Louisiana using stream flow models run on Stampede2.

“River gauges tell you what’s happening at river banks but there are no good forecasts that fill in what happens in neighborhoods and the areas in between,” said Stanzione. “The flood researchers came to us two days after the storm hit and asked if we could help. It was a quick Labor Day weekend effort to throw all this together. We started Friday morning and had maps out by Friday at 9pm.”

The inundation maps were sent twice daily to emergency operations centers to help first responders and damage assessment teams determine which areas they could access and which neighborhoods needed to be inspected and cleared.

4. Visualizing complex information

Simulations, data and analysis are necessary tools for responding to a disaster, but by themselves, they’re not enough to turn raw data into actionable insights. One also needs scientific visualizations, which represent complex data as images to aid understanding.

“Supercomputers can produce an enormous amount of data around disasters, but visualization is really the key to understanding a great deal of that data,” Stanzione said.

Historically, one of the great scientific visualizations has been the hurricane track plot, which turns reams of data from dozens of separate hurricane simulations into a simple graphic showing the range of possibilities for how a storm might evolve.
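A track-ensemble plot of that sort can be mocked up in a few lines; the coordinates below are entirely synthetic and simply stand in for the output of many separate simulations.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
hours = np.arange(0, 121, 6)                      # a 5-day forecast in 6-hour steps

# Fabricated consensus track moving across the Gulf (lon, lat in degrees).
base_lon = -90.0 + 0.05 * hours
base_lat = 24.0 + 0.04 * hours

plt.figure(figsize=(6, 5))
for member in range(20):                          # 20 simulated ensemble members
    # Each member drifts away from the base track; the spread grows with lead time.
    drift_lon = np.cumsum(rng.normal(0, 0.08, hours.size))
    drift_lat = np.cumsum(rng.normal(0, 0.08, hours.size))
    plt.plot(base_lon + drift_lon, base_lat + drift_lat, color="gray", alpha=0.4)

plt.plot(base_lon, base_lat, color="black", linewidth=2, label="consensus track")
plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.title("Ensemble of simulated storm tracks (synthetic data)")
plt.legend()
plt.show()
```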

Tornado visualizations, too, have enabled scientists (and the public) to better understand how small-scale storms evolve into massive churning monsters.

In 2010, when the Deepwater Horizon oil spill occurred in the Gulf, visualizations of the oil spill forecasts helped direct U.S. Coast Guard ships to the thickest plumes of oil, and, in the worst cases, enabled emergency teams to protect property and ecosystems along the coast.

TACC also helped researchers develop the Texas Pandemic Flu Toolkit, a service that simulates and visualizes the spread of viruses like H1N1 through the state, helping public health workers respond should pandemic flu emerge in Texas.

"Seeing a visualization and interacting with the data is probably one of the great enablers that will propel science for the next generation and beyond," said Kelly Gaither, TACC’s Director of Visualization.

5. Predicting (and promoting) resiliency

The need to forecast storms and other disasters as they’re approaching will never go away, but there’s a movement afoot to make cities, communities and structures more resilient to storms before they arrive.

A new project called DesignSafe, co-led by TACC, is developing an online system for earthquake and storm surge scientists to share their data, build on each other’s knowledge and rapidly advance the science of resilience so disasters like Hurricane Harvey and the 1994 Northridge earthquake will be less damaging in the future.

DesignSafe was created to prevent natural hazard events from becoming societal disasters.

DesignSafe

“It’s not so much: can we forecast earthquakes or hurricanes, but, given that they exist, can we design buildings and other infrastructure that can withstand them?” Stanzione said.

Funded by a $13.7 million award from the National Science Foundation (NSF), DesignSafe serves as a nexus for thousands of natural hazard researchers – a place where computer codes can be shared, tested and reused; where information about past storms can be stored and referenced; and where new simulations can build on past efforts.

“By sharing not only the original data, but the data analysis and the processing that was done on that data, we can ensure that the data can be reproduced again,” said Ellen Rathje, an engineering professor at UT Austin and one of the leads for DesignSafe. “That gives us confidence as we build off the previous work done by others, which speeds up the progress of science.”

The DesignSafe portal launched in spring 2016 and currently includes a data depot for data curation, as well as cloud-based tools for data analysis, simulations and visualizations to support research.

Scientists have been uploading field reconnaissance data collected after Hurricanes Harvey and Irma to share with other research teams as they analyze the impacts of these major storms. These observations will be used in many ways: to validate computer simulation models, to improve structural designs and building codes, and to increase the resilience of public infrastructure, minimizing both the dangers to human life and the economic impact of such events.

“When we have natural hazards, they’re going to impact the infrastructure. But we want to minimize that impact such that the damage is repairable and catastrophic failures don’t happen,” Rathje said. “The key to resilience is: it’s okay to bend, but we don’t want to break. We want it to be repairable after the fact.”

6. Machine learning and the frontiers of forecasting

As computers become more powerful, their outputs become more complex. At the same time, the availability of data – from satellites, observing stations, autonomous drones, smart phone photographs and simulations themselves – is exploding.

These factors are leading scientists to develop improved machine learning capabilities that can make sense of the proliferating data and find trends that humans might miss.

Researchers are using machine learning algorithms to predict hail a day in advance.

David Gagne, University of Oklahoma

A group at the University of Oklahoma (OU) that studies hailstorms is using machine learning algorithms on Stampede to determine which observations are most relevant for accurate forecasts. The researchers then weight those inputs to produce more accurate predictions.

“Using machine learning, it may be possible to find sources of information that wouldn’t be apparent to a human observer,” said Nathan Snook, a research scientist at OU.
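A stripped-down sketch of that kind of workflow, using invented predictor names and synthetic labels rather than the OU group’s actual data or model, is to train a classifier on day-ahead indicators and then rank the inputs by how heavily the trained model relies on them:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Invented predictors a forecaster might extract from model output the day before.
features = ["CAPE", "wind_shear_0_6km", "freezing_level", "dewpoint", "lifted_index"]
X = rng.normal(size=(500, len(features)))

# Synthetic labels: hail is made more likely when instability (CAPE) and shear are high.
y = ((1.2 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.5, 500)) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Rank the inputs by how much the trained model relies on them.
for name, importance in sorted(zip(features, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name:>18s}: {importance:.2f}")
```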

At a TACC-hosted hackathon for scientists interested in applying advanced computing to fight the spread of the Zika virus, researchers used machine learning techniques to automatically search aerial imagery for pools of stagnant water, which serve as potential breeding grounds for the mosquitos that carry Zika.
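A toy version of that kind of image triage, with hypothetical color-based features and fabricated training tiles standing in for the hackathon’s real data, might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def tile_features(image, tile=64):
    """Cut an RGB aerial image (H x W x 3, values 0-1) into tiles and compute
    crude per-tile features: blue/green ratio and overall darkness, which
    this toy example treats as hints of standing water."""
    feats, coords = [], []
    h, w, _ = image.shape
    for r in range(0, h - tile + 1, tile):
        for c in range(0, w - tile + 1, tile):
            patch = image[r:r + tile, c:c + tile]
            blue_ratio = patch[..., 2].mean() / (patch[..., 1].mean() + 1e-6)
            darkness = 1.0 - patch.mean()
            feats.append([blue_ratio, darkness])
            coords.append((r, c))
    return np.array(feats), coords

# Fabricated training tiles stand in for hand-labeled water / not-water examples.
rng = np.random.default_rng(1)
train_feats = np.vstack([rng.normal([1.3, 0.7], 0.1, (50, 2)),   # water-like tiles
                         rng.normal([0.9, 0.3], 0.1, (50, 2))])  # dry tiles
train_labels = np.array([1] * 50 + [0] * 50)
clf = LogisticRegression().fit(train_feats, train_labels)

image = rng.random((256, 256, 3))                 # stand-in for a real aerial image
feats, coords = tile_features(image)
scores = clf.predict_proba(feats)[:, 1]           # probability each tile holds standing water
flagged = [coords[i] for i in np.argsort(scores)[-5:]]
print("tiles most likely to hold standing water:", flagged)
```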

A project developed by Robin Murphy of Texas A&M University uses computer vision, machine learning and anomaly detection techniques to improve the flight paths of unmanned aerial vehicles and to better locate survivors after a storm.

And a similar project led by Shirley Dyke of Purdue University uses machine learning and automated visual data analytics to assess damage to buildings, bridges and pipelines far faster than a human could.

“We see the possibility now of doing machine learning across images, to identify damaged buildings or locate survivors at enormous scale and to do it completely automatically without people having to look at the images,” Stanzione said. “There’s so much more data available now. The challenge is figuring out how to use it.”

-----------------

Whether through simulation, modeling and visualization, or machine learning, geospatial analysis and resiliency planning, supercomputers are a critical tool for making sure natural hazards don’t become natural disasters.

Federal, state and institutional investments in centers like TACC and systems like Stampede2 help mitigate the risks to people and property from a range of perils, while also enabling the research community to upgrade its forecasting and preparation capabilities broadly.

“There’s still a long way to go, both in terms of the models and the data that we can feed them to get anything resembling perfectly accurate forecasts,” said Stanzione, “but we certainly can do much more than we used to do and we can do it much more quickly.”
