Clint Dawson and Research

The current storm surge prediction process relies on feeding hurricane data into a computational program called ADCIRC, which runs on high-performance computers to efficiently generate data about potential outcomes. However, in the pursuit of greater performance, computing hardware is becoming increasingly diverse in its components and organization. And for the predictive program to perform well across systems, its code has to keep up with those changes.

Funded by a $3 million National Science Foundation grant, Dawson and collaborators at Louisiana State University, the University of Notre Dame and the University of North Carolina at Chapel Hill are overhauling ADCIRC into a "version 2.0," dubbed STORM, that's designed to perform more efficiently across a variety of computer hardware architectures. A goal for STORM is to run twice as fast as ADCIRC, enabling storm surge predictions within an hour of receiving data inputs.

"The idea is how do we keep [the program] up to date and modernize it for the next generation," Dawson said.

Since it was first developed in the mid-1990s, ADCIRC has been widely used to simulate and predict water flow in coastal areas of the United States. The National Oceanic and Atmospheric Administration and the U.S. Army Corps of Engineers, as well as academic researchers, all use the program to help inform their work. And although storm surge prediction is a popular use for the program, the governing equations describing fluid flow can be applied to investigate other research questions. For example, during the Deepwater Horizon oil spill, Dawson used ADCIRC to predict oil dispersal paths up to three days in advance.

The four-year grant pairs Dawson with ICES research associate Craig Michoski as co-principal investigators. They will work with research collaborators from the three other universities: Hartmut Kaiser of LSU; Joannes Westerink of Notre Dame; and Richard A. Luettich of UNC-Chapel Hill.

Whatever fluid flow problem is being analyzed, ADCIRC works by modeling the interaction between static elements, such as coastal and undersea topography, and dynamic ones, such as a hurricane's influence on water height and water velocity. The program computes these interactions, and how they evolve, by numerically solving partial differential equations; the simulations are run multiple times across plausible ranges of input variables to produce a set of potential scenarios. The final result is the most likely prediction: a collection of data describing the fluid's behavior at a point in time. In the case of storm surges, this information is what's used to inform emergency response and evacuation plans, as well as to create maps.
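Concretely, the fluid physics underlying ADCIRC are the depth-averaged shallow water equations (the program itself solves a reformulated version, the Generalized Wave Continuity Equation, with additional forcing terms). In schematic form, the continuity and momentum equations read:

\[
\frac{\partial \zeta}{\partial t} + \nabla \cdot (H\mathbf{u}) = 0,
\qquad
\frac{\partial \mathbf{u}}{\partial t} + \mathbf{u} \cdot \nabla \mathbf{u} + f\,\hat{\mathbf{k}} \times \mathbf{u}
= -g \nabla \zeta + \frac{\boldsymbol{\tau}_s - \boldsymbol{\tau}_b}{\rho H}
\]

Here \(\zeta\) is the free-surface elevation (the "water height" above), \(H = h + \zeta\) is the total water depth over the bathymetry \(h\), \(\mathbf{u}\) is the depth-averaged velocity, \(f\) is the Coriolis parameter, \(g\) is gravity, \(\rho\) is the water density, and \(\boldsymbol{\tau}_s\) and \(\boldsymbol{\tau}_b\) are the surface (wind) and bottom-friction stresses. A hurricane enters the model chiefly through \(\boldsymbol{\tau}_s\) and an atmospheric pressure term omitted here.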

The STORM program will maintain the same ADCIRC functionality described above. But the code driving it, built on a runtime system called High Performance ParalleX (HPX), will be a completely new foundation for the algorithms, Dawson said.

HPX is designed to be flexible, easy to integrate with other types of code, and adaptable to diverse computer architectures. By rewriting the code with HPX, STORM will not only run more efficiently on today's supercomputing systems, but will likely be well equipped to handle the inevitable changes to come.
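To give a flavor of that foundation: HPX deliberately mirrors the C++ standard concurrency API, replacing global synchronization with lightweight tasks and futures that a runtime scheduler maps onto whatever hardware is available. The sketch below is illustrative only, not STORM code; advance_partition is a hypothetical stand-in for one solver step on one piece of the mesh, and HPX header names can vary between releases.

```cpp
// Minimal HPX-style task parallelism sketch (illustrative, not STORM code).
#include <hpx/hpx_main.hpp>  // runs main() inside the HPX runtime
#include <hpx/future.hpp>    // hpx::async, hpx::future (header varies by release)

#include <iostream>
#include <vector>

// Hypothetical stand-in for one time step on one partition of the mesh.
double advance_partition(double state) { return state + 1.0; }

int main()
{
    std::vector<double> partitions(8, 0.0);
    std::vector<hpx::future<double>> steps;
    steps.reserve(partitions.size());

    // Launch each partition's update as an independent lightweight task;
    // the HPX scheduler distributes these across the available cores.
    for (double p : partitions)
        steps.push_back(hpx::async(advance_partition, p));

    // Chain a second step as a continuation: it fires as soon as its own
    // input is ready, with no global barrier between time steps.
    for (auto& f : steps)
        f = f.then([](hpx::future<double> prev) {
            return advance_partition(prev.get());
        });

    for (auto& f : steps)
        std::cout << f.get() << '\n';

    return 0;
}
```

The appeal of this style for a code like STORM is that each piece of the mesh can advance as soon as the data it depends on arrives, rather than every processor waiting at a barrier each time step.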

"Where we hope to be in four years is to have a whole new code and a whole new piece of software. And it's going to be a lot of work but it's also necessary work if you want to keep your software useful for the next generation," Dawson said.

At the same time, Dawson said turning ADCIRC into STORM will be an exercise in understanding the history and composition of the original code, knowledge that could help in constructing STORM, and other programs in general.

"If you don't do these kinds of projects you lose all this memory of how you got to this point. We're really fortunate to have this opportunity to take all the lessons that we learned and to put it into a new piece of software," Dawson said.