Bob Moser in the visualization lab. Engineering and computational researchers at The University of Texas at Austin, such as Moser, are leading an $18.7 million research project, known as PECOS, to simulate re-entry vehicles.

Days before the space shuttle Columbia began its ill-fated return to Earth on Feb. 1, 2003, NASA engineers tried to evaluate the severity of damage sustained two weeks before, when a piece of foam had struck the shuttle during takeoff and damaged its thermal protection system.

The system was designed to defend the shuttle and its crew from the intense heat generated during re-entry into the Earth's atmosphere. If its integrity were compromised, so would be the integrity of the entire spacecraft. But the only way to evaluate the damage — and ultimately the safety of the crew — from the ground was to run damage-prediction models using computer software originally designed to assess much smaller pieces of foam than the one that had struck Columbia.

The final risk-assessment report raised some uncertainties about whether protective tiles on the spacecraft could withstand re-entry. Ultimately, however, the report assured NASA that the software used to build the computer models tended to "overestimate" damage, and so the report was accepted.

During re-entry days later, the shuttle broke apart. All seven astronauts aboard died.

A series of investigations and policy changes put a sharper focus on the power — and limitations — of predictive computer models. Perhaps if better models had been available to NASA officials, they would have better understood the damage to the shuttle and more accurately predicted how it would respond during re-entry.

"The trick here is that not only do computational researchers want to make a prediction, but we want to characterize how reliable that prediction is and whether we have confidence in it and to what degree," said Bob Moser, a professor in the Cockrell School of Engineering's Department of Mechanical Engineering and the Institute for Computational Engineering and Sciences (ICES).

Few universities are better equipped to do so than The University of Texas at Austin. The university has a strong track record in predictive modeling, thanks in part to bold and successful efforts by the Texas Advanced Computing Center (TACC) to attract and deploy federal funding for the world-class supercomputers needed to create large-scale predictive models.

This fall, the center secured a $27.5 million National Science Foundation grant to build one of the most advanced supercomputers in the world on campus, to be called "Stampede." The total investment is estimated at more than $50 million over four years, and the project may be renewed in 2017, enabling four additional years of open science research on a successor system.

The university was also selected in 2008 by the U.S. Department of Energy (DOE) as one of five centers in the nation to lead an $18.7 million research project to study uncertainty quantification. Led by Moser, the research aims to answer the most nagging question for scientists, administrators and policymakers who use computer models to make life-and-death decisions, such as whether a shuttle will survive re-entry or where flooding will occur during a hurricane so that evacuations can be coordinated more effectively.

At its heart is a vital question: How much can we trust supercomputing models?

Because, for all of the sophistication, computing power and data required to build predictive computer models, they still just make predictions. And with a prediction come enormous uncertainties, introduced at every step of building the model.

Uncertainties arise from the measurements used, the data collected, the interpretation of that data, and the parameters and imperfections of the model itself. Many of these uncertainties go unrecognized by researchers, given the complexity of the tasks at hand, like deciding where to drill for oil, modeling climate change or forecasting the outcome of medical procedures.

A rebirth of an old field

ICES brings together a tight-knit and growing group of computational experts in engineering, physics, math and other fields. Through interdisciplinary research, the group is transforming the field of predictive science by tackling head-on the questions of uncertainty in computer modeling.

Professor Omar Ghattas works among the 40 or so faculty members addressing the problem. Ghattas, who holds joint appointments in mechanical engineering, ICES and the Jackson School of Geosciences, leads a research group that is modeling a diversity of geophysical phenomena, such as how earthquake stress waves propagate through the Earth and affect urban areas, and how melting ice sheets in Antarctica will increase sea levels.

Ghattas said the challenge is that current predictive models, for the most part, provide a "deterministic" prediction — meaning they produce results as single numbers rather than as probability distributions over the possible outcomes.

For instance, computer models might predict that sea levels will rise 10 centimeters in the next 20 years, but Ghattas is working to create models that output a probability distribution with the prediction.

"Doing that would allow us to say we're, for example, 95 percent confident that this is how an event will transpire rather than making one broad projection," Ghattas said.

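To make the distinction concrete, here is a minimal sketch of the idea in Python. It is not the actual ice-sheet or PECOS code; the toy model, the melt rate and its assumed spread are hypothetical placeholders. It simply shows the shift from running a model once on best-guess inputs to sampling the uncertain inputs many times and reporting the spread of outcomes.

```python
# Illustrative sketch only: the "model," the melt rate and its uncertainty
# are made-up placeholders, not real sea-level science or PECOS code.
import numpy as np

rng = np.random.default_rng(seed=0)

def sea_level_rise_cm(melt_rate_cm_per_yr, years):
    """Toy stand-in for a physics model: rise = melt rate x time."""
    return melt_rate_cm_per_yr * years

# Deterministic prediction: one best-guess input, one number out.
best_guess_rate = 0.5  # cm per year, hypothetical
print("Deterministic:", sea_level_rise_cm(best_guess_rate, 20), "cm")

# Probabilistic prediction: treat the melt rate as uncertain and propagate
# that uncertainty through the model with Monte Carlo sampling.
uncertain_rates = rng.normal(loc=0.5, scale=0.1, size=100_000)  # assumed spread
outcomes = sea_level_rise_cm(uncertain_rates, 20)

low, high = np.percentile(outcomes, [2.5, 97.5])
print(f"Probabilistic: 95% of sampled outcomes fall between {low:.1f} and {high:.1f} cm")
```

The real research replaces the toy model with large-scale physics simulations and far more sophisticated sampling, but the output has the same character: a range with a stated confidence, not a single number.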
When the computational sciences first developed alongside computers in the 1960s and 1970s, computational models were informally viewed as a researcher's best guess. But the field has experienced a rebirth over the past two decades, one in which researchers constantly seek to discover, understand and reduce the uncertainties in their predictive models so that predictions are as accurate and as close to real-life outcomes as possible.

The emergence of the uncertainty quantification field was spurred in the mid-1990s by the federal government's desire to use computer models to predict the reliability of nuclear weapons. Since then, the toll of high-stakes events that might have been better anticipated had improved predictive computer models been available — like the Columbia disaster, Hurricane Katrina and the World Trade Center collapse after the 9/11 terrorist attacks — has catapulted research on uncertainty quantification to the forefront of science and engineering.

"Scientific predictions have been sought since the days of Socrates, but it's only been in the last couple of decades that we realized our accumulated knowledge was insufficient to make precise prediction of important physical events," said Tinsley Oden, associate vice president for research at The University of Texas at Austin, director of ICES and a professor in the Cockrell School's Department of Aerospace Engineering and Engineering Mechanics.

"We quickly learned that we better go back and dissect everything necessary to determine the influence of these uncertainties on our answers," Oden said.

University researchers led by Moser, along with Oden, Ghattas and a dozen other engineering professors, are forging ahead with PECOS, the five-year, DOE-funded interdisciplinary research collaboration formally known as the Center for Predictive Engineering and Computational Sciences.

The center’s charge is to develop the next generation of advanced computational methods for better prediction and simulation of multiscale, multiphysics phenomena. Researchers are developing complex algorithms that can help characterize and reduce uncertainty in models. Although the algorithms are broad enough to apply to scientific problems as diverse as hurricane prediction and oil exploration, the specific focus of PECOS is to provide better analysis of aerospace vehicles re-entering the atmosphere.

Improved algorithms are crucial because, regardless of a supercomputer’s capabilities, it can only produce reliable predictions if the algorithms used in models are as advanced as the computer, said Moser, director of PECOS. Already, the group has developed world-class algorithms and computer codes that are being used by researchers to verify and validate the accuracy of their models.
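As a rough illustration of what "verify and validate" means in practice, the sketch below uses a toy problem, not the center's actual codes: verification checks that the software solves its equations correctly (here, by comparing a simple numerical solver against a known exact solution), while validation checks that those equations describe reality (here, by comparing the model against measurements, which are fabricated for illustration).

```python
# Toy verification-and-validation sketch; all equations and numbers are
# hypothetical, not PECOS's actual models or data.
import numpy as np

def model(t, decay_rate=1.0, initial=1.0):
    """Exact solution of the toy equation dy/dt = -decay_rate * y."""
    return initial * np.exp(-decay_rate * t)

def solve_numerically(decay_rate=1.0, initial=1.0, t_end=1.0, steps=1000):
    """Forward-Euler solver for the same toy equation."""
    dt = t_end / steps
    y = initial
    for _ in range(steps):
        y += dt * (-decay_rate * y)
    return y

# Verification: the numerical solution should approach the exact one.
verification_error = abs(solve_numerically(steps=1000) - model(1.0))
print(f"Verification error vs. exact solution: {verification_error:.2e}")

# Validation: compare the model against observations (made up here); the
# mismatch is one ingredient of the uncertainty in any prediction the model makes.
observed_times = np.array([0.2, 0.5, 1.0])
observed_values = np.array([0.80, 0.62, 0.40])  # hypothetical measurements
validation_error = np.max(np.abs(model(observed_times) - observed_values))
print(f"Largest model-vs-data discrepancy: {validation_error:.3f}")
```

In the center's work the equations, solvers and data are vastly more complex, but the two questions are the same: is the code solving the equations right, and are the equations the right description of the physics?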

In addition, Stampede, the new NSF-funded supercomputer, is expected to be up and running in January 2013. It will be built by TACC in partnership with Dell Inc. and Intel Corp.

"Stampede is going to be a tremendous tool for us," said Oden.

Reducing uncertainty

With each new supercomputer deployed on campus during the past 10 years — from TACC’s Lonestar to Ranger, and soon Stampede — university researchers have seen enormous leaps in performance.

Clint Dawson, a professor in ICES and the Cockrell School's Department of Aerospace Engineering and Engineering Mechanics, for example, uses supercomputers to predict storm surge from hurricanes. Models such as Dawson's were viewed with a certain amount of skepticism before Katrina, but Texas' emergency managers now rely on them heavily to plan coastal evacuations.

Similarly, Thomas Hughes, a professor in ICES and the Department of Aerospace Engineering and Engineering Mechanics, uses supercomputers to develop blood flow models that are helping guide best practices for cardiologists.

These professors are working to characterize and reduce uncertainties in their models because they understand that, just as with Columbia, the stakes of their predictions are high.

"The fact is, for something like [Columbia], you've got to give information that's analytical to a decision-maker but you've got to quantify it. Are you 90 percent sure, 50 percent or 10 percent sure?" Hughes said. "Because when all is said and done, someone still has to call the shots based on the numbers."