HPC success stories
Predicting flood impact for the state of Iowa
More than 1,600 communities across the state are at risk for flooding, and researchers at the UI-based Iowa Flood Center are using high-performance computing (HPC) to improve flood monitoring and prediction. With complex mathematical algorithms, the HPC cluster produces projections every 15 minutes of what river levels in each town will be for the next five days.
Severe flooding of the Iowa and Cedar Rivers in 2008 was the catalyst for the center, established with support from the National Science Foundation and the state legislature. A portion of the funding was invested in Helium. A website provides access to the Iowa Flood Information System, which features inundation maps, real-time flood information, forecasts, and interactive visualizations.
Prediction models mimic the aggregation of water, taking into account rainfall, water levels, soil type, information about land use and vegetation, and where crops are in the growth cycle. Different models also factor in unknowns that influence flooding—for example, researchers lack data on soil saturation, which makes it hard to say how much rainfall will soak into the ground and how much will run off into the rivers.
Contributing to the challenge of predicting river levels are the state’s size and the need to produce timely forecasts. Iowa consists of over 56,000 square miles of land and 3 million water pathways, and new rainfall data comes through every five minutes. HPC is necessary to make calculations for such an immense area at a pace fast enough to keep up with rapidly changing circumstances.
“There is no operational system in the world as detailed as ours,” says Flood Center Director Witold Krajewski, a professor of civil and environmental engineering at the UI. “It’s fair to say that if not for the HPC cluster, we wouldn’t be doing the work we are doing.”
Modeling climate change
One UI researcher using HPC is Pablo Saide, who is pursuing a Ph.D. in environmental engineering. He is from Santiago, Chile, where air pollution is a major concern. The surrounding Andes Mountains and seasonal weather conditions cause vehicle and industrial emissions to linger, and the local government uses predictions to declare air pollution episodes. If an episode is imminent, measures are taken to reduce smog and people are encouraged to stay indoors. But often it’s too little too late.
“Forecasts are conducted 24 hours in advance, but by waiting until an episode is imminent, the efforts to prevent the episode don’t do much good,” he says. “Hospitals are full of people with asthma and kids. And people develop cancer and long-term health problems because of the pollution.”
With HPC, Saide is developing computer model simulations that can forecast air pollution three days in advance. Data for the simulations comes from measuring stations around the city that monitor how the plume moves, horizontally and vertically. The air-quality measurements, along with meteorological and forecasting data, feed into models that run quickly on Helium, enabling scientists to generate high-resolution simulations that predict pollution.
Understanding Huntington disease
HPC is also being used by UI neuroscientists as they study changes in the brains of Huntington disease (HD) patients before they begin to experience symptoms. The research could lead to earlier interventions for people diagnosed with the hereditary disorder, which causes widespread brain tissue atrophy, interfering with mobility, memory, speech, and mood.
The PREDICT-HD study involves 1,500 research subjects worldwide. Researchers analyze brain scans from the patients over a 10-year period and apply algorithms to extract measurements that quantify the progression of the disease. Measurements include changes in brain volume, tissue composition, structural size, anatomical regions, and cortical depth. Researchers look at how changes in different regions of the brain correlate with psychiatric, behavioral, and cognitive measures.
Testing each algorithm’s effectiveness takes more than 42 hours of computation per imaging scan session. There are 4,400 data sets to test with each method, and many parameters to modify.
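The numbers above can be put on a rough back-of-the-envelope footing. The sketch below assumes, purely for illustration, that the 42-hour figure applies to each of the 4,400 data sets and that the work splits perfectly across cores (each data set is independent); the article does not spell out exactly how these figures combine.

```python
# Back-of-the-envelope scaling estimate for the PREDICT-HD workload.
# Assumptions (not stated precisely in the article): 42 hours of computation
# per data set, 4,400 data sets, and perfectly parallel (independent) jobs.

HOURS_PER_DATASET = 42
NUM_DATASETS = 4_400

total_core_hours = HOURS_PER_DATASET * NUM_DATASETS  # 184,800 core-hours

def years_sequential(core_hours: float, cores: int = 1) -> float:
    """Wall-clock years to finish the batch running on `cores` cores."""
    return core_hours / cores / 24 / 365

def cores_for_days(core_hours: float, days: float) -> float:
    """Concurrent cores needed to finish the whole batch in `days` days."""
    return core_hours / (24 * days)

print(f"total work:        {total_core_hours:,} core-hours")
print(f"one core:          ~{years_sequential(total_core_hours):.0f} years")
# An assumed 8-core workstation lands in the "two or three years" range
# quoted below, which is one way the figures could be reconciled.
print(f"8-core machine:    ~{years_sequential(total_core_hours, 8):.1f} years")
print(f"one-day turnaround ~{cores_for_days(total_core_hours, 1):,.0f} cores")
```

Under these assumptions, finishing in a single day requires thousands of concurrent cores, which is exactly the kind of throughput a cluster like Helium provides over a desktop machine.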
“Testing the algorithms on a single computer would take two or three years of data processing,” says Hans Johnson, Ph.D., an assistant professor of psychiatry. “Helium allows us to do that in one day.”
For more stories on HPC research and details on UI HPC resources, visit http://hpc.uiowa.edu.