Accurate Flood Forecasting. Presented by IEDRO, the International Environmental Data Rescue Organization
Flooding incidents are all too common. In fact, flooding is the most common natural disaster worldwide, making up 40% of all natural disasters and accounting for 15% of all resulting deaths.
As we all know, the damage caused by flooding can be devastating. In the last decade of the 20th century alone, about 100,000 people were killed by floods. Over a billion people were affected in some way.
What exactly causes flooding? Where and how does it begin?
Flooding starts with river flow. Upstream, at the highest point of a watershed, a series of complicated events can trigger a flood.
First, precipitation hits the basin. This sets in motion a chain of events, causing water to be delayed, or lost, on its journey to the river.
The biggest water loss is caused by evapotranspiration, or the discharge of water from the earth’s surface into the atmosphere. This loss depends on factors like air temperature, wind speed, humidity and radiation from the sun.
Basin characteristics, like slope and the types of vegetation and soil, can slow the flow of water. Ice can also clog river channels.
When river waters enter the ocean, tides can push the water back into the channel, causing further delays.
The entire water cycle affects river flow. Water falls in the form of precipitation, spills over from basins, runs off of mountains and enters the ocean. It reenters the atmosphere through such processes as sublimation, evapotranspiration and condensation. The cycle starts over.
Researchers use two common approaches to predict flood levels. The first is the “statistical” approach. This approach does not consider any of the physical, natural events that take place in and over the watershed. The statistical method is most often used when the basin is small, and the flow is not controlled by reservoirs.
The statistical approach uses stream gauge records, or rainfall records, to calculate the chances of water rising to certain levels.
The other method is the “conceptual” approach. Unlike the statistical method, the conceptual approach looks at the physical events that affect riverflow. This approach is most effective with controlled rivers.
The statistical approach uses extreme value analysis when studying streamflow records. These records have to exist continuously for several years. This way, researchers can ensure the records are accurate.
Structures such as spillways, bridges and culverts are constructed to withstand floods that can happen every 50 to 100 years. During flooding events, forecasters pay particular attention when the floodwaters reach levels higher than past predictions.
When researchers use extreme value analysis, they first sort through all the records they have gathered on the riverflow. They find the highest flow amount recorded for each year, assemble the values in ascending order, and then analyze them using statistical methods.
Researchers assign a probability of occurrence to various water flows, using estimates over 50- to 100-year periods. These probabilities allow researchers to develop flood forecasts.
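The "50- to 100-year flood" designs mentioned above are probabilities, not schedules: a 100-year flood has a 1% chance of being equaled or exceeded in any given year. As a rough illustration (the function name and figures here are our own, not from any forecasting agency), the chance of at least one such flood during a structure's design life can be sketched like this:

```python
def lifetime_risk(return_period_years, design_life_years):
    """Chance that a flood of the given return period is equaled or
    exceeded at least once during the design life, assuming independent
    years with a constant annual exceedance probability."""
    annual_exceedance = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_exceedance) ** design_life_years

# A structure built for the 100-year flood still faces a substantial
# chance of seeing one over a 50-year service life (about 40%).
risk = lifetime_risk(100, 50)
```

This is why forecasters watch closely when floodwaters exceed past predictions: a "100-year" design standard does not mean a century of guaranteed safety.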
To review the statistical method: extreme value analysis is used to sort and measure riverflows for each year. The maximum river flows are then recorded and put in ascending order.
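The steps just reviewed can be sketched in a few lines. The flow values below are hypothetical, and the Weibull plotting-position formula used here is one common convention for assigning exceedance probabilities, not the only one researchers use:

```python
# Hypothetical annual maximum streamflows (cubic meters per second),
# one value per year of record.
annual_max_flows = [820, 1150, 640, 990, 1430, 760, 1210, 880, 1040, 700]

# Assemble the annual maxima in ascending order, as the method describes.
sorted_flows = sorted(annual_max_flows)
n = len(sorted_flows)

# Assign each flow an exceedance probability and a return period.
# With rank m counted from the largest flow down, the Weibull
# plotting-position formula gives P(exceedance) = m / (n + 1),
# and the return period is T = 1 / P.
for rank_from_top, flow in enumerate(reversed(sorted_flows), start=1):
    p_exceed = rank_from_top / (n + 1)
    return_period = 1.0 / p_exceed
    print(f"{flow:5d} m^3/s  P = {p_exceed:.2f}  T = {return_period:.1f} yr")
```

With a longer continuous record, the estimated probabilities for rare, high flows become more reliable, which is why the method insists on records that exist continuously for several years.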
The conceptual method, to review, considers the natural events that happen in and around the watershed. Researchers use computer models to simulate the events, both atmospheric and hydrologic, that occur. Researchers gather information from a large number of weather locations when using this approach.
The conceptual method, like the statistical method, uses streamflow records that are continuous to ensure accuracy.
The conceptual method takes many physical activities and factors into account, such as precipitation and solar radiation. If radiation measurements are not available, researchers will consider humidity, cloud cover, wind speed and temperature.
Evapotranspiration, and other ways water can be lost to the atmosphere, are also considered in the conceptual approach. Stream gauge records are then calibrated, and the computer simulation begins, using the researchers' observations and forecasts. The computer model then estimates expected rises in water levels.
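To make the conceptual approach concrete, here is a toy single-bucket rainfall-runoff model. Everything in it is a simplifying assumption of ours: operational models track far more processes (snowmelt, soil layers, channel routing) and are calibrated against real stream gauge records, but the water-balance idea is the same.

```python
def simulate_streamflow(rainfall_mm, evapotranspiration_mm,
                        storage_mm=0.0, k=0.3):
    """Toy conceptual model: treat the watershed as one 'bucket'.

    Each day, rain adds to basin storage, evapotranspiration removes
    water, and a fixed fraction k of what remains drains to the river
    (a simple linear reservoir). Returns daily streamflow as runoff
    depth in millimeters. The coefficient k is a made-up value.
    """
    flows = []
    for rain, et in zip(rainfall_mm, evapotranspiration_mm):
        storage_mm = max(storage_mm + rain - et, 0.0)  # daily water balance
        runoff = k * storage_mm                        # outflow to the river
        storage_mm -= runoff
        flows.append(runoff)
    return flows

# A wet spell followed by dry days: the simulated flow rises to a peak,
# then recedes as storage drains, like a real hydrograph.
rain = [0, 20, 35, 10, 0, 0, 0]
et = [2, 2, 2, 2, 2, 2, 2]
hydrograph = simulate_streamflow(rain, et)
```

Forecasters run models of this general kind forward using observed and forecast precipitation and evapotranspiration to estimate expected rises in water levels.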
The statistical and conceptual approaches are used for more than just flood prediction. They also aid in the design and operation of facilities like reservoirs and dams. Historical, climatological and stream gauge data all play important roles in the accuracy of both methods.
Many scientists are certain that flooding will increase in the future, causing greater amounts of damage and loss of life. In ten years, Europe may experience more flash flooding. In seventy years, up to 100 million people may be endangered by coastal flooding.
When we study flooding, we better understand its causes, how to predict it and even how to prevent it. We can prepare for the future by studying the past, thus saving the lives of future generations, and possibly even our own.
Two common approaches are used to predict flood levels.
a) Statistical: doesn’t take into account any of the physical events that take place in and over the watershed. This method is best used when the basin is small and the flow is not controlled by reservoirs.