With hundreds dead and millions either left homeless or facing huge damage to their properties, catastrophic flooding has devastated many parts of the world. Nature's fury may seem unstoppable, but can technology at least help us to cope better?
Savage hurricanes and record rains have struck around the world, bringing chaos to many countries and regions, including Bangladesh, India, the Caribbean, China and Texas in the US.
Each year on average, flooding affects 96.9 million people worldwide, and causes $13.7bn (£10bn) in damage, according to the United Nations Office for Disaster Risk Reduction.
Those figures are likely to rise dramatically after this year’s deluges.
“There are now many indications that the incidence of storms and persistent rainfall events is increasing with climate change,” says Dr Justin Butler, chief executive of Brighton-based flood risk assessment firm Ambiental.
But more accurate data from satellites and ground-based sensors, coupled with supercomputer modelling and machine learning, is giving us a clearer picture of which areas are likely to be most affected, says Dr Butler.
“Tens of thousands of simulations need to be run to capture a range of possible events,” he explains, “so the number of calculations that have to be carried out every second runs into the billions.”
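The brute-force scale Dr Butler describes is essentially an ensemble, or Monte Carlo, approach: simulate many possible storms and count how often a flood-triggering threshold is breached. A minimal sketch in Python, in which the rainfall distribution and its parameters are invented purely for illustration and have nothing to do with Ambiental's actual models:

```python
import random

def estimate_flood_probability(threshold_mm, n_simulations=10_000, seed=42):
    """Monte Carlo sketch: sample synthetic storm rainfall totals and
    estimate how often they exceed a flood-triggering threshold.
    The log-normal rainfall distribution is illustrative, not
    calibrated to any real catchment."""
    rng = random.Random(seed)
    exceedances = 0
    for _ in range(n_simulations):
        # mu/sigma are arbitrary illustrative values (median ~33 mm)
        rainfall = rng.lognormvariate(mu=3.5, sigma=0.6)
        if rainfall > threshold_mm:
            exceedances += 1
    return exceedances / n_simulations

# Estimated annual chance that a storm delivers more than 100 mm
prob = estimate_flood_probability(threshold_mm=100)
```

In a production system each "simulation" is itself a full hydraulic model run over a terrain grid, which is where the billions of calculations per second come from.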
This kind of modelling is helping authorities plan flood defences more effectively, insurers price risk more accurately, emergency services improve how they respond, and homeowners take better protective measures.
Ambiental’s “digital terrain models”, built up using Lidar (light detection and ranging) laser technology and other data, map how water flows across urban and rural landscapes.
Its Flowroute modelling engine crunches all the data from sensors on land and in the sky – as well as historic data – to simulate complex flood flow patterns and make predictions.
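The core idea of routing water over a terrain model can be shown with the classic D8 steepest-descent rule: each grid cell passes its water to the lowest of its eight neighbours until it reaches a sink where water pools. This is a deliberately tiny sketch of that general technique, not a description of how Flowroute itself works:

```python
def lowest_neighbour(grid, r, c):
    """Return the steepest-descent neighbour of cell (r, c) in a
    D8 sense, or None if the cell is a local minimum (a sink)."""
    rows, cols = len(grid), len(grid[0])
    best = None
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] < grid[r][c]:
                if best is None or grid[nr][nc] < grid[best[0]][best[1]]:
                    best = (nr, nc)
    return best

def trace_flow(grid, r, c):
    """Follow water downhill from (r, c) until it reaches a sink."""
    path = [(r, c)]
    while (nxt := lowest_neighbour(grid, r, c)) is not None:
        r, c = nxt
        path.append(nxt)
    return path

# Toy 3x3 terrain, elevations in metres; water released at top-left
terrain = [
    [9, 8, 7],
    [8, 5, 6],
    [7, 4, 2],
]
path = trace_flow(terrain, 0, 0)  # ends at the lowest corner cell
```

Real engines add rainfall inputs, flow volumes, timing and two-dimensional spreading, but the terrain-driven routing step is the same in spirit.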
All this extra data and computing power is helping insurance companies improve their pricing.
Liz Mitchell, founder of flood insurance specialist Flood Assist, says: “Recent flood events have seen insurers invest in more sophisticated data, which is allowing them to assess the flood risk of an individual property rather than that of a postcode.
“You can have many properties registered at a postcode, and some could literally be at the top of a hill and others at the bottom – they would represent very different risks.”
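The pricing difference Ms Mitchell describes follows from simple expected-loss arithmetic: a premium reflects the annual probability of flooding times the likely damage, plus a loading for expenses and profit. A hedged illustration, with every figure invented:

```python
def annual_premium(flood_prob, expected_damage, loading=1.3):
    """Toy risk-based premium: expected annual loss times an
    expense/profit loading. All figures are illustrative only."""
    return flood_prob * expected_damage * loading

# Two houses in the same postcode: one on the hilltop, one by the river
hilltop = annual_premium(flood_prob=0.002, expected_damage=40_000)
riverside = annual_premium(flood_prob=0.05, expected_damage=40_000)

# Postcode-level pricing would charge both something like the average,
# overcharging the hilltop house and undercharging the riverside one
postcode_average = (hilltop + riverside) / 2
```

Per-property flood data lets the insurer charge each house its own line, which is why more granular modelling can push some premiums down and others up.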
In the 2011 Brisbane floods, Ambiental’s model was 95% accurate, claims Dr Butler, correctly predicting flooding in 19 out of every 20 flooded properties.
More accurate modelling means some householders in previously “high risk” areas could see their premiums coming down.
But other householders, of course, could see their premiums rising after a reassessment.
“The use of data varies greatly from insurer to insurer. The more sophisticated the data, the more accurate – theoretically – their pricing should be,” says Ms Mitchell.
Given that the cost of winter flooding in the UK last year was estimated to be £1.3bn, insurers are desperate to get a clearer picture of their likely liabilities.
More extreme weather needs more sophisticated monitoring, which is why Nasa has funded the Global Flood Monitoring System (GFMS), an experimental online computer program run by the University of Maryland.
It can provide near real-time flood analysis by combining data from satellites with a sophisticated land surface model that incorporates vegetation cover, soil type and terrain.
If water cannot soak away into the land, flood surges are more probable, and knowing where these are likely to hit is crucial for saving lives.
Organisations like the Red Cross and the UN World Food Programme are already using…