Flood Predictions: A Data User Story

Today, we are back to featuring some data user stories. Thank you to Natasha Randall for sending this post about the importance of public data for her research on flood prediction.
I am a PhD student, and my research focuses on developing deep learning (AI) models for flood prediction. Traditional numerical flood forecasting simulations demand enormous amounts of computation, expertise, money, and time; as a result, many regions lack the resources to develop comprehensive flood maps, often with devastating consequences. AI models, by contrast, are far cheaper and faster to develop, which makes flood prediction much more accessible.
Deep learning models need large volumes of varied data to train on. Each dataset I use provides information the model needs to predict floods effectively, such as the precipitation that fell before an event or the current soil moisture.
The majority of the datasets I use are generated by satellite missions run by ESA and NASA, which openly publish the data. It would be impossible for me to obtain this data for my research without these organisations making it freely available! Open-access datasets also align with my motivation for developing flood prediction methodologies that are accessible to everyone.
Above is a presentation Natasha gave on her research earlier this month. You can check out the data she uses below. And remember, public data are a public good!
- ESA Copernicus Sentinel 2 multispectral satellite data
- ESA Copernicus Sentinel 1 radar satellite data
- ESA Copernicus Digital Elevation Model
- ESA Copernicus Emergency Management Mapping Service
- NASA Global Precipitation Measurements
- NASA Soil Moisture Active Passive Measurements
- OpenStreetMap permanent water vectors
- SoilGrids soil classes and soil bulk density
- HydroSHEDS hydrological maps
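To give a flavour of how satellite data like Sentinel-2's multispectral bands feed into flood mapping, here is a minimal sketch (an illustration, not Natasha's actual pipeline) of the Normalized Difference Water Index (NDWI), a standard green/near-infrared ratio used to flag surface water. Real inputs would be the green (B3) and near-infrared (B8) Sentinel-2 bands; tiny synthetic arrays stand in for them here.

```python
import numpy as np

def ndwi_water_mask(green: np.ndarray, nir: np.ndarray,
                    threshold: float = 0.0) -> np.ndarray:
    """Return a boolean water mask where NDWI = (green - nir) / (green + nir)
    exceeds `threshold`. Water reflects green light more strongly than
    near-infrared, so positive NDWI typically indicates open water."""
    green = green.astype(np.float64)
    nir = nir.astype(np.float64)
    # Guard against division by zero where both bands are zero.
    with np.errstate(divide="ignore", invalid="ignore"):
        ndwi = (green - nir) / (green + nir)
    return ndwi > threshold

# Synthetic 2x2 reflectance values: left column behaves like water
# (green > NIR), right column like land (NIR > green).
green = np.array([[0.30, 0.10],
                  [0.28, 0.12]])
nir = np.array([[0.05, 0.40],
                [0.06, 0.35]])
print(ndwi_water_mask(green, nir))  # left column water, right column land
```

Index-based masks like this are often used as baselines or training labels for the deep learning models the post describes, which learn richer patterns from the same bands plus the other datasets listed above.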