2024 IEEE GRSS Data Fusion Contest - Flood Rapid Mapping
Main authors: | , , , , , , , , , , , , , , |
---|---|
Format: | Dataset |
Language: | English |
Online access: | Order full text |
Abstract: | The Challenge Task: As a result of climate change, extreme hydrometeorological events are becoming increasingly frequent. Flood rapid mapping products play an important role in informing flood emergency response and management. These maps are generated quickly from remote sensing data during or after an event to show the extent of the flooding, providing important information for emergency response and damage assessment. The goal of this challenge is to design and develop a data fusion algorithm that combines multi-source spatial data to classify flood surface water extent, that is, water and non-water areas. Provided data sources include optical and Synthetic Aperture Radar (SAR) remote sensing images as well as a digital terrain model, land use, and water occurrence. The output is a gridded flood map in which each grid cell is labeled water or non-water. How water areas are extracted from a remote sensing image depends largely on the acquisition technology; this data fusion challenge therefore has two tracks representing that variability. Track-1: flood rapid mapping with SAR data. Track-2: flood rapid mapping with optical data. No guidance is given on the method to be used for data fusion and pixel-wise classification; it could be based on a statistical approach, machine learning, or a combination of different approaches. |
---|---|
DOI: | 10.21227/73zj-4303 |
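To illustrate the kind of pixel-wise water/non-water classification the challenge asks for, here is a minimal sketch of a Track-1-style baseline that fuses two of the listed data sources: SAR backscatter (open water typically appears dark, i.e. low backscatter) and the historical water occurrence layer used as a prior. The function name, threshold values, and fusion rule are illustrative assumptions, not part of the contest specification.

```python
import numpy as np

def map_flood_water(sar_db, water_occurrence, thr_db=-15.0, relax_db=3.0, occ_min=0.8):
    """Return a uint8 grid: 1 = water, 0 = non-water.

    Hypothetical baseline, not the contest's reference method:
      sar_db           -- SAR backscatter in dB (2-D array)
      water_occurrence -- historical water occurrence in [0, 1], same shape
    """
    dark = sar_db < thr_db                     # very low backscatter: likely open water
    dark_relaxed = sar_db < thr_db + relax_db  # slightly brighter, still plausibly water
    prior = water_occurrence >= occ_min        # cells that are frequently water historically
    # Fusion rule: dark cells are water; moderately dark cells count as water
    # only where the occurrence prior supports it.
    return (dark | (dark_relaxed & prior)).astype(np.uint8)

# Example on a tiny 2x2 grid:
sar = np.array([[-20.0, -10.0],
                [-13.0,  -5.0]])
occ = np.array([[0.0, 0.9],
                [0.9, 0.9]])
flood_map = map_flood_water(sar, occ)
```

A real entry would of course replace the fixed thresholds with a statistical or machine-learning classifier, as the task description allows, and would also fold in the optical imagery, terrain model, and land-use layers.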