Fusion of the Multisource Datasets for Flood Extent Mapping Based on Ensemble Convolutional Neural Network (CNN) Model
Published in: Journal of Sensors, 2022-03, Vol. 2022, p. 1-20
Main authors: , , , , , ,
Format: Article
Language: English
Online access: Full text
Summary: Floods are natural hazards that can affect the environment, damage infrastructure, and threaten human lives. Due to climate change and anthropogenic activities, floods occur with high frequency all over the world. Therefore, mapping flooded areas is of prime importance in disaster management. This research presents a novel framework for flood area mapping based on heterogeneous remote sensing (RS) datasets. The proposed framework fuses synthetic aperture radar (SAR), optical, and altimetry datasets to map flooded areas, and it is applied in three main steps: (1) preprocessing, (2) deep feature extraction based on multiscale residual kernel convolution and convolutional neural network (CNN) parameter optimization by fusing the datasets, and (3) flood detection based on the trained model. This research exploits two large-scale datasets to map the flooded areas in Golestan and Khuzestan provinces, Iran. The results show that the proposed methodology performs well in flood area detection. Visual and numerical analyses verify the effectiveness of the proposed method, which detects flooded areas with an overall accuracy (OA) higher than 98% in both study areas. Finally, the efficiency of the designed architecture was verified by comparison with hybrid-CNN and 3D-CNN methods.
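The abstract does not specify the internals of the "multiscale residual kernel convolution" step, so the following is only a minimal PyTorch sketch of one common interpretation: parallel convolutions with different kernel sizes applied to the fused multi-band input, merged and combined with a residual shortcut. All class names, channel counts, and kernel sizes here are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch of a multiscale residual convolution block,
# assuming "multiscale residual kernel convolution" means parallel
# convolutions of several kernel sizes plus a skip connection.
import torch
import torch.nn as nn

class MultiscaleResidualBlock(nn.Module):
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # Parallel 3x3, 5x5, and 7x7 branches extract features at
        # several spatial scales from the fused input stack.
        self.branches = nn.ModuleList([
            nn.Conv2d(in_channels, out_channels, kernel_size=k, padding=k // 2)
            for k in (3, 5, 7)
        ])
        # 1x1 convolution merges the concatenated multiscale features.
        self.merge = nn.Conv2d(3 * out_channels, out_channels, kernel_size=1)
        # 1x1 projection so the residual shortcut matches channel counts.
        self.shortcut = nn.Conv2d(in_channels, out_channels, kernel_size=1)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multiscale = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.act(self.merge(multiscale) + self.shortcut(x))

# Usage: a fused SAR + optical + altimetry stack treated as one
# multi-band image (7 channels chosen arbitrarily for illustration).
if __name__ == "__main__":
    x = torch.randn(1, 7, 64, 64)          # batch, bands, height, width
    block = MultiscaleResidualBlock(7, 32)
    print(block(x).shape)                   # torch.Size([1, 32, 64, 64])
```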
ISSN: 1687-725X, 1687-7268
DOI: 10.1155/2022/2887502