Oil Spill Identification based on Dual Attention UNet Model Using Synthetic Aperture Radar Images

Detailed Description

Bibliographic Details
Published in: Journal of the Indian Society of Remote Sensing, 2023, Vol. 51 (1), pp. 121-133
Main authors: Mahmoud, Amira S.; Mohamed, Sayed A.; El-Khoriby, Reda A.; AbdelSalam, Hisham M.; El-Khodary, Ihab A.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Oil spills cause tremendous damage to marine and coastal environments and ecosystems. Previous deep learning-based studies have treated oil spill detection as a semantic segmentation problem. However, further improvement is still required to address the noisy nature of Synthetic Aperture Radar (SAR) imagery, which limits segmentation performance. In this study, a new deep learning model based on the Dual Attention Model (DAM) is developed to automatically detect oil spills in a water body. We enhance a conventional UNet segmentation network by integrating a DAM to selectively highlight the relevant and discriminative global and local characteristics of oil spills in SAR imagery. The DAM is composed of a Channel Attention Map and a Position Attention Map, which are stacked in the decoder network of the UNet. The proposed DAM-UNet is compared with four baselines: a fully convolutional network, PSPNet, LinkNet, and the traditional UNet, and outperforms all four, as demonstrated empirically. The experiments use the EG-Oil Spill dataset, a large set of SAR images comprising 3000 image pairs. The overall accuracy of the proposed method increases by 3.2% over that of the traditional UNet, reaching 94.2%. The study opens new development ideas for integrating attention modules into other deep learning tasks, including machine translation, image-based analysis, action recognition, and speech recognition.
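The abstract describes the DAM as a Channel Attention Map and a Position Attention Map stacked in the UNet decoder. As a rough illustration of how such a module operates, the sketch below implements a simplified, DANet-style dual attention in NumPy: position attention reweights each spatial location by its similarity to all other locations, channel attention does the same across channels, and the two branches are sum-fused. The learned projection convolutions and the learnable scaling parameter of the published design are omitted here; the function names and the sum fusion are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_attention(x):
    # x: (C, H, W) feature map; attends over the H*W spatial positions.
    C, H, W = x.shape
    flat = x.reshape(C, H * W)            # (C, N)
    energy = flat.T @ flat                # (N, N) pairwise position similarity
    attn = softmax(energy, axis=-1)       # each row: weights over all positions
    out = flat @ attn.T                   # reweight every position by all others
    return out.reshape(C, H, W) + x      # residual connection

def channel_attention(x):
    # Same idea, but the attention map is over channels instead of positions.
    C, H, W = x.shape
    flat = x.reshape(C, H * W)            # (C, N)
    energy = flat @ flat.T                # (C, C) pairwise channel similarity
    attn = softmax(energy, axis=-1)
    out = attn @ flat                     # mix channels by their similarity
    return out.reshape(C, H, W) + x

def dual_attention(x):
    # Sum-fuse the two branches (an assumed fusion; the paper stacks
    # the maps inside the UNet decoder).
    return position_attention(x) + channel_attention(x)
```

In the full model these operations would run on decoder feature maps and be trained end to end; the sketch only shows why the two maps are complementary: one models long-range spatial context, the other inter-channel dependencies.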
ISSN: 0255-660X, 0974-3006
DOI: 10.1007/s12524-022-01624-6