ClouDet: A Dilated Separable CNN-Based Cloud Detection Framework for Remote Sensing Imagery

Bibliographic Details
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 2021, Vol. 14, pp. 9743-9755
Main Authors: Guo, Hongwei; Bai, Hongyang; Qin, Weiwei
Format: Article
Language: English
Online Access: Full text
Abstract
Cloud detection is an essential step in optical remote sensing image processing: clouds are pervasive in remote sensing images and hamper downstream tasks such as climate research and object detection. In this article, a lightweight deep-learning-based framework is proposed to detect clouds in remote sensing imagery. First, a multi-feature fusion strategy is designed to extract learnable manual features and convolutional features from the visible and near-infrared bands. Second, a lightweight fully convolutional neural network (ClouDet), built around a microarchitecture named the dilated separable convolutional module, extracts multiscale contextual information and gradually recovers a segmentation map of the same size as the input image; its larger receptive field, fewer parameters, and lower computational complexity make it well suited to large-scale cloud detection. Third, context pooling is designed to correct possible misjudgments. Visual and quantitative comparison experiments on several public cloud detection datasets indicate that the proposed method accurately detects clouds under different conditions and is more effective and accurate than the compared state-of-the-art methods.
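As a rough illustration of the abstract's central building block, below is a minimal PyTorch sketch of a dilated separable convolution: a dilated depthwise 3x3 convolution (widening the receptive field without adding parameters) followed by a pointwise 1x1 convolution (mixing channels). The class name, layer ordering, and hyperparameters are assumptions for illustration, not taken from the paper.

import torch
import torch.nn as nn

class DilatedSeparableConv(nn.Module):
    # Hypothetical sketch: dilated depthwise 3x3 conv followed by a
    # pointwise 1x1 conv. With kernel_size=3 and padding=dilation,
    # the spatial size of the feature map is preserved.
    def __init__(self, in_ch, out_ch, dilation=2):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size=3,
                                   padding=dilation, dilation=dilation,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

# Example: a 4-band (visible + near-infrared) patch, as in the abstract.
x = torch.randn(1, 4, 256, 256)
y = DilatedSeparableConv(4, 32, dilation=2)(x)
print(y.shape)  # torch.Size([1, 32, 256, 256])

Compared with a standard dilated 3x3 convolution (in_ch x out_ch x 9 weights), the factorized form uses in_ch x 9 + in_ch x out_ch weights, which is the parameter saving the abstract alludes to.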
ISSN: 1939-1404, 2151-1535
DOI: 10.1109/JSTARS.2021.3114171