Neural network application for cloud detection in SPOT VEGETATION images

Bibliographic Details
Published in: International Journal of Remote Sensing, 2006-02, Vol. 27 (4), p. 719-736
Authors: Jang, Jae-dong; Viau, Alain A.; Anctil, François; Bartholomé, Etienne
Format: Article
Language: English
Abstract: SPOT VEGETATION is a recent 1 km resolution sensor for land surface studies. Cloud detection with this sensor is complicated by the absence of a thermal band. An artificial neural network was therefore trained for cloud detection on atmospherically corrected S1 daily data and on top-of-atmosphere reflectance P data from the SPOT VEGETATION system. The network is a multi-layer perceptron with one hidden sigmoid layer, trained with the Levenberg-Marquardt back-propagation algorithm and generalized by Bayesian regularization. Two network configurations yielded optimal cloud detection: the first used all four bands of S1 data with 13 hidden nodes, and the second used all four bands of P data with 11 hidden nodes. The multi-layer perceptrons achieved cloud detection accuracies of 98.0% and 97.6% for S1 and P data, respectively, when trained to map three predefined values that classify cloud, water and land. The network was further evaluated on three SPOT VEGETATION images acquired on different dates. It detected not only bright thick clouds but also thin or less bright clouds. The analysis demonstrated the superior classification of the network over the standard cloud masks provided with the data.
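
The architecture described in the abstract (four band inputs, one sigmoid hidden layer with 13 nodes for S1 data, a single output mapped to three predefined class values, fitted by Levenberg-Marquardt) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the specific target values, data shapes and the use of scipy.optimize.least_squares(method="lm") as the Levenberg-Marquardt solver are assumptions, and the paper's Bayesian regularization is omitted.

    # Minimal sketch of the MLP from the abstract (assumptions noted above).
    import numpy as np
    from scipy.optimize import least_squares

    N_IN, N_HID = 4, 13                 # four VEGETATION bands, 13 hidden nodes (S1 case)
    TARGETS = {"cloud": 1.0, "land": 0.0, "water": -1.0}  # hypothetical predefined values

    def unpack(w):
        """Split the flat parameter vector into layer weights and biases."""
        i = 0
        W1 = w[i:i + N_IN * N_HID].reshape(N_HID, N_IN); i += N_IN * N_HID
        b1 = w[i:i + N_HID]; i += N_HID
        W2 = w[i:i + N_HID]; i += N_HID
        b2 = w[i]
        return W1, b1, W2, b2

    def forward(w, X):
        """One sigmoid hidden layer, linear output, as in the abstract."""
        W1, b1, W2, b2 = unpack(w)
        h = 1.0 / (1.0 + np.exp(-(X @ W1.T + b1)))
        return h @ W2 + b2

    def residuals(w, X, y):
        # Levenberg-Marquardt minimizes the sum of squared residuals.
        return forward(w, X) - y

    # Placeholder data: X holds per-pixel band reflectances, y the class values.
    rng = np.random.default_rng(0)
    X = rng.random((200, N_IN))
    y = rng.choice(list(TARGETS.values()), size=200)
    w0 = rng.normal(scale=0.1, size=N_IN * N_HID + N_HID + N_HID + 1)
    fit = least_squares(residuals, w0, args=(X, y), method="lm")
    pred = forward(fit.x, X)  # classify each pixel by its nearest target value

In use, each pixel would be assigned the class whose predefined value is closest to the network output, which is how a regression-style MLP can act as a three-class (cloud/water/land) classifier.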
ISSN: 0143-1161, 1366-5901
DOI: 10.1080/01431160500106892