U-SeqNet: learning spatiotemporal mapping relationships for multimodal multitemporal cloud removal

Bibliographic Details
Published in: GIScience & Remote Sensing, 2024-12, Vol. 61 (1)
Authors: Zhang, Qian; Liu, Xiangnan; Peng, Tao; Yang, Xiao; Tang, Mengzhen; Zou, Xinyu; Liu, Meiling; Wu, Ling; Zhang, Tingwei
Format: Article
Language: English
Online access: Full text
Abstract: Optical remotely sensed time series data have various key applications in Earth surface dynamics. However, cloud cover significantly hampers data analysis and interpretation. Although synthetic aperture radar (SAR)-to-optical image translation techniques have emerged as a promising solution, their effectiveness is diminished by their inability to adequately account for the intertwined nature of the temporal and spatial dimensions. This study introduces U-SeqNet, an innovative model that integrates U-Net and Sequence-to-Sequence (Seq2Seq) architectures. Leveraging a pioneering spatiotemporal teacher forcing strategy, U-SeqNet excels in adapting and reconstructing data, capitalizing on available cloud-free observations to improve accuracy. Rigorous assessments through No Reference and Full Reference Image Quality Assessments (NR-IQA and FR-IQA) affirm U-SeqNet's exceptional performance, marked by a Natural Image Quality Evaluator (NIQE) score of 5.85 and a Mean Absolute Error (MAE) of 0.039. These results underline U-SeqNet's exceptional capabilities in image reconstruction and its potential to improve remote sensing analysis by enabling more accurate and efficient multimodal and multitemporal cloud removal techniques.
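As a point of reference for the FR-IQA result quoted above, the Mean Absolute Error is simply the mean per-pixel absolute difference between a reconstructed image and its cloud-free reference. The sketch below is a minimal illustration of that metric, not the paper's evaluation code; the toy arrays are hypothetical, and NIQE (the NR-IQA metric) is a separate, more involved statistical model not shown here.

```python
import numpy as np

def mae(reconstructed: np.ndarray, reference: np.ndarray) -> float:
    """Mean Absolute Error between two images scaled to [0, 1]."""
    return float(np.mean(np.abs(reconstructed - reference)))

# Toy 2x2 single-band "images" (illustrative values, not the paper's data).
ref = np.array([[0.2, 0.4], [0.6, 0.8]])
rec = np.array([[0.25, 0.35], [0.65, 0.75]])
print(mae(rec, ref))  # 0.05
```

On reflectance data normalized to [0, 1], the reported MAE of 0.039 corresponds to an average per-pixel deviation of roughly 4% of the value range.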
ISSN: 1548-1603, 1943-7226
DOI: 10.1080/15481603.2024.2330185