RADANet: Road Augmented Deformable Attention Network for Road Extraction from Complex High-Resolution Remote-Sensing Images

Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, pp. 1-1
Authors: Dai, Ling; Zhang, Guangyun; Zhang, Rongting
Format: Article
Language: English
Description
Abstract: Extracting roads from complex high-resolution remote sensing images to update road networks has become a recent research focus. Properly exploiting the contextual spatial correlation and topological structure of roads to improve extraction accuracy is a challenge in increasingly complex road environments. In this paper, inspired by prior knowledge of road shape and by progress in deformable convolution, we propose a road augmented deformable attention network (RADANet) to learn long-range dependencies for specific road pixels. We develop a road augmentation module (RAM) that captures the semantic shape information of roads from four strip convolutions. The deformable attention module (DAM) combines the sparse sampling capability of deformable convolution with a spatial self-attention mechanism; integrating RAM enables DAM to extract road features more specifically. In RADANet, RAM is placed after the fourth encoder stage, and DAMs are placed between the last four encoder stages and the decoder to extract multi-scale road semantic information. Comprehensive experiments on representative public datasets (the DeepGlobe and CHN6-CUG road datasets) demonstrate that RADANet compares favorably with state-of-the-art methods.
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3237561
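
To make the abstract's road augmentation module (RAM) more concrete, here is a minimal PyTorch sketch, not the authors' code: it applies four directional strip convolutions (horizontal, vertical, and two diagonals) and fuses them into a gating map that re-weights the input features. The class name StripConvRAM, the kernel length k = 9, the masked-kernel realization of the diagonal strips, and the residual gating at the end are all assumptions for illustration; the paper only states that RAM captures road shape from four strip convolutions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StripConvRAM(nn.Module):
    """Sketch of a road augmentation module: four directional strip
    convolutions whose fused response re-weights features toward
    road-shaped (long, thin) context."""

    def __init__(self, channels: int, k: int = 9):
        super().__init__()
        # Horizontal (1 x k) and vertical (k x 1) strips.
        self.conv_h = nn.Conv2d(channels, channels, (1, k), padding=(0, k // 2))
        self.conv_v = nn.Conv2d(channels, channels, (k, 1), padding=(k // 2, 0))
        # Diagonal strips, approximated here by k x k kernels masked to
        # the main and anti-diagonal (an assumption; the paper does not
        # specify how its diagonal strips are realized).
        self.conv_d1 = nn.Conv2d(channels, channels, k, padding=k // 2)
        self.conv_d2 = nn.Conv2d(channels, channels, k, padding=k // 2)
        eye = torch.eye(k)
        self.register_buffer("mask_d1", eye.view(1, 1, k, k))
        self.register_buffer("mask_d2", eye.flip(-1).view(1, 1, k, k))
        # 1x1 fusion of the four directional responses into a road map.
        self.fuse = nn.Conv2d(4 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv_h(x)
        v = self.conv_v(x)
        # Mask the diagonal kernels at run time so only weights on the
        # diagonal contribute, giving a diagonal "strip" receptive field.
        d1 = F.conv2d(x, self.conv_d1.weight * self.mask_d1,
                      self.conv_d1.bias, padding=self.conv_d1.padding)
        d2 = F.conv2d(x, self.conv_d2.weight * self.mask_d2,
                      self.conv_d2.bias, padding=self.conv_d2.padding)
        road = torch.sigmoid(self.fuse(torch.cat([h, v, d1, d2], dim=1)))
        # Residual re-weighting: emphasize road-like responses.
        return x + x * road


if __name__ == "__main__":
    feats = torch.randn(1, 64, 32, 32)    # e.g., stage-4 encoder features
    print(StripConvRAM(64)(feats).shape)  # torch.Size([1, 64, 32, 32])
```

In the paper's design, the output of a module like this would feed the deformable attention module (DAM), whose sparse, offset-based sampling then attends to these road-augmented features at the last four encoder stages.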