A Hierarchical Consensus Attention Network for Feature Matching of Remote Sensing Images
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2022, Vol. 60, pp. 1-11
Authors:
Format: Article
Language: English
Abstract: Feature matching, which refers to establishing highly reliable correspondences between two or more scenes with overlapping regions, is of great significance to various remote sensing (RS) tasks, such as panorama mosaicking and change detection. In this work, we propose an end-to-end deep network for mismatch removal, named the hierarchical consensus attention network (HCA-Net); mismatch removal is one of the critical steps in the matching pipeline. Unlike existing practices, HCA-Net does not rely on global geometric constraints or handcrafted structural representations. The key principle of the proposed HCA-Net is to adaptively enhance neighborhood consensus before evaluating correspondences. To this end, we design a consensus attention mechanism that regularizes sparse matches directly. More specifically, consensus attention consists of two novel operations: an encoder-decoder module for calculating compatibility scores and a context-based density representation module. This attention mechanism can be easily plugged into an existing inlier/outlier classification model in a stacked manner to reject outliers. We also propose a hierarchical global-aware network to further improve the accuracy of outlier detection. We compare the proposed HCA-Net with seven state-of-the-art algorithms on several datasets (including various RS images), and the results show that our method significantly outperforms the other competitors.
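The abstract describes stacking consensus-attention blocks in front of an inlier/outlier classification head that labels each putative correspondence. The sketch below is a minimal, hypothetical PyTorch-style illustration of that structure only; the module names, feature dimensions, and the exact attention formulation are assumptions and not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): a consensus-attention block
# stacked before a per-correspondence inlier/outlier classifier.
import torch
import torch.nn as nn


class ConsensusAttention(nn.Module):
    """Re-weights per-correspondence features by a learned compatibility score."""

    def __init__(self, dim: int = 128):
        super().__init__()
        # Encoder-decoder MLP producing a scalar compatibility score per match
        # (assumed form of the compatibility-score module in the abstract).
        self.encoder = nn.Sequential(nn.Linear(dim, dim // 2), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(dim // 2, dim), nn.ReLU(),
                                     nn.Linear(dim, 1))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, N, dim) features of N putative correspondences.
        score = torch.sigmoid(self.decoder(self.encoder(feats)))  # (B, N, 1)
        # Score-weighted global context: enhance correspondences that agree
        # with the consensus of the set (assumed stand-in for the
        # context-based density representation).
        context = (score * feats).sum(dim=1, keepdim=True) / \
                  (score.sum(dim=1, keepdim=True) + 1e-8)       # (B, 1, dim)
        return feats + score * context  # residual consensus enhancement


class InlierClassifier(nn.Module):
    """Stacks consensus-attention blocks before a per-match logit head."""

    def __init__(self, dim: int = 128, num_blocks: int = 3):
        super().__init__()
        self.embed = nn.Linear(4, dim)  # (x1, y1, x2, y2) of each putative match
        self.blocks = nn.ModuleList([ConsensusAttention(dim) for _ in range(num_blocks)])
        self.head = nn.Linear(dim, 1)

    def forward(self, matches: torch.Tensor) -> torch.Tensor:
        # matches: (B, N, 4) putative correspondences; returns (B, N) inlier logits.
        feats = self.embed(matches)
        for block in self.blocks:
            feats = block(feats)
        return self.head(feats).squeeze(-1)


if __name__ == "__main__":
    model = InlierClassifier()
    logits = model(torch.randn(2, 500, 4))        # 500 putative matches per image pair
    inlier_mask = torch.sigmoid(logits) > 0.5     # correspondences kept as inliers
    print(logits.shape, inlier_mask.float().mean().item())
```

The stacked blocks mirror the "plugged in ... in a stacked way" wording: each block re-weights features by compatibility before the final head rejects outliers.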
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2022.3165222