A robust oriented filter-based matching method for multisource, multitemporal remote sensing images
Published in: IEEE Transactions on Geoscience and Remote Sensing, 2023-01, Vol. 61, pp. 1-1
Format: Article
Language: English
Online access: Order full text
Abstract: The accurate matching of multisource, multitemporal remote sensing images is challenging because of significant nonlinear intensity differences (NIDs) and severe geometric distortions. To address these problems, we developed a robust image matching method: oriented filter-based matching (OFM). OFM is insensitive to NIDs while exhibiting scale and rotational invariance. First, salient feature points with multiscale attributes were detected in the Gaussian-scale space of the input images. Then, the images were convolved with multi-oriented filters, and unified feature maps were constructed by extracting orientation indices with effective data-pooling operations. The constructed feature maps were highly resistant to NIDs. Five filters were integrated into the OFM framework to investigate their applicability in different application scenarios. Next, a novel rotation-invariant feature descriptor was constructed using a dominant-direction determination approach and a descriptor-grouping strategy. The dominant-direction determination approach enables accurate estimation of the dominant direction, whereas the descriptor-grouping strategy improves the stability of the method under different rotation angles. Finally, brute-force matching was implemented to obtain initial matches, and an improved mismatch-elimination method was used to identify reliable putative matches. To evaluate the performance of OFM, we created a large dataset comprising 4,427 pairs of multitemporal optical-optical, optical-synthetic aperture radar (SAR), optical-infrared, and optical-depth images. OFM outperformed state-of-the-art methods in terms of the number of correct matches, recall, inlier ratio, root mean square error, and success rate. Our implementation is publicly available.
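For readers who want a concrete picture of the orientation-index feature maps described in the abstract, the following is a minimal sketch in Python/NumPy, not the authors' code. It assumes a simple bank of rotated derivative-of-Gaussian filters, eight orientations, and argmax pooling over the filter responses; the actual OFM framework integrates five different oriented filters and further steps (Gaussian-scale-space detection, rotation-invariant descriptors, mismatch elimination) that are not shown here.

```python
# Minimal sketch (illustrative assumptions, not the OFM implementation):
# build an orientation-index feature map by convolving an image with a small
# bank of oriented derivative-of-Gaussian filters and recording, per pixel,
# the index of the maximally responding orientation.

import numpy as np
from scipy import ndimage


def oriented_filter_bank(num_orientations=8, sigma=1.5, size=9):
    """Return derivative-of-Gaussian kernels rotated to evenly spaced angles."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    kernels = []
    for k in range(num_orientations):
        theta = k * np.pi / num_orientations
        # Differentiate the Gaussian along the rotated x-axis.
        xr = xx * np.cos(theta) + yy * np.sin(theta)
        g = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
        kernels.append(-xr / sigma**2 * g)
    return kernels


def orientation_index_map(image, num_orientations=8):
    """Convolve with each oriented filter and max-pool over orientations.

    The resulting index map reflects local structure orientation rather than
    absolute intensity, which is why such feature maps are largely resistant
    to nonlinear intensity differences between sensors.
    """
    image = image.astype(np.float32)
    responses = np.stack(
        [np.abs(ndimage.convolve(image, k, mode="nearest"))
         for k in oriented_filter_bank(num_orientations)],
        axis=0,
    )
    return np.argmax(responses, axis=0).astype(np.uint8)


if __name__ == "__main__":
    # Toy usage: two synthetic views of the same scene under different
    # intensity mappings yield largely agreeing orientation-index maps.
    rng = np.random.default_rng(0)
    scene = ndimage.gaussian_filter(rng.random((128, 128)), 3)
    optical = scene
    pseudo_sar = np.log1p(5 * scene)  # nonlinear intensity remapping
    agree = np.mean(
        orientation_index_map(optical) == orientation_index_map(pseudo_sar)
    )
    print(f"orientation-index agreement: {agree:.2%}")
```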
ISSN: 0196-2892, 1558-0644
DOI: 10.1109/TGRS.2023.3288531