Image registration for InISAR based on joint translational motion compensation

Full description

Bibliographic details
Published in: IET Radar, Sonar & Navigation, 2017-10, Vol. 11 (10), p. 1597-1603
Authors: Wu, Wenzhen, Hu, Pengjiang, Xu, Shiyou, Chen, Zengping, Chen, Jian
Format: Article
Language: English
Description
Abstract: The registration of inverse synthetic aperture radar (ISAR) images is a key step in interferometric ISAR (InISAR) imaging. According to the published literature, there are mainly two approaches to ISAR image registration: one based on the correlation coefficient and the other on parameter estimation of the target's angular motion. Although both produce effective simulation results, their defects still need to be studied and resolved. Departing from these earlier methods, the authors try a new approach to ISAR image registration in this study and obtain better results. The new approach is based on joint translational motion compensation, which consists of two steps, namely joint range alignment and joint phase autofocus. The first step realises registration along the range direction and the second along the cross-range direction. The new method achieves ISAR image registration as part of translational motion compensation, so no extra computation is needed. In addition to high computational efficiency, the new method is more precise than earlier methods and works well even under strong noise. Simulation results show the advantages of the proposed method in computational efficiency, precision, robustness and practicability.
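
To make the joint range alignment step concrete, below is a minimal sketch, assuming two channels of range-compressed profiles (pulses x range bins) from the two InISAR antennas. The function name joint_range_alignment, the envelope-sum reference and the integer circular-shift model are illustrative assumptions, not the authors' implementation; the point taken from the abstract is that one shift per pulse is estimated jointly and applied identically to both channels, so the two images stay registered along range.

import numpy as np

def joint_range_alignment(profiles_a, profiles_b, ref_index=0):
    # profiles_a, profiles_b: complex arrays of shape (num_pulses, num_range_bins)
    # holding range-compressed profiles from the two receiving antennas.
    # A single integer range shift per pulse is estimated from the summed
    # envelopes of both channels and applied identically to each channel.
    num_pulses, _ = profiles_a.shape
    ref = np.abs(profiles_a[ref_index]) + np.abs(profiles_b[ref_index])
    aligned_a = np.empty_like(profiles_a)
    aligned_b = np.empty_like(profiles_b)
    for p in range(num_pulses):
        env = np.abs(profiles_a[p]) + np.abs(profiles_b[p])
        # circular cross-correlation via FFT; the peak location gives the common shift
        corr = np.fft.ifft(np.fft.fft(ref) * np.conj(np.fft.fft(env)))
        shift = int(np.argmax(np.abs(corr)))
        aligned_a[p] = np.roll(profiles_a[p], shift)
        aligned_b[p] = np.roll(profiles_b[p], shift)
    return aligned_a, aligned_b

In the same spirit, the joint phase autofocus step described in the abstract would presumably estimate a single phase-error sequence from both channels and apply it to both, which is what keeps the two images registered along the cross-range direction as well.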
ISSN:1751-8784
1751-8792
DOI:10.1049/iet-rsn.2017.0140