Application of Generative Adversarial Network-Based Inversion Algorithm in Imaging 2-D Lossy Biaxial Anisotropic Scatterer

Published in: IEEE Transactions on Antennas and Propagation, 2022-09, Vol. 70 (9), pp. 8262-8275
Authors: Ye, Xiuzhu; Du, Naike; Yang, Daohan; Yuan, Xujin; Song, Rencheng; Sun, Sheng; Fang, Daining
Format: Article
Language: English
Abstract: An effective quasi-real-time inversion algorithm based on the super-resolution generative adversarial network (SR-GAN) is proposed to quantitatively image 2-D biaxial anisotropic scatterers. The SR-GAN was originally proposed for super-resolution image reconstruction, which closely matches the needs of the inverse scattering problem. In addition, a Visual Geometry Group (VGG) loss is introduced to compare high-level features of the object rather than low-level pixel-wise error measures. The angle-dependent reconstruction artifacts caused by dipole radiation in traditional inversion methods are effectively resolved by the machine learning approach. Numerical results on both synthetic and experimental data validate the effectiveness of the proposed method. Compared with a traditional iterative inversion algorithm, the proposed SR-GAN greatly improves both imaging quality and resolution. Moreover, the computational time is reduced significantly, realizing quasi-real-time imaging and promising potential real-time application of the inverse scattering method.
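The record does not include the paper's network details, but the idea of the VGG loss it mentions can be sketched: instead of a pixel-wise MSE between the reconstructed and true permittivity maps, the generator is penalized on the difference of feature maps produced by a fixed convolutional network, plus a small adversarial term. Below is a minimal, illustrative NumPy sketch; the random 3x3 filter bank is a stand-in for pretrained VGG features, and the function names (`feature_loss`, `srgan_generator_loss`) and the 1e-3 adversarial weight are assumptions, not the paper's actual implementation.

```python
import numpy as np

def pixel_mse(x, y):
    """Low-level pixel-wise error between two images (the baseline measure)."""
    return float(np.mean((x - y) ** 2))

def convolve_bank(img, filters):
    """Valid 3x3 convolution of img with each filter, followed by ReLU.
    A toy stand-in for one VGG feature layer."""
    H, W = img.shape
    maps = []
    for k in filters:
        fm = np.zeros((H - 2, W - 2))
        for i in range(H - 2):
            for j in range(W - 2):
                fm[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
        maps.append(np.maximum(fm, 0.0))
    return np.stack(maps)

def feature_loss(x, y, filters):
    """Perceptual-style (VGG-like) loss: MSE between feature maps
    instead of between raw pixels."""
    return float(np.mean((convolve_bank(x, filters) - convolve_bank(y, filters)) ** 2))

def srgan_generator_loss(recon, target, filters, disc_prob, adv_weight=1e-3):
    """Illustrative SR-GAN-style generator objective: perceptual content
    loss plus a small adversarial term, where disc_prob stands for the
    discriminator's output D(G(x)) on the reconstruction."""
    content = feature_loss(recon, target, filters)
    adversarial = -np.log(max(disc_prob, 1e-12))
    return content + adv_weight * adversarial

rng = np.random.default_rng(0)
target = rng.random((8, 8))          # toy "true" permittivity map
recon = target + 0.1 * rng.random((8, 8))  # imperfect reconstruction
filters = [rng.standard_normal((3, 3)) for _ in range(2)]
loss = srgan_generator_loss(recon, target, filters, disc_prob=0.5)
```

In an actual SR-GAN, `convolve_bank` would be replaced by activations from a pretrained VGG network, and the discriminator would be trained jointly with the generator; the sketch only shows how the two loss terms combine.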
ISSN: 0018-926X, 1558-2221
DOI: 10.1109/TAP.2022.3164198