Dictionary Learning-Based Image Reconstruction for Terahertz Computed Tomography


Bibliographic Details
Published in: Journal of Infrared, Millimeter, and Terahertz Waves, 2021-08, Vol. 42 (8), p. 829-842
Main Authors: Zhong, Fasheng, Niu, Liting, Wu, Weiwen, Liu, Fenglin
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Terahertz computed tomography (THz CT) offers advantages such as penetration of nonmetallic and nonpolar materials and visualization of 3D internal structures. To obtain satisfactory reconstruction results, complete measurements from many different views are necessary. However, this acquisition is time-consuming, and in practice THz CT projections are usually incomplete, which generates artifacts in the final reconstructed images. To address this issue, a dictionary learning-based THz CT reconstruction (DLTR) model is proposed in this study. Specifically, image patches are extracted from images reconstructed by other state-of-the-art methods and used to train an initial dictionary with the K-SVD algorithm. The dictionary is then adaptively updated during THz CT reconstruction, and the updated dictionary is in turn used to further refine the reconstructed images. To verify the accuracy and quality of the DLTR method, filtered back-projection (FBP), the simultaneous algebraic reconstruction technique (SART), and total variation (TV) reconstruction are chosen for comparison. The experimental results show that the DLTR method suppresses noise well while preserving structures.
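As a rough illustration of the reconstruction scheme summarized in the abstract, the following Python sketch alternates a simple algebraic data-fidelity update with patch-wise sparse coding over a learned dictionary. It is only a minimal approximation of the DLTR idea, not the authors' implementation: scikit-learn's MiniBatchDictionaryLearning is used as a stand-in for K-SVD, the forward and back projection operators are hypothetical placeholders for a real THz CT system model, and a plain backprojection step replaces a proper SART update.

# Minimal sketch of a dictionary-regularized iterative CT reconstruction loop,
# loosely following the DLTR idea in the abstract. NOT the authors' code:
# MiniBatchDictionaryLearning stands in for K-SVD, and the forward/back
# projection operators and step size are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d, reconstruct_from_patches_2d

PATCH = (8, 8)      # patch size used for dictionary learning
N_ATOMS = 256       # number of dictionary atoms


def train_dictionary(reference_image):
    """Learn an initial patch dictionary from a previously reconstructed image."""
    patches = extract_patches_2d(reference_image, PATCH, max_patches=5000, random_state=0)
    flat = patches.reshape(len(patches), -1)
    flat = flat - flat.mean(axis=1, keepdims=True)          # remove per-patch DC offset
    learner = MiniBatchDictionaryLearning(
        n_components=N_ATOMS,
        transform_algorithm="omp",
        transform_n_nonzero_coefs=5,
        random_state=0,
    )
    learner.fit(flat)
    return learner


def sparse_approximate(image, learner):
    """Replace every patch of the image by its sparse approximation over the dictionary."""
    patches = extract_patches_2d(image, PATCH)
    flat = patches.reshape(len(patches), -1)
    means = flat.mean(axis=1, keepdims=True)
    codes = learner.transform(flat - means)                  # sparse coding via OMP
    approx = codes @ learner.components_ + means
    return reconstruct_from_patches_2d(approx.reshape(patches.shape), image.shape)


def reconstruct(sinogram, forward, back, n_iters=20, step=0.5, reference=None):
    """Alternate an algebraic data-fidelity update with dictionary-based regularization.

    `forward` and `back` are user-supplied projection / backprojection operators
    (placeholders for an actual THz CT system model).
    """
    image = np.zeros_like(back(sinogram))
    learner = train_dictionary(reference if reference is not None else back(sinogram))
    for _ in range(n_iters):
        residual = sinogram - forward(image)
        image = image + step * back(residual)                # crude algebraic update
        image = sparse_approximate(image, learner)           # enforce the patch prior
    return image

A caller would supply the system-specific projector pair and, ideally, an FBP or SART result as the dictionary-training reference, e.g. image = reconstruct(sino, A, At, reference=sart_image). The paper additionally updates the dictionary adaptively during the iterations, which this sketch omits for brevity.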
ISSN: 1866-6892, 1866-6906
DOI: 10.1007/s10762-021-00806-6