Monitored Reconstruction: Computed Tomography as an Anytime Algorithm

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, pp. 110759-110774
Authors: Bulatov, Konstantin; Chukalina, Marina; Buzmakov, Alexey; Nikolaev, Dmitry; Arlazarov, Vladimir V.
Format: Article
Language: English
Online Access: Full Text
Description
Abstract: Computed tomography is an important technique for non-destructive analysis of an object's internal structure, relevant for scientific studies, medical applications, and industry. Pressing challenges emerging in the field of tomographic imaging include speeding up reconstruction, reducing the time required to obtain the X-ray projections, and reducing the radiation dose imparted to the object. In this paper, we introduce a model of a monitored reconstruction process, in which the acquisition of projections is interspersed with image reconstruction. This model makes it possible to treat the tomographic reconstruction process as an anytime algorithm and to consider the problem of finding an optimal stopping point, corresponding to the number of X-ray projections required for the currently scanned object. We outline the theoretical framework for monitored reconstruction, propose ways of constructing stopping rules for various reconstruction quality metrics, and provide their experimental evaluation. Because scanning stops at different times for different objects, the proposed approach achieves a higher mean reconstruction quality for a given mean number of X-ray projections; conversely, it uses fewer projections on average to achieve the same mean reconstruction quality.
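
The abstract only sketches the monitored reconstruction loop in prose. The following minimal Python sketch illustrates the general idea of anytime reconstruction with a stopping rule; it is not the authors' implementation. The callables acquire_projection and reconstruct are hypothetical placeholders, and the stopping rule shown (thresholding the relative change between successive reconstructions) is a simple proxy for the quality-metric-based stopping rules studied in the paper.

import numpy as np

def monitored_reconstruction(acquire_projection, reconstruct,
                             stop_threshold=0.05, max_projections=180):
    # Intersperse projection acquisition with image reconstruction (anytime
    # behavior) and stop once the estimated gain from one more projection
    # drops below a threshold.
    projections = []
    prev_image = None
    for k in range(max_projections):
        projections.append(acquire_projection(k))  # one more X-ray projection
        image = reconstruct(projections)           # current anytime estimate
        if prev_image is not None:
            # Proxy for the marginal quality gain: relative change between
            # two successive reconstructions.
            delta = np.linalg.norm(image - prev_image) / np.linalg.norm(image)
            if delta < stop_threshold:             # stopping rule fires
                return image, k + 1
        prev_image = image
    return prev_image, max_projections             # projection budget exhausted

# Toy usage with a synthetic "reconstruction" whose noise shrinks as more
# projections arrive (purely illustrative):
rng = np.random.default_rng(0)
truth = rng.random((64, 64))
image, n_used = monitored_reconstruction(
    acquire_projection=lambda k: k,                # acquisition stub
    reconstruct=lambda ps: truth + rng.normal(scale=1.0 / len(ps),
                                              size=truth.shape),
)

In the paper's setting, the stopping rule would instead track an estimate of a chosen reconstruction quality metric, but the anytime structure (acquire, reconstruct, test, repeat) is the same.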
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3002019