Respiratory compensation in projection imaging using a magnification and displacement model


Full Description

Bibliographic Details
Published in: IEEE Transactions on Medical Imaging 1996-06, Vol. 15 (3), p. 327-332
Main Authors: Crawford, C.R., King, K.F., Ritchie, C.J., Godwin, J.D.
Format: Article
Language: English
Description
Abstract: Respiratory motion during the collection of computed tomography (CT) projections generates structured artifacts and a loss of resolution that can render the scans unusable. This motion is problematic in scans of patients who cannot suspend respiration, such as the very young or intubated patients. Here, the authors present an algorithm that reduces respiration-induced motion artifacts in CT scans. An approximate model for the effect of respiration is that the object cross section under interrogation experiences time-varying magnification and displacement along two axes. Using this model, an exact filtered backprojection algorithm is derived for the case of parallel projections. The result is extended to generate an approximate reconstruction formula for fan-beam projections. Computer simulations and scans of phantoms on a commercial CT scanner validate the new reconstruction algorithms for parallel and fan-beam projections. A significant reduction in respiratory artifacts is demonstrated clinically when the motion model is satisfied. The method can be applied to projection data used in CT, single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI).
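The key property behind the compensation can be checked numerically. The following is a hypothetical sketch (not the authors' implementation), simplified to isotropic magnification: if an object is magnified by a factor m about the origin and displaced by (dx, dy), its parallel projection at angle theta equals the static projection rescaled by m and shifted by dx·cos(theta) + dy·sin(theta). Rescaling and shifting each measured projection by this rule is what allows a filtered backprojection to undo the motion. The Gaussian test object, the specific motion values, and the helper functions below are all illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): verify that for an object
# under isotropic magnification m and displacement (dx, dy), the parallel
# projection at angle theta obeys
#     p_moved(s) = m * p_static((s - dx*cos(theta) - dy*sin(theta)) / m)

def static_object(x, y, sigma=1.0):
    """Unit-mass 2-D Gaussian used as a smooth test object."""
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

def project(f, s_vals, theta, t_max=12.0, n_t=4001):
    """Numeric line-integral (parallel-beam) projection of f at angle theta."""
    t = np.linspace(-t_max, t_max, n_t)
    dt = t[1] - t[0]
    p = np.empty_like(s_vals)
    for i, s in enumerate(s_vals):
        x = s * np.cos(theta) - t * np.sin(theta)
        y = s * np.sin(theta) + t * np.cos(theta)
        p[i] = np.sum(f(x, y)) * dt
    return p

m, dx, dy, theta = 1.3, 0.4, -0.2, np.pi / 5   # arbitrary motion state
s = np.linspace(-6.0, 6.0, 201)

# Object after magnification by m about the origin, then displacement (dx, dy)
moved = lambda x, y: static_object((x - dx) / m, (y - dy) / m)
p_moved = project(moved, s, theta)

# Model prediction: rescale and shift the static projection
shift = dx * np.cos(theta) + dy * np.sin(theta)
p_model = m * project(static_object, (s - shift) / m, theta)

print(np.max(np.abs(p_moved - p_model)))  # small: model matches the projection
```

In the compensated reconstruction this identity is applied in reverse: each projection is shifted and rescaled according to the motion state at its acquisition time before backprojection, which is why the correction is exact for parallel-beam geometry and only approximate for fan-beam.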
ISSN: 0278-0062, 1558-254X
DOI: 10.1109/42.500141