Real-time CBCT imaging and motion tracking via a single arbitrarily-angled x-ray projection by a joint dynamic reconstruction and motion estimation (DREME) framework

Bibliographic Details
Published in: Physics in Medicine & Biology, 2025-01
Authors: Shao, Hua-Chieh; Mengke, Tielige; Pan, Tinsu; Zhang, You
Format: Article
Language: English
Online access: Full text
Description
Abstract:
Objective: Real-time cone-beam computed tomography (CBCT) provides instantaneous visualization of patient anatomy for image guidance, motion tracking, and online treatment adaptation in radiotherapy. While many real-time imaging and motion tracking methods leverage patient-specific prior information to alleviate under-sampling challenges and meet the temporal constraint (< 500 ms), the prior information can be outdated and introduce biases, compromising imaging and motion tracking accuracy. To address this challenge, we developed a framework (DREME) for real-time CBCT imaging and motion estimation without relying on patient-specific prior knowledge.
Approach: DREME incorporates a deep learning-based real-time CBCT imaging and motion estimation method into a dynamic CBCT reconstruction framework. The reconstruction framework reconstructs a dynamic sequence of CBCTs in a data-driven manner from a standard pre-treatment scan, without utilizing patient-specific knowledge. Meanwhile, a convolutional neural network-based motion encoder is jointly trained during the reconstruction to learn motion-related features relevant for real-time motion estimation from a single arbitrarily-angled x-ray projection. DREME was tested in a digital phantom simulation and in real patient studies.
Main results: DREME accurately solved 3D respiration-induced anatomic motion in real time (~1.5 ms inference time per x-ray projection). In the digital phantom study, it achieved an average lung tumor center-of-mass localization error of 1.2 ± 0.9 mm (mean ± SD). In the patient study, it achieved a real-time tumor localization accuracy of 1.6 ± 1.6 mm in the projection domain.
Significance: DREME achieves CBCT and volumetric motion estimation in real time from a single x-ray projection at arbitrary angles, paving the way for future clinical applications in intra-fractional motion management. In addition, it can be used for dose tracking and treatment assessment when combined with real-time dose calculation.
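
The abstract describes a CNN-based motion encoder, trained jointly with the dynamic reconstruction, that at inference maps a single arbitrarily-angled x-ray projection to motion parameters in roughly 1.5 ms. As a rough illustration only, the following minimal PyTorch sketch shows one way such an encoder could be structured; the class name MotionEncoder, the layer sizes, and the number of motion coefficients are assumptions made for illustration and do not reflect the authors' actual implementation.

    # Hypothetical sketch (PyTorch) of the kind of CNN motion encoder the abstract
    # describes: it maps one 2D x-ray projection to a small set of motion coefficients
    # that could weight learned motion components deforming a reference CBCT.
    # All architecture details here are illustrative assumptions, not the DREME code.
    import torch
    import torch.nn as nn

    class MotionEncoder(nn.Module):
        def __init__(self, num_motion_coeffs: int = 3):
            super().__init__()
            # Convolutional feature extractor over the projection image.
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            # Regress per-projection motion coefficients from pooled features.
            self.head = nn.Linear(64, num_motion_coeffs)

        def forward(self, projection: torch.Tensor) -> torch.Tensor:
            # projection: (batch, 1, H, W), a single x-ray projection at any gantry angle
            feats = self.features(projection).flatten(1)
            return self.head(feats)  # (batch, num_motion_coeffs)

    if __name__ == "__main__":
        encoder = MotionEncoder(num_motion_coeffs=3)
        x = torch.randn(1, 1, 256, 256)  # dummy projection
        print(encoder(x).shape)          # torch.Size([1, 3])

In such a design, the predicted coefficients would drive a deformation of a reference volume, so that a full 3D motion estimate (and hence tumor localization) can be obtained from one projection within the sub-millisecond-to-millisecond inference budget quoted in the abstract.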
ISSN: 0031-9155, 1361-6560
DOI: 10.1088/1361-6560/ada519