MSCN-NET: Multi-stage cascade neural network based on attention mechanism for Čerenkov luminescence tomography

Detailed Description

Bibliographic Details
Published in: Journal of Applied Physics, 2022-11, Vol. 132 (17)
Authors: Du, Mengfei, Chen, Yi, Li, Weitong, Su, Linzhi, Yi, Huangjian, Zhao, Fengjun, Li, Kang, Wang, Lin, Cao, Xin
Format: Article
Language: English
Online Access: Full text
Description
Summary: Čerenkov luminescence tomography (CLT) is a highly sensitive and promising technique for three-dimensional, non-invasive detection of radiopharmaceuticals in living organisms. However, the severe photon scattering effect makes the inverse problem ill-posed, and the results of CLT reconstruction are still unsatisfactory. In this work, a multi-stage cascade neural network is proposed to improve the performance of CLT reconstruction; it is based on the attention mechanism and introduces a special constraint. The network cascades an inverse sub-network (ISN) and a forward sub-network (FSN): the ISN infers the distribution of internal Čerenkov sources from the surface photon intensity, while the FSN derives the surface photon intensity from the reconstructed Čerenkov source, mimicking the transmission of photons in living organisms. In addition, the FSN further refines the reconstruction results of the ISN. To evaluate the performance of the proposed method, numerical simulation experiments and in vivo experiments were carried out. The results show that, compared with existing methods, this method achieves superior location accuracy and shape recovery capability.
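The abstract gives only the high-level cascade idea, not the architecture details. Purely as an illustration of the ISN→FSN data flow and the forward-consistency constraint, here is a minimal NumPy sketch; all dimensions, layer shapes, and the linear-plus-ReLU form are hypothetical, and the paper's attention mechanism is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 64 surface measurement nodes, 128 internal mesh nodes.
N_SURF, N_MESH = 64, 128

def relu(x):
    return np.maximum(x, 0.0)

# Inverse sub-network (ISN): surface photon intensity -> internal source distribution.
W_isn = rng.normal(0.0, 0.1, (N_MESH, N_SURF))

# Forward sub-network (FSN): reconstructed source -> predicted surface intensity,
# mimicking photon transport from the source back to the surface.
W_fsn = rng.normal(0.0, 0.1, (N_SURF, N_MESH))

def cascade(surface_intensity):
    """Run one ISN -> FSN pass and return both stage outputs."""
    source = relu(W_isn @ surface_intensity)    # ISN stage: reconstruction
    surface_pred = relu(W_fsn @ source)         # FSN stage: re-projection
    return source, surface_pred

surface = rng.random(N_SURF)
source, surface_pred = cascade(surface)

# The FSN output is compared with the measured surface intensity; minimizing
# this consistency term is the kind of constraint that regularizes the ISN.
consistency_loss = float(np.mean((surface_pred - surface) ** 2))
```

In a trained model both sub-networks would be learned jointly, so the forward-consistency term penalizes ISN reconstructions whose re-projected surface signal disagrees with the measurement.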
ISSN:0021-8979
1089-7550
DOI:10.1063/5.0119787