Neural Fields for Adaptive Photoacoustic Computed Tomography
Saved in:

Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Photoacoustic computed tomography (PACT) is a non-invasive imaging modality
with wide medical applications. Conventional PACT image reconstruction
algorithms suffer from wavefront distortion caused by the heterogeneous speed
of sound (SOS) in tissue, which leads to image degradation. Accounting for
these effects improves image quality, but measuring the SOS distribution is
experimentally expensive. An alternative approach is to perform joint
reconstruction of the initial pressure image and SOS using only the PA signals.
Existing joint reconstruction methods come with limitations: high computational
cost, inability to directly recover SOS, and reliance on inaccurate simplifying
assumptions. Implicit neural representations, or neural fields, are an emerging
technique in computer vision to learn an efficient and continuous
representation of physical fields with a coordinate-based neural network. In
this work, we introduce NF-APACT, an efficient self-supervised framework
utilizing neural fields to estimate the SOS in service of an accurate and
robust multi-channel deconvolution. Our method removes SOS aberrations an order
of magnitude faster and more accurately than existing methods. We demonstrate
the success of our method on a novel numerical phantom as well as an
experimentally collected phantom and in vivo data. Our code and numerical
phantom are available at https://github.com/Lukeli0425/NF-APACT.
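The abstract describes neural fields as coordinate-based neural networks that learn a continuous representation of a physical field, here the speed-of-sound (SOS) map. Below is a minimal NumPy sketch of that idea: a tiny MLP with a Fourier-feature encoding that maps spatial coordinates to SOS values. The layer sizes, encoding, and the 1500 m/s baseline are illustrative assumptions, not the NF-APACT implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def fourier_features(coords, n_freq=4):
    """Encode (x, y) coordinates with sinusoids so a small MLP can fit detail."""
    freqs = 2.0 ** np.arange(n_freq)                          # 1, 2, 4, 8
    ang = coords[:, None, :] * freqs[None, :, None] * np.pi   # (N, n_freq, 2)
    feat = np.concatenate([np.sin(ang), np.cos(ang)], axis=1) # (N, 2*n_freq, 2)
    return feat.reshape(coords.shape[0], -1)                  # (N, 4*n_freq)

# Tiny two-layer MLP: encoded coordinates -> hidden -> scalar SOS value.
# Weights are random here; in practice they would be fit self-supervised
# against the photoacoustic signals.
W1 = rng.normal(0.0, 0.5, (16, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1));  b2 = np.zeros(1)

def sos_field(coords):
    """Continuous SOS map: query any point in [-1, 1]^2, in m/s."""
    h = np.tanh(fourier_features(coords) @ W1 + b1)
    return 1500.0 + (h @ W2 + b2).ravel()   # offset around water SOS (~1500 m/s)

# The field is continuous: it can be queried at arbitrary coordinates,
# not just on a fixed reconstruction grid.
pts = np.array([[0.0, 0.0], [0.5, -0.25]])
print(sos_field(pts).shape)   # prints (2,)
```

Because the network itself is the representation, the SOS estimate stays compact (a few weight matrices) and differentiable, which is what allows it to be optimized jointly with the image reconstruction.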
DOI: 10.48550/arxiv.2409.10876