Neural space–time model for dynamic multi-shot imaging

Bibliographic details
Published in: Nature Methods 2024-12, Vol. 21 (12), pp. 2336-2341
Authors: Cao, Ruiming; Divekar, Nikita S.; Nuñez, James K.; Upadhyayula, Srigokul; Waller, Laura
Format: Article
Language: English
Online access: Full text
Abstract: Computational imaging reconstructions from multiple measurements that are captured sequentially often suffer from motion artifacts if the scene is dynamic. We propose a neural space–time model (NSTM) that jointly estimates the scene and its motion dynamics, without data priors or pre-training. Hence, we can both remove motion artifacts and resolve sample dynamics from the same set of raw measurements used for the conventional reconstruction. We demonstrate NSTM in three computational imaging systems: differential phase-contrast microscopy, three-dimensional structured illumination microscopy and rolling-shutter DiffuserCam. We show that NSTM can recover subcellular motion dynamics and thus reduce the misinterpretation of living systems caused by motion artifacts. A neural space–time model can recover a dynamic scene by modeling its spatiotemporal relationship in multi-shot imaging reconstruction for reduced motion artifacts and improved imaging of fast processes in living cells.
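The core idea of the abstract, jointly estimating the scene and its motion from the same sequential raw shots that a conventional reconstruction would simply average, can be illustrated with a deliberately simplified toy. The sketch below is a hypothetical 1D stand-in, not the authors' method: the actual NSTM parameterizes scene and motion with coordinate-based neural networks, whereas here a per-frame integer shift and a pixel array are fit by alternating registration and averaging.

```python
import numpy as np

# Toy illustration (hypothetical, much simplified) of the joint
# scene/motion idea: a 1D "scene" drifts by an unknown shift per shot
# while several sequential measurements are captured. Naive averaging
# blurs the moving structure; jointly estimating the per-frame shifts
# and the scene removes the motion artifact.

rng = np.random.default_rng(0)
scene = np.zeros(64)
scene[28:36] = 1.0                       # ground-truth scene: a bright bar

true_shifts = [0, 2, 4, 6]               # scene drifts 2 pixels per shot
frames = [np.roll(scene, s) + 0.01 * rng.standard_normal(64)
          for s in true_shifts]

naive = np.mean(frames, axis=0)          # conventional multi-shot average

# Alternating minimization:
# (1) register each frame to the current scene estimate (grid search),
# (2) re-estimate the scene by averaging the back-warped frames.
est = frames[0].copy()
shifts = [0] * len(frames)
for _ in range(5):
    for i, f in enumerate(frames):
        errs = [np.sum((np.roll(f, -s) - est) ** 2) for s in range(16)]
        shifts[i] = int(np.argmin(errs))
    est = np.mean([np.roll(f, -s) for f, s in zip(frames, shifts)], axis=0)

print(shifts)                            # recovered per-frame motion
```

In this toy, the recovered shifts match the true drift and the motion-compensated average is sharp where the naive average smears the bar, which is the qualitative behavior the paper reports for its three demonstrated imaging systems.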
ISSN: 1548-7091
1548-7105
DOI: 10.1038/s41592-024-02417-0