Computational optical sectioning with an incoherent multiscale scattering model for light-field microscopy
Published in: Nature Communications, 2021-11, Vol. 12 (1), Article 6391
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Quantitative volumetric fluorescence imaging at high speed over long periods is vital to understanding various cellular and subcellular behaviors in living organisms. Light-field microscopy (LFM) provides a compact computational solution by imaging the entire volume in a tomographic way, but it suffers severe degradation in scattering tissue or densely labelled samples. To address this problem, we propose an incoherent multiscale scattering model in a complete space for quantitative 3D reconstruction in complicated environments, which we call computational optical sectioning. Without requiring any hardware modifications, our method can be applied generally to different light-field schemes, reducing background fluorescence, reconstruction artifacts, and computational costs, and facilitating more practical applications of LFM in a broad community. We validate its superior performance by imaging various biological dynamics in Drosophila embryos, zebrafish larvae, and mice.
Light-field microscopy provides volumetric imaging at high speeds, but suffers from degradation in scattering tissue. Here, the authors present an incoherent multiscale scattering model which allows for quantitative 3D reconstruction in complex environments, and demonstrate dynamic imaging in vivo.
ISSN: 2041-1723
DOI: 10.1038/s41467-021-26730-w