Simultaneous phase aberration compensation and denoising for quantitative phase imaging in digital holographic microscopy with deep learning
Published in: Applied optics (2004), 2024-09, Vol. 63 (26), p. 6931
Format: Article
Language: English
Online access: Full text
Abstract: In digital holographic microscopy, the quantitative phase image suffers from phase aberrations and coherent noise. To solve these problems, two independent steps are conventionally applied in sequence during reconstruction: one to compensate for the phase aberrations and one to denoise. Here we demonstrate for the first time, to the best of our knowledge, that the reconstruction process can be simplified by replacing the two-step methods with a deep learning-based algorithm. A convolutional neural network is trained to perform phase aberration correction and denoising simultaneously from only a wrapped phase map. To train the network, a database is constructed consisting of a massive number of wrapped phase maps as inputs and noise-free sample phase maps as labels. The generated wrapped phase maps include a variety of phase aberrations and faithful coherent noise reconstructed from a practical apparatus. The trained network is applied to correct phase aberrations and denoise both simulated and experimental data for quantitative phase imaging. It exhibits excellent performance, with output comparable to that of the double-exposure method for phase aberration correction followed by block-matching and 3D filtering for denoising, while outperforming other conventional two-step methods.
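The training database described above pairs wrapped phase maps (containing aberrations and coherent noise) with noise-free sample phase maps as labels. A minimal sketch of how such a pair might be generated is shown below; the second-order polynomial aberration model, the Gaussian noise standing in for coherent speckle, and the Gaussian "sample" phase are illustrative assumptions, not the authors' exact simulation pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_training_pair(size=128):
    """Generate one (input, label) training pair: a wrapped phase map
    containing aberrations and noise, and the noise-free sample phase."""
    y, x = np.mgrid[0:size, 0:size] / size - 0.5
    # Noise-free sample phase (label): a smooth Gaussian bump as a
    # stand-in for a transparent specimen's phase profile.
    label = 2.0 * np.exp(-(x**2 + y**2) / 0.02)
    # Second-order polynomial phase aberration with random coefficients
    # (tilt, defocus, astigmatism terms) -- an assumed aberration model.
    a = rng.uniform(-5.0, 5.0, size=6)
    aberration = a[0] + a[1]*x + a[2]*y + a[3]*x**2 + a[4]*y**2 + a[5]*x*y
    # Additive Gaussian noise as a simple proxy for coherent speckle noise.
    noise = rng.normal(0.0, 0.3, size=(size, size))
    # Wrap the total phase into (-pi, pi], as a single-exposure hologram
    # reconstruction would yield.
    wrapped = np.angle(np.exp(1j * (label + aberration + noise)))
    return wrapped.astype(np.float32), label.astype(np.float32)

x_in, y_out = make_training_pair()
```

In this setup the network would learn, from `x_in` alone, to produce `y_out` directly, i.e. to unwrap, deaberrate, and denoise in a single forward pass rather than in sequential reconstruction steps.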
ISSN: 1559-128X, 2155-3165
DOI: 10.1364/AO.534430