Direct retrieval of Zernike-based pupil functions using integrated diffractive deep neural networks
Published in: Nature Communications 2022-12, Vol. 13 (1), p. 7531, Article 7531
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: Retrieving the pupil phase of a beam path is a central problem for optical systems across scales, from telescopes, where the phase information allows for aberration correction, to the imaging of near-transparent biological samples in phase contrast microscopy. Current phase retrieval schemes rely on complex digital algorithms that process data acquired from precise wavefront sensors, reconstructing the optical phase information at great expense of computational resources. Here, we present a compact optical-electronic module based on multi-layered diffractive neural networks printed on imaging sensors, capable of directly retrieving Zernike-based pupil phase distributions from an incident point spread function. We demonstrate this concept numerically and experimentally, showing the direct pupil phase retrieval of superpositions of the first 14 Zernike polynomials. The integrability of the diffractive elements with CMOS sensors shows the potential for the direct extraction of the pupil phase information from a detector module without additional digital post-processing.
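For readers unfamiliar with the underlying forward model, the sketch below illustrates how a Zernike-based pupil phase gives rise to the point spread function that the diffractive network inverts. This is not the authors' code: the grid size, the Zernike subset, and the coefficient values are illustrative assumptions, whereas the paper works with superpositions of the first 14 Zernike polynomials.

```python
import numpy as np

# Unit-circle pupil grid (grid size N is an illustrative assumption).
N = 256
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
rho = np.hypot(X, Y)
theta = np.arctan2(Y, X)
aperture = (rho <= 1.0).astype(float)

def zernike(j, rho, theta):
    """A few low-order Zernike polynomials (Noll indexing), for illustration."""
    if j == 4:   # defocus
        return np.sqrt(3) * (2 * rho**2 - 1)
    if j == 6:   # vertical astigmatism
        return np.sqrt(6) * rho**2 * np.cos(2 * theta)
    if j == 7:   # vertical coma
        return np.sqrt(8) * (3 * rho**3 - 2 * rho) * np.sin(theta)
    raise ValueError("only j in {4, 6, 7} implemented in this sketch")

# Pupil phase as a Zernike superposition; the coefficients (in radians)
# are made-up values, not taken from the paper.
coeffs = {4: 0.8, 6: -0.5, 7: 0.3}
phase = sum(c * zernike(j, rho, theta) for j, c in coeffs.items())

# Fourier-optics forward model: the PSF is the squared modulus of the
# Fourier transform of the complex pupil function.
pupil = aperture * np.exp(1j * phase)
psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
psf /= psf.sum()

# `psf` is the intensity pattern a sensor would record; the paper's
# diffractive network maps such a PSF back to the Zernike coefficients
# optically, without digital post-processing.
```

Conventional schemes invert this mapping digitally, for example with iterative Fourier methods; the module described in the paper instead performs the inversion in the optical domain as light propagates through the printed diffractive layers.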
Retrieving the pupil phase of an optical beam path is a central problem for imaging systems across scales. The authors use diffractive neural networks to directly extract pupil phase information with a single, compact optoelectronic device.
ISSN: 2041-1723
DOI: 10.1038/s41467-022-35349-4