Depth-extended acoustic-resolution photoacoustic microscopy based on a two-stage deep learning network
Published in: Biomedical Optics Express 2022-08, Vol. 13 (8), p. 4386-4397
Main authors: , , , , ,
Format: Article
Language: eng
Online access: Full text
Abstract: Acoustic-resolution photoacoustic microscopy (AR-PAM) is a major modality of photoacoustic imaging. It can non-invasively provide high-resolution morphological and functional information about biological tissues. However, the image quality of AR-PAM degrades rapidly when the targets move far away from the focus. Although some works have been conducted to extend the high-resolution imaging depth of AR-PAM, most of them require a small focal spot, which a regular AR-PAM system generally does not provide. Therefore, we propose a two-stage deep learning (DL) reconstruction strategy for AR-PAM to adaptively recover high-resolution photoacoustic images at different out-of-focus depths. A residual U-Net with attention gates was developed to implement the image reconstruction. We carried out phantom and in vivo experiments to optimize the proposed DL network and verify the performance of the proposed reconstruction method. Experimental results demonstrated that our approach extends the depth of focus of AR-PAM from 1 mm to 3 mm under the 4 mJ/cm² light energy used in the imaging system. In addition, the imaging resolution of a region 2 mm away from the focus can be improved to a level similar to the in-focus area. The proposed method effectively improves the imaging ability of AR-PAM and thus could be used in various biomedical studies requiring greater imaging depth.
ISSN: 2156-7085
DOI: 10.1364/BOE.461183