Underwater image recovery utilizing polarimetric imaging based on neural networks


Detailed Description

Bibliographic Details
Published in: Applied Optics (2004) 2021-09, Vol. 60 (27), p. 8419-8425
Main Authors: Zhang, Ran; Gui, Xinyuan; Cheng, Haoyuan; Chu, Jinkui
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Underwater imaging faces challenges due to the complex optical properties of water. Our purpose is to explore the application of polarimetric imaging to image recovery in turbid water based on deep learning. A polarization camera is used to capture polarization images of underwater objects as datasets. The method used in our study aims to find a structure and loss function that are suitable for the model. For the model structure, four pairs of models, each consisting of a polarized version and a grayscale version based on the ideas of dense U-Net and information flow, were proposed. For the loss function, a combination of weighted mean squared error and perceptual loss was proposed, and a proper set of loss weights was selected through comparison experiments. Comparing the model outputs, we find that feeding polarization information together with light-intensity information into the model at the very front of its structure yields better recovered images. The proposed model structure can be used for image recovery in turbid water or other scattering environments. Since polarization characteristics are considered, the recovered image has more detailed features than one recovered from intensity alone. Comparisons with other methods show the effectiveness of the proposed approach. © 2021 Optical Society of America
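The combination of weighted mean squared error with perceptual loss described in the abstract can be sketched as below. This is a minimal illustration only: the weight values `w_mse` and `w_perc`, the per-pixel weight map, and the stand-in feature extractor are hypothetical placeholders, not the values or network the authors selected through their comparison experiments.

```python
import numpy as np

def weighted_mse(pred, target, weights):
    """Per-pixel weighted mean squared error."""
    return np.sum(weights * (pred - target) ** 2) / np.sum(weights)

def perceptual_loss(pred, target, feature_fn):
    """MSE between feature representations of prediction and target.

    feature_fn stands in for a pretrained network's feature
    extractor (e.g. intermediate CNN activations); here it is
    a trivial placeholder for illustration.
    """
    fp, ft = feature_fn(pred), feature_fn(target)
    return np.mean((fp - ft) ** 2)

def total_loss(pred, target, weights, feature_fn, w_mse=1.0, w_perc=0.1):
    """Weighted sum of the two terms; the loss weights here are
    illustrative, not those chosen in the paper."""
    return (w_mse * weighted_mse(pred, target, weights)
            + w_perc * perceptual_loss(pred, target, feature_fn))

# Toy usage: identical images give zero total loss.
img = np.random.rand(8, 8)
w = np.ones_like(img)          # uniform per-pixel weight map
feat = lambda x: x.mean(axis=0)  # trivial stand-in feature extractor
print(total_loss(img, img, w, feat))
```

In a real training setup both terms would be computed on network outputs with a pretrained feature extractor, and the loss weights would be tuned, as the paper does through comparison experiments.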
ISSN: 1559-128X, 2155-3165, 1539-4522
DOI: 10.1364/AO.431299