Underwater image enhancement based on a portion denoising adversarial network


Detailed Description

Saved in:
Bibliographic Details
Published in: International journal of intelligent robotics and applications Online 2023-09, Vol.7 (3), p.485-496
Main Authors: Li, Xingzhen, Gu, Haitao, Yu, Siquan, Tan, Yuanyuan, Cui, Qi
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Underwater optical images are widely used in marine exploration. Because of the weak illumination caused by water depth, underwater images typically exhibit background noise, low brightness, a strong blue-green color cast, and blurring. These characteristics greatly hinder marine exploration tasks, so the study of underwater image enhancement has important application value. Most existing underwater image enhancement methods address overall denoising and brightness enhancement of the underwater image while ignoring partial denoising of the image. To solve these problems, this paper proposes an improved generative adversarial network (GAN) to produce clear underwater images. The main improvements cover three aspects. First, a portion denoising module is added to the generator to attenuate, in a detailed manner, the image noise produced by the generator. Second, an acceleration module is introduced into the discriminator to speed up training of the GAN. Third, the sum of squares of the adversarial loss, contrast loss, and color loss is used as the loss function to stabilize GAN training. Extensive experimental results show that the proposed model outperforms the comparison methods in both quantitative and qualitative experiments, and the visual results are natural.
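The composite objective the abstract describes (the sum of squares of the adversarial, contrast, and color losses) can be sketched as below. This is a minimal illustration under the assumption that the three loss terms are precomputed scalars; the function name and interface are hypothetical, not the paper's implementation.

```python
# Hypothetical sketch of the composite GAN objective from the abstract:
# total = L_adv^2 + L_contrast^2 + L_color^2.
# The exact definitions of the three loss terms are not given here,
# so they are treated as already-computed scalar values.
def combined_gan_loss(adv_loss: float, contrast_loss: float, color_loss: float) -> float:
    """Sum of squares of the three loss terms, as stated in the abstract."""
    return adv_loss ** 2 + contrast_loss ** 2 + color_loss ** 2

# Example: terms 0.5, 1.0, 2.0 give 0.25 + 1.0 + 4.0 = 5.25
print(combined_gan_loss(0.5, 1.0, 2.0))  # → 5.25
```

Squaring each term keeps every component non-negative and penalizes large individual losses more heavily than a plain sum, which is one plausible reason the authors report more stable GAN training.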
ISSN:2366-5971
2366-598X
DOI:10.1007/s41315-023-00279-x