Underwater Image Enhancement Based on Zero-Reference Deep Network

Bibliographic details
Published in: IEEE journal of oceanic engineering, 2023-07, Vol. 48 (3), p. 1-22
Main authors: Huang, Yifan; Yuan, Fei; Xiao, Fengqi; Lu, Jianxiang; Cheng, En
Format: Article
Language: English
Online access: Order full text
description Due to underwater light absorption and scattering, underwater images usually suffer from severe color attenuation and contrast reduction. Most mainstream deep-learning-based underwater image processing methods require a large amount of paired underwater training data, leading to complex network structures, longer training times, and higher computational cost. To address this problem, this paper proposes a novel Zero-Reference Deep Network for Underwater Image Enhancement (Zero-UIE), which transforms the enhancement of an underwater image into the estimation of a specific parameter map by a deep network. An underwater curve model based on the classical haze image formation principle is specially designed to remove underwater color dispersion and color cast. A lightweight deep network estimates the dynamic adjustment parameters of the underwater curve model, which then adjusts the dynamic range of the given image's pixels. A set of non-reference loss functions, designed around the characteristics of underwater images, implicitly drives the network's learning. In addition, adaptive color compensation can optionally be used as a preprocessing step to further improve robustness and visual quality. The key contribution of the proposed method is that it is zero-reference: it requires no paired or unpaired reference data for training. Extensive experiments on various benchmarks demonstrate that the proposed method is subjectively and objectively superior to state-of-the-art methods, and that it remains competitive and applicable across diverse underwater conditions. Most importantly, it is an innovative exploration of zero-reference learning for underwater image enhancement.
doi 10.1109/JOE.2023.3245686
identifier ISSN: 0364-9059; EISSN: 1558-1691
recordid cdi_ieee_primary_10091685
source IEEE Electronic Library (IEL)
subjects Attenuation
Benchmarks
Color
Colour
Data models
Deep learning
Deep network
Electromagnetic absorption
Haze
Image color analysis
Image contrast
Image enhancement
Image processing
Image restoration
Light absorption
Mathematical models
Parameter estimation
Scattering
Task analysis
Training
Underwater
underwater image
zero reference
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-02T07%3A15%3A39IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Underwater%20Image%20Enhancement%20Based%20on%20Zero-Reference%20Deep%20Network&rft.jtitle=IEEE%20journal%20of%20oceanic%20engineering&rft.au=Huang,%20Yifan&rft.date=2023-07-01&rft.volume=48&rft.issue=3&rft.spage=1&rft.epage=22&rft.pages=1-22&rft.issn=0364-9059&rft.eissn=1558-1691&rft.coden=IJOEDY&rft_id=info:doi/10.1109/JOE.2023.3245686&rft_dat=%3Cproquest_RIE%3E2837133195%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2837133195&rft_id=info:pmid/&rft_ieee_id=10091685&rfr_iscdi=true