Novel Hybrid Sparse and Low-Rank Representation With Auto-Weight Minimax Lγ Concave Penalty for Image Denoising
Image denoising techniques often rely on convex relaxations, which can introduce bias into estimations. To address this, non-convex regularizers such as weighted nuclear norm minimization and weighted Schatten p-norm minimization have been proposed. However, current implementations often rely on heuristic weight selection, neglecting the potential of automated strategies. This work introduces a novel non-convex, non-separable regularization term aimed at achieving a hybrid representation that leverages both low-rank (LR) and global sparse gradient (GS) structures. An iteratively auto-weighting Equivalent Minimax Lγ Concave penalty (EMLC) is proposed for non-convex relaxation. To enhance sparsity and improve low-rank estimation, the EMLC-LRGS-based image denoising model is presented. This model integrates global gradient sparsity and LR priors within a unified framework using the EMLC penalty. The formulation addresses the limitations of convex relaxations by employing an equivalent representation of the weighted minimax Lγ concave penalty as a combined global sparsity and local smoothness regularizer in the gradient domain, which aligns more closely with the data acquisition model and prior knowledge. To exploit the inherent low-rank structure of images, an equivalent representation of the weighted Lγ norm is employed as a low-rank regularization term applied to groups of similar image patches. Efficient model resolution is achieved through an adaptive alternating direction method of multipliers (ADMM) algorithm that dynamically tunes the weight parameter while promoting sparsity and a low-rank representation. The effectiveness of this approach is demonstrated through comprehensive comparisons with state-of-the-art image denoising models, showcasing its superiority in image denoising tasks.
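The core of the method is a non-convex concave penalty on coefficient magnitudes. As a rough illustration only, the sketch below implements the standard minimax concave penalty (MCP) and its proximal operator (firm thresholding); the paper's auto-weighted minimax Lγ concave penalty (EMLC) is a related but non-separable, iteratively re-weighted construction, and every function name and parameter choice here is an assumption made for the example, not taken from the paper.

```python
# Illustrative sketch only: the standard minimax concave penalty (MCP) and its
# proximal operator ("firm thresholding"). The paper's auto-weighted minimax
# L-gamma concave penalty (EMLC) is a related but more elaborate construction;
# nothing below is taken from the paper.
import numpy as np

def mcp_penalty(x, lam, gamma):
    """Standard MCP: lam*|x| - x^2/(2*gamma) for |x| <= gamma*lam, and the
    constant gamma*lam^2/2 beyond that (flat, so large values are not penalized further)."""
    ax = np.abs(x)
    inner = lam * ax - ax**2 / (2.0 * gamma)
    outer = 0.5 * gamma * lam**2
    return np.where(ax <= gamma * lam, inner, outer)

def mcp_prox(x, lam, gamma):
    """Proximal operator of MCP (firm thresholding), valid for gamma > 1:
    zero below lam, shrunk and rescaled up to gamma*lam, left untouched above."""
    ax = np.abs(x)
    shrunk = np.sign(x) * (ax - lam) / (1.0 - 1.0 / gamma)
    out = np.where(ax <= lam, 0.0, shrunk)
    return np.where(ax > gamma * lam, x, out)

if __name__ == "__main__":
    g = np.linspace(-3.0, 3.0, 7)            # e.g. image gradient coefficients
    print(mcp_prox(g, lam=1.0, gamma=2.5))   # large entries pass through unbiased
```

The flat tail of the penalty (constant beyond γλ) is what avoids the bias that ℓ1-style convex relaxations impose on large coefficients, which is the motivation stated in the abstract.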
Published in: | IEEE access 2024, Vol.12, p.127916-127930 |
---|---|
Main authors: | Bo, Li; Junrui, Lv; Xuegang, Luo |
Format: | Article |
Language: | eng |
Subjects: | Image denoising; non-convex regularization; minimax Lγ concave penalty; sparse and low-rank representation |
Online access: | Full text |
container_end_page | 127930 |
---|---|
container_issue | |
container_start_page | 127916 |
container_title | IEEE access |
container_volume | 12 |
creator | Bo, Li; Junrui, Lv; Xuegang, Luo |
description | Image denoising techniques often rely on convex relaxations, which can introduce bias into estimations. To address this, non-convex regularizers such as weighted nuclear norm minimization and weighted Schatten p-norm minimization have been proposed. However, current implementations often rely on heuristic weight selection, neglecting the potential of automated strategies. This work introduces a novel non-convex, non-separable regularization term aimed at achieving a hybrid representation that leverages both low-rank (LR) and global sparse gradient (GS) structures. An iteratively auto-weighting Equivalent Minimax Lγ Concave penalty (EMLC) is proposed for non-convex relaxation. To enhance sparsity and improve low-rank estimation, the EMLC-LRGS-based image denoising model is presented. This model integrates global gradient sparsity and LR priors within a unified framework using the EMLC penalty. The formulation addresses the limitations of convex relaxations by employing an equivalent representation of the weighted minimax Lγ concave penalty as a combined global sparsity and local smoothness regularizer in the gradient domain, which aligns more closely with the data acquisition model and prior knowledge. To exploit the inherent low-rank structure of images, an equivalent representation of the weighted Lγ norm is employed as a low-rank regularization term applied to groups of similar image patches. Efficient model resolution is achieved through an adaptive alternating direction method of multipliers (ADMM) algorithm that dynamically tunes the weight parameter while promoting sparsity and a low-rank representation. The effectiveness of this approach is demonstrated through comprehensive comparisons with state-of-the-art image denoising models, showcasing its superiority in image denoising tasks. |
doi_str_mv | 10.1109/ACCESS.2024.3442373 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 2169-3536 |
ispartof | IEEE access, 2024, Vol.12, p.127916-127930 |
issn | 2169-3536 2169-3536 |
language | eng |
recordid | cdi_crossref_primary_10_1109_ACCESS_2024_3442373 |
source | IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals |
subjects | Adaptive algorithms; Data acquisition; Equivalence; Image acquisition; Image denoising; Image enhancement; Knowledge representation; Mathematical models; minimax Lγ concave penalty; Minimax technique; Minimization; Noise reduction; non-convex regularization; Optimization; Regularization; Smoothness; sparse and low-rank representation; Sparse approximation; Sparsity; Task analysis |
title | Novel Hybrid Sparse and Low-Rank Representation With Auto-Weight Minimax Lγ Concave Penalty for Image Denoising |
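The description above also applies a weighted low-rank regularizer to groups of similar image patches. The following is a minimal sketch, assuming a WNNM-style weighting rule, of one weighted singular-value thresholding step on such a patch group; the weight formula, function names, and parameters are illustrative assumptions and do not reproduce the paper's weighted Lγ regularizer or its auto-weighting within ADMM.

```python
# Minimal, hypothetical sketch of one weighted singular-value thresholding step
# on a group of similar patches, in the spirit of WNNM-style low-rank denoising.
# The weighting rule below is a common heuristic chosen only for illustration;
# it is not the paper's weighted L-gamma regularizer or auto-weighting scheme.
import numpy as np

def weighted_svt(group, noise_sigma, c=2.0, eps=1e-8):
    """group: (patch_dim, n_patches) matrix whose columns are similar noisy patches.
    Shrinks each singular value with a weight that grows as the singular value
    shrinks, so noise-dominated components are suppressed more strongly."""
    u, s, vt = np.linalg.svd(group, full_matrices=False)
    n_patches = group.shape[1]
    # Assumed weights (WNNM-like): w_i = c * sqrt(n) * sigma^2 / (s_i + eps).
    w = c * np.sqrt(n_patches) * noise_sigma**2 / (s + eps)
    s_shrunk = np.maximum(s - w, 0.0)        # weighted soft thresholding
    return (u * s_shrunk) @ vt               # low-rank estimate of the group

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.outer(rng.standard_normal(64), np.ones(20))   # rank-1 patch group
    noisy = clean + 0.3 * rng.standard_normal(clean.shape)
    denoised = weighted_svt(noisy, noise_sigma=0.3)
    print(np.linalg.matrix_rank(denoised, tol=1e-3))          # ideally close to 1
```

In a full pipeline this step would sit inside the ADMM loop described in the abstract, alternating with the gradient-domain sparsity update and the data-fidelity update.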