Connections Between Nuclear-Norm and Frobenius-Norm-Based Representations
Many works have shown that Frobenius-norm-based representation (FNR) is competitive with sparse representation and nuclear-norm-based representation (NNR) in numerous tasks such as subspace clustering. Despite the success of FNR in experimental studies, little theoretical analysis has been provided to explain its working mechanism. In this brief, we fill this gap by establishing theoretical connections between FNR and NNR. More specifically, we prove that: 1) when the dictionary provides enough representative capacity, FNR is exactly NNR even when the data set is contaminated by Gaussian noise, Laplacian noise, or sample-specified corruption; and 2) otherwise, FNR and NNR are two solutions on the column space of the dictionary.
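The abstract contrasts two representation models without spelling out their objectives. As a point of reference, here is a minimal sketch of how FNR and NNR are typically posed in the least-squares-regression and low-rank-representation literature; the symbols X (data matrix), D (dictionary), C (representation coefficients), E (error term), and the penalty form are assumptions for illustration and may differ from the brief's exact formulations.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A hedged sketch, not the brief's exact notation: X is the data matrix,
% D the dictionary, C the representation coefficients, and E an error term
% whose norm is chosen to match the noise model (e.g. Gaussian, Laplacian,
% or sample-specified corruption); lambda balances the two terms.
\begin{align}
  \text{FNR:}\quad &\min_{C,\,E}\ \|C\|_F^2 + \lambda\,\|E\|
      \quad \text{s.t.}\quad X = DC + E, \\
  \text{NNR:}\quad &\min_{C,\,E}\ \|C\|_{*} + \lambda\,\|E\|
      \quad \text{s.t.}\quad X = DC + E.
\end{align}
\end{document}
```

Read against this sketch, claim 1) of the abstract says that with a sufficiently representative dictionary the two minimizers coincide, and claim 2) says that otherwise both solutions lie on the column space of the dictionary.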
| Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2018-01, Vol. 29 (1), pp. 218-224 |
|---|---|
| Main authors: | Peng, Xi; Lu, Canyi; Yi, Zhang; Tang, Huajin |
| Format: | Article |
| Language: | English |
| Subjects: | Artificial neural networks; Clustering; Corruption; Dictionaries; equivalence; Gaussian noise; Laplace equations; Learning systems; least square regression; Linear programming; low rank representation (LRR); Minimization; Noise; rank minimization; Representations; Theoretical analysis; ℓ₂-minimization |
| Online access: | Order full text |
| DOI | 10.1109/TNNLS.2016.2608834 |
| ISSN | 2162-237X |
| EISSN | 2162-2388 |
| Source | IEEE Electronic Library (IEL) |