Tensor Q-rank: new data dependent definition of tensor rank

Recently, the Tensor Nuclear Norm (TNN) regularization based on t-SVD has been widely used in various low tubal-rank tensor recovery tasks. However, these models usually require the data to change smoothly along the third dimension to ensure a low-rank structure. In this paper, we propose a new data-dependent definition of tensor rank, named tensor Q-rank, based on a learnable orthogonal matrix Q, and further introduce a unified data-dependent low-rank tensor recovery model. Under the low-rank hypothesis, we introduce two explainable methods for selecting Q, under which the data tensor may exhibit a more pronounced low tensor Q-rank structure than low tubal-rank structure. Specifically, maximizing the variance of the singular value distribution leads to the Variance Maximization Tensor Q-Nuclear norm (VMTQN), while minimizing the nuclear norm through manifold optimization leads to the Manifold Optimization Tensor Q-Nuclear norm (MOTQN). Moreover, we apply both models to the low-rank tensor completion problem, give an effective algorithm, and briefly analyze why our method outperforms TNN-based methods on complex data with low sampling rates. Finally, experimental results on real-world datasets demonstrate the superiority of the proposed models over other tensor rank regularization models on the tensor completion problem.
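The idea in the abstract can be illustrated numerically. The sketch below assumes, as a simplification of the paper's definition, that the tensor Q-rank of X is the sum of the matrix ranks of the frontal slices of X ×₃ Qᵀ for an orthogonal Q; the helper names (`mode3_product`, `tensor_q_rank`) and the specific choice of Q (right singular vectors of the mode-3 unfolding, only "in the spirit of" the paper's VMTQN criterion) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def mode3_product(X, Q):
    """Mode-3 product X x_3 Q: (n1, n2, n3) tensor times (m, n3) matrix -> (n1, n2, m)."""
    return np.einsum('ijk,mk->ijm', X, Q)

def tensor_q_rank(X, Q, tol=1e-8):
    """Assumed convention: sum of matrix ranks of the frontal slices of X x_3 Q^T."""
    Xq = mode3_product(X, Q.T)
    return sum(int(np.linalg.matrix_rank(Xq[:, :, k], tol=tol))
               for k in range(Xq.shape[2]))

rng = np.random.default_rng(0)
n1, n2, n3, r = 8, 8, 5, 2

# Build a tensor whose mode-3 unfolding has rank r: every frontal slice is a
# mixture of the same r rank-1 matrices u_p v_p^T, mixed by a random n3 x r matrix M.
U = rng.standard_normal((n1, r))
V = rng.standard_normal((n2, r))
M = rng.standard_normal((n3, r))
X = np.einsum('ip,jp,kp->ijk', U, V, M)

# Data-dependent orthogonal Q (in the spirit of VMTQN): the right singular
# vectors of the mode-3 unfolding concentrate all energy into the first r
# transformed slices, so the remaining slices vanish.
unfold3 = X.reshape(n1 * n2, n3)          # column k = vectorized frontal slice k
_, _, Vt = np.linalg.svd(unfold3, full_matrices=True)
Q_data = Vt.T

print("Q = I      :", tensor_q_rank(X, np.eye(n3)))   # every slice contributes
print("Q = learned:", tensor_q_rank(X, Q_data))       # only r slices survive
```

With the identity transform all n3 slices contribute rank r each, while the data-dependent Q leaves only r nonzero slices, so the Q-rank drops — the "more significant low tensor Q-rank structure" the abstract refers to.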

Detailed description

Bibliographic details
Published in: Machine learning, 2021-07, Vol. 110 (7), p. 1867-1900
Main authors: Kong, Hao; Lu, Canyi; Lin, Zhouchen
Format: Article
Language: English
Online access: Full text
DOI: 10.1007/s10994-021-05987-8
ISSN: 0885-6125
EISSN: 1573-0565
Publisher: Springer US, New York
Record ID: cdi_proquest_journals_2547167172
Source: SpringerLink Journals - AutoHoldings
Subjects:
Algorithms
Artificial Intelligence
Computer Science
Control
Decomposition
Fourier transforms
Machine Learning
Mathematical analysis
Maximization
Mechatronics
Natural Language Processing (NLP)
Optimization
Regularization
Robotics
Simulation and Modeling
Tensors
Time series