An optimal statistical and computational framework for generalized tensor estimation

This paper describes a flexible framework for generalized low-rank tensor estimation problems that includes many important instances arising from applications in computational imaging, genomics, and network analysis. The proposed estimator consists of finding a low-rank tensor fit to the data under generalized parametric models. To overcome the difficulty of nonconvexity in these problems, we introduce a unified approach of projected gradient descent that adapts to the underlying low-rank structure. Under mild conditions on the loss function, we establish both an upper bound on statistical error and the linear rate of computational convergence through a general deterministic analysis. Then we further consider a suite of generalized tensor estimation problems, including sub-Gaussian tensor PCA, tensor regression, and Poisson and binomial tensor PCA. We prove that the proposed algorithm achieves the minimax optimal rate of convergence in estimation error. Finally, we demonstrate the superiority of the proposed framework via extensive experiments on both simulated and real data.
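
The core of the framework is a projected gradient descent in which each iterate is retracted back onto the set of low-rank tensors. The sketch below illustrates the idea on the simplest instance, tensor denoising under a squared loss, using a truncated higher-order SVD as the low-rank projection; it is a minimal illustration rather than the authors' implementation, and the function names (unfold, hosvd_truncate, pgd_tensor_denoise), the step size, and the rank choice are assumptions made for this example.

# Minimal sketch of projected gradient descent for low-rank tensor denoising.
# Not the authors' code; HOSVD truncation stands in for the low-rank projection.
import numpy as np

def unfold(T, mode):
    """Mode-k matricization of a 3-way tensor."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_truncate(T, ranks):
    """Approximate projection of T onto Tucker rank <= ranks via truncated HOSVD."""
    factors = []
    for k in range(3):
        U, _, _ = np.linalg.svd(unfold(T, k), full_matrices=False)
        factors.append(U[:, :ranks[k]])
    core = T
    for k, U in enumerate(factors):   # core = T multiplied by U_k^T along each mode
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, k, 0), axes=1), 0, k)
    out = core
    for k, U in enumerate(factors):   # expand back to the original dimensions
        out = np.moveaxis(np.tensordot(U, np.moveaxis(out, k, 0), axes=1), 0, k)
    return out

def pgd_tensor_denoise(Y, ranks, step=0.5, n_iter=50):
    """Projected gradient descent on the squared loss 0.5 * ||Y - X||_F^2."""
    X = hosvd_truncate(Y, ranks)            # spectral-style initialization
    for _ in range(n_iter):
        grad = X - Y                        # gradient of the squared loss at X
        X = hosvd_truncate(X - step * grad, ranks)
    return X

# Toy usage: recover a rank-(2, 2, 2) signal observed with Gaussian noise.
rng = np.random.default_rng(0)
G = rng.normal(size=(2, 2, 2))
U = [np.linalg.qr(rng.normal(size=(20, 2)))[0] for _ in range(3)]
signal = np.einsum('abc,ia,jb,kc->ijk', G, *U)
Y = signal + 0.1 * rng.normal(size=signal.shape)
X_hat = pgd_tensor_denoise(Y, ranks=(2, 2, 2))
print(np.linalg.norm(X_hat - signal) / np.linalg.norm(signal))

For the generalized models treated in the paper, the squared loss above would be replaced by the negative log-likelihood of the chosen parametric model (for example Poisson or binomial), with the gradient computed accordingly; only the gradient step changes, while the low-rank projection stays the same.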

Bibliographic Details
Published in: The Annals of Statistics, 2022-02, Vol. 50 (1), p. 1
Authors: Han, Rungang; Willett, Rebecca; Zhang, Anru R.
Format: Article
Language: English
Publisher: Institute of Mathematical Statistics, Hayward
DOI: 10.1214/21-AOS2061
ISSN: 0090-5364
EISSN: 2168-8966
Subjects: Algorithms; Convergence; Estimating techniques; Fluid dynamics; Linear equations; Mathematical analysis; Mathematics; Minimax technique; Network analysis; Statistical analysis; Tensors; Upper bounds
Online access: Full text