Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory

This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor, the so-called spiked tensor model. When the underlying components are orthogonal, they can be recovered efficiently using tensor deflation, which consists of successive rank-one approximations, whereas non-orthogonal components may alter the deflation mechanism and prevent efficient recovery. Relying on recently developed random tensor tools, this paper addresses precisely the non-orthogonal case by deriving an asymptotic analysis of a parameterized deflation procedure performed on an order-three, rank-two spiked tensor. Based on this analysis, an efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism, which is in turn proven to be optimal by construction for the studied tensor model. The same ideas could be extended to more general low-rank tensor models, e.g., higher ranks and orders, leading to more efficient tensor methods with a broader impact on machine learning and beyond.
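
To make the deflation mechanism concrete, the following is a minimal sketch in Python/NumPy of the setting the abstract describes: an order-three, rank-two spiked tensor with correlated components, recovered by two successive rank-one approximations via tensor power iteration, where a weight gamma interpolates between plain deflation (gamma = 0) and fully orthogonalized deflation (gamma = 1). The gamma parameterization, the helper names, and all constants are illustrative assumptions rather than the paper's actual algorithm, which derives the optimal parameter from its random tensor analysis.

# Minimal sketch of orthogonalized deflation on an order-3, rank-2 spiked tensor
#     T = beta1 * x1 (x) x1 (x) x1 + beta2 * x2 (x) x2 (x) x2 + noise.
# `gamma` below is a hypothetical stand-in for the parameter the paper optimizes;
# the optimal choice derived from random tensor theory is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
n, beta1, beta2, corr = 50, 8.0, 6.0, 0.5  # dimension, signal strengths, <x1, x2>

def rank_one(scale, x):
    """scale * x (x) x (x) x as a dense order-3 array."""
    return scale * np.einsum("i,j,k->ijk", x, x, x)

def mode_products(T, P):
    """Apply the matrix P along each of the three modes of T."""
    for mode in range(3):
        T = np.moveaxis(np.tensordot(P, T, axes=(1, mode)), 0, mode)
    return T

def best_rank_one(T, iters=200):
    """Dominant rank-one approximation via symmetric tensor power iteration."""
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(iters):
        u = np.einsum("ijk,j,k->i", T, u, u)
        u /= np.linalg.norm(u)
    lam = np.einsum("ijk,i,j,k->", T, u, u, u)  # estimated singular value
    return lam, u

# Two unit-norm components with correlation `corr`, plus Gaussian noise.
x1 = rng.standard_normal(n); x1 /= np.linalg.norm(x1)
z = rng.standard_normal(n); z -= (z @ x1) * x1; z /= np.linalg.norm(z)
x2 = corr * x1 + np.sqrt(1.0 - corr**2) * z
T = rank_one(beta1, x1) + rank_one(beta2, x2) + rng.standard_normal((n, n, n)) / np.sqrt(n)

# Step 1: estimate the dominant rank-one component.
lam1, u1 = best_rank_one(T)

# Step 2: deflate. gamma = 0 is plain deflation; gamma = 1 fully projects the
# residual orthogonally to u1 before the second rank-one approximation.
gamma = 0.5  # hypothetical value standing in for the parameter the paper optimizes
P = np.eye(n) - gamma * np.outer(u1, u1)
lam2, u2 = best_rank_one(mode_products(T - rank_one(lam1, u1), P))

print(f"alignments: |<u1, x1>| = {abs(u1 @ x1):.3f}, |<u2, x2>| = {abs(u2 @ x2):.3f}")
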

Bibliographic Details
Main Authors: Seddik, Mohamed El Amine; Mahfoud, Mohammed; Debbah, Merouane
Format: Article
Language: English
Subjects: Mathematics - Probability; Mathematics - Statistics Theory; Statistics - Machine Learning; Statistics - Theory
Published: 2023-02-11
DOI: 10.48550/arxiv.2302.05798
Online Access: https://arxiv.org/abs/2302.05798 (open access)