Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory
Format: Article
Language: English
Abstract: This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor, the so-called spiked tensor model. When the underlying components are orthogonal, they can be recovered efficiently using tensor deflation, which consists of successive rank-one approximations; non-orthogonal components, however, may alter the deflation mechanism and thereby prevent efficient recovery. Relying on recently developed random tensor tools, this paper deals precisely with the non-orthogonal case by deriving an asymptotic analysis of a parameterized deflation procedure performed on an order-three, rank-two spiked tensor. Based on this analysis, an efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism, which is in turn proven optimal by construction for the studied tensor model. The same ideas could be extended to more general low-rank tensor models, e.g., higher ranks and orders, leading to more efficient tensor methods with a broader impact on machine learning and beyond.
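The deflation mechanism the abstract contrasts against can be sketched concretely. The snippet below is an illustrative NumPy implementation of vanilla tensor deflation (successive rank-one approximations via tensor power iteration) on an order-three, rank-two spiked tensor with orthogonal components; it is not the paper's optimized parameterized procedure, and all dimensions, signal strengths, and seeds are invented for the demo.

```python
import numpy as np

def rank_one_approx(T, n_iter=200, seed=0):
    """Rank-one fit lam * u (x) u (x) u of an order-three tensor T,
    found by tensor power iteration from a random start (illustrative)."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        u = np.einsum("ijk,j,k->i", T, u, u)  # contract T along two modes
        u /= np.linalg.norm(u)
    lam = np.einsum("ijk,i,j,k->", T, u, u, u)  # singular-value estimate
    return lam, u

def deflate(T, rank=2):
    """Vanilla tensor deflation: successive rank-one approximations,
    subtracting each fitted component from the residual."""
    components, R = [], T.copy()
    for _ in range(rank):
        lam, u = rank_one_approx(R)
        components.append((lam, u))
        R = R - lam * np.einsum("i,j,k->ijk", u, u, u)
    return components

# Hypothetical rank-two spiked tensor with orthogonal components:
# T = b1 * x(x)x(x)x + b2 * y(x)y(x)y + W / sqrt(n), W iid Gaussian noise.
n, b1, b2 = 50, 10.0, 6.0
rng = np.random.default_rng(1)
x = rng.standard_normal(n); x /= np.linalg.norm(x)
y = rng.standard_normal(n); y -= (y @ x) * x; y /= np.linalg.norm(y)
T = (b1 * np.einsum("i,j,k->ijk", x, x, x)
     + b2 * np.einsum("i,j,k->ijk", y, y, y)
     + rng.standard_normal((n, n, n)) / np.sqrt(n))

(l1, u1), (l2, u2) = deflate(T)
a1 = max(abs(u1 @ x), abs(u1 @ y))  # alignment of first recovered direction
a2 = max(abs(u2 @ x), abs(u2 @ y))  # alignment of second recovered direction
print(a1, a2)
```

In the orthogonal, high-signal-to-noise regime shown here, each deflation step locks onto one planted component and the alignments are close to one; with correlated (non-orthogonal) components, the subtraction contaminates the residual, which is the failure mode the paper's parameterized deflation addresses.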
DOI: 10.48550/arxiv.2302.05798