Robust Lasso With Missing and Grossly Corrupted Observations


Detailed Description

Bibliographic Details
Published in: IEEE Transactions on Information Theory, April 2013, Vol. 59 (4), pp. 2036-2058
Main Authors: Nguyen, N. H., Tran, T. D.
Format: Article
Language: English
Subjects:
Online Access: Order full text
Description
Abstract: This paper studies the problem of accurately recovering a k-sparse vector β* ∈ ℝ^p from highly corrupted linear measurements y = Xβ* + e* + w, where e* ∈ ℝ^n is a sparse error vector whose nonzero entries may be unbounded and w is a stochastic noise term. We propose a so-called extended Lasso optimization which takes into account the sparsity of both β* and e*. Our first result shows that the extended Lasso can faithfully recover both the regression vector and the corruption vector. Our analysis relies on the notion of an extended restricted eigenvalue for the design matrix X. Our second set of results applies to a general class of Gaussian design matrices X with i.i.d. rows N(0, Σ), for which we can establish a surprising result: the extended Lasso can recover the exact signed supports of both β* and e* from only Ω(k log p log n) observations, even when a linear fraction of the observations is grossly corrupted. Our analysis also shows that this number of observations required to achieve exact signed support recovery is indeed optimal.
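As a rough illustration of the model described above (not the authors' implementation, and with all parameter choices being illustrative assumptions), the extended Lasso jointly penalizes β and e, e.g. minimizing ½‖y − Xβ − e‖² + λ_β‖β‖₁ + λ_e‖e‖₁. Stacking z = [β; e] against the augmented design [X, I] turns this into an ordinary Lasso, which a simple proximal-gradient (ISTA) loop can solve:

```python
import numpy as np

def soft_threshold(z, t):
    """Entrywise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def extended_lasso(X, y, lam_beta, lam_e, n_iter=3000):
    """Sketch of an extended Lasso solver via proximal gradient (ISTA).

    Minimizes 0.5*||y - X@beta - e||^2 + lam_beta*||beta||_1 + lam_e*||e||_1
    over the stacked variable z = [beta; e] with augmented design A = [X, I].
    """
    n, p = X.shape
    A = np.hstack([X, np.eye(n)])                 # augmented design [X, I]
    lam = np.concatenate([np.full(p, lam_beta),   # per-coordinate penalties
                          np.full(n, lam_e)])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz const of grad
    z = np.zeros(p + n)
    for _ in range(n_iter):
        grad = A.T @ (A @ z - y)                  # gradient of the smooth part
        z = soft_threshold(z - step * grad, step * lam)
    return z[:p], z[p:]                           # (beta estimate, e estimate)

# Toy demo (synthetic, noiseless): sparse beta plus one gross corruption.
rng = np.random.default_rng(0)
n, p, k = 100, 20, 3
X = rng.standard_normal((n, p)) / np.sqrt(n)      # approx. unit-norm columns
beta_true = np.zeros(p); beta_true[:k] = 1.0
e_true = np.zeros(n); e_true[7] = 5.0             # one unbounded corruption
y = X @ beta_true + e_true
beta_hat, e_hat = extended_lasso(X, y, lam_beta=0.01, lam_e=0.01)
```

In this noiseless toy run the largest entries of `beta_hat` land on the true support {0, 1, 2} and `e_hat` peaks at the corrupted observation, illustrating the joint recovery the abstract describes; the regularization levels and iteration count here are ad hoc, not the theoretically prescribed choices from the paper.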
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2012.2232347