On the Complexity of Robust PCA and ℓ1-Norm Low-Rank Matrix Approximation


Bibliographic Details
Published in: Mathematics of Operations Research, November 2018, Vol. 43 (4), pp. 1072-1084
Authors: Gillis, Nicolas; Vavasis, Stephen A.
Format: Article
Language: English
Online access: Full text
Description
Abstract: The low-rank matrix approximation problem with respect to the component-wise ℓ1-norm (ℓ1-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims to recover a low-rank matrix that was perturbed with sparse noise, with applications for example in foreground-background video separation. Although ℓ1-LRA is strongly believed to be NP-hard, there is, to our knowledge, no formal proof of this fact. In this paper, we prove that ℓ1-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between ℓ1-LRA and several other well-known problems, namely robust PCA, ℓ0-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of {−1, +1} matrices, and the discrete basis problem, all of which we prove to be NP-hard.
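The rank-one ℓ1-LRA problem discussed in the abstract asks for vectors u, v minimizing Σᵢⱼ |Mᵢⱼ − uᵢvⱼ|. Although the paper proves this is NP-hard, a common heuristic is alternating minimization: with v fixed, each uᵢ is the solution of a one-dimensional weighted-median problem, and vice versa. The sketch below illustrates this local-search idea; the function names and initialization are our own illustrative choices, not taken from the paper, and no global-optimality guarantee is implied.

```python
import numpy as np

def weighted_median(values, weights):
    # Returns a minimizer of sum_j weights[j] * |x - values[j]| over x.
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    idx = np.searchsorted(cum, 0.5 * w.sum())
    return v[idx]

def rank_one_l1_lra(M, iters=50):
    """Heuristic alternating minimization for min_{u,v} ||M - u v^T||_1.

    A local-search sketch only: since rank-one l1-LRA is NP-hard,
    this is not guaranteed to find the global optimum.
    """
    m, n = M.shape
    # Initialize v with the row of M having the largest absolute sum
    # (an arbitrary but reasonable starting point).
    v = M[np.argmax(np.abs(M).sum(axis=1))].astype(float).copy()
    u = np.zeros(m)
    for _ in range(iters):
        # With v fixed, each u_i solves a 1-D weighted-median problem:
        # min_{u_i} sum_j |v_j| * |M_ij / v_j - u_i|  (over nonzero v_j).
        for i in range(m):
            nz = v != 0
            if nz.any():
                u[i] = weighted_median(M[i, nz] / v[nz], np.abs(v[nz]))
        # Symmetric update for v with u fixed.
        for j in range(n):
            nz = u != 0
            if nz.any():
                v[j] = weighted_median(M[nz, j] / u[nz], np.abs(u[nz]))
    return u, v
```

On an exactly rank-one input the alternating updates recover a factorization with zero ℓ1 residual; on noisy data they only converge to a local optimum, consistent with the hardness result.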
ISSN: 0364-765X; 1526-5471
DOI:10.1287/moor.2017.0895