Robust PCA using Matrix Factorization for Background/Foreground Separation
Published in: IEEE Access, 2018-01, Vol. 6, p. 1-1
Main Authors: , , , , ,
Format: Article
Language: English
Subjects:
Online Access: Full text
Abstract: Background/foreground separation has become an essential step in numerous image and video processing applications, such as image/video inpainting, anomaly detection, motion segmentation, and augmented reality. Recent low-rank approaches, such as robust principal component analysis (RPCA), which separates a data matrix into a low-rank matrix plus a sparse matrix, have achieved encouraging performance. However, these approaches usually incur a relatively high computational cost, mainly due to the full or partial singular value decomposition of large matrices. Moreover, the nuclear norm is widely used as a convex surrogate of the rank function, yet it is not a tight approximation of the rank function. To address these issues, this paper proposes a fast background/foreground separation algorithm in which the low-rank constraint is handled by a matrix factorization scheme, greatly reducing the computational cost. Two non-convex low-rank approximations are further adopted to improve the robustness and flexibility over the traditional nuclear norm. Compared with state-of-the-art low-rank reconstruction methods, experimental results on several challenging real-world datasets show superior performance in both image clarity and computational efficiency.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2818322
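As described in the abstract, methods of this kind stack each vectorized video frame as a column of a data matrix D and seek a decomposition D ≈ L + S, where the low-rank part L captures the (mostly static) background and the sparse part S captures the moving foreground, with L represented through a factorization instead of repeated SVDs. The sketch below is only a minimal illustration of that general factorization-plus-soft-thresholding idea in NumPy; the function name rpca_factorized, the alternating least-squares factor updates, the fixed rank, and the threshold lam are assumptions chosen for demonstration and do not reproduce the paper's non-convex low-rank approximations or its actual algorithm.

```python
import numpy as np

def rpca_factorized(D, rank=2, lam=None, n_iters=50):
    """Illustrative RPCA-style split D ~ U @ V.T + S.

    NOTE: this is a generic sketch, not the paper's method. The low-rank
    part is constrained by an explicit rank-`rank` factorization (updated
    by alternating least squares), and the sparse part is obtained by
    soft-thresholding the residual with weight `lam`.
    """
    m, n = D.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))  # common heuristic, assumed here
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    S = np.zeros_like(D)
    for _ in range(n_iters):
        # Fit the low-rank "background" to D - S via least squares on each factor.
        B = D - S
        U = B @ V @ np.linalg.pinv(V.T @ V)
        V = B.T @ U @ np.linalg.pinv(U.T @ U)
        L = U @ V.T
        # Sparse "foreground": soft-threshold the remaining residual.
        R = D - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return U @ V.T, S

# Example usage (hypothetical data): 100 frames of 64x48 grayscale video,
# each frame flattened into one column of D.
frames = np.random.rand(64 * 48, 100)
background, foreground = rpca_factorized(frames, rank=1)
```

In a background/foreground setting, the columns of the returned low-rank matrix approximate the background of each frame and the sparse matrix highlights moving objects; the rank, threshold, and iteration count above are illustrative defaults and would need tuning for real sequences.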