Connections Between Nuclear-Norm and Frobenius-Norm-Based Representations
Published in: | IEEE Transactions on Neural Networks and Learning Systems, 2018-01, Vol. 29 (1), p. 218-224 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Many works have shown that the Frobenius-norm-based representation (FNR) is competitive with sparse representation and the nuclear-norm-based representation (NNR) in numerous tasks such as subspace clustering. Despite the experimental success of FNR, little theoretical analysis has been provided to explain its working mechanism. In this brief, we fill this gap by establishing theoretical connections between FNR and NNR. More specifically, we prove that: 1) when the dictionary provides enough representative capacity, FNR is exactly NNR even when the data set contains Gaussian noise, Laplacian noise, or sample-specified corruption; and 2) otherwise, FNR and NNR are two solutions on the column space of the dictionary. |
---|---|
ISSN: | 2162-237X, 2162-2388 |
DOI: | 10.1109/TNNLS.2016.2608834 |
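
The equivalence claimed in point 1) of the abstract can be illustrated in its simplest setting. Below is a minimal numerical sketch, assuming the standard noiseless, self-expressive formulations (dictionary D = X): FNR as min ||C||_F s.t. X = XC, whose minimum-norm solution is the pseudoinverse product X⁺X, and NNR as min ||C||_* s.t. X = XC, whose known closed-form minimizer is the shape interaction matrix V_r V_rᵀ from the skinny SVD of X. This is only an illustration of the noiseless special case, not the paper's proofs for noisy or corrupted data, and all variable names are illustrative.

```python
import numpy as np

# Sketch of the noiseless, self-expressive case (dictionary D = X), assuming the
# standard formulations:
#   FNR:  min ||C||_F  s.t. X = X C  ->  minimum-norm solution  C = pinv(X) @ X
#   NNR:  min ||C||_*  s.t. X = X C  ->  shape interaction matrix  C = V_r V_r^T
# The two closed forms should coincide, illustrating claim 1) in this restricted setting.

rng = np.random.default_rng(0)

# Samples drawn from a union of two 3-dimensional subspaces of R^30, so X is rank-deficient.
basis1 = rng.standard_normal((30, 3))
basis2 = rng.standard_normal((30, 3))
X = np.hstack([basis1 @ rng.standard_normal((3, 20)),
               basis2 @ rng.standard_normal((3, 20))])   # shape (30, 40), rank <= 6

tol = 1e-10  # common relative cutoff so both solutions use the same effective rank

# FNR: minimum Frobenius-norm solution of X = X C.
C_fnr = np.linalg.pinv(X, rcond=tol) @ X

# NNR: closed-form nuclear-norm minimizer V_r V_r^T from the skinny SVD of X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
r = int(np.sum(s > tol * s[0]))   # numerical rank of X
Vr = Vt[:r].T
C_nnr = Vr @ Vr.T

print("max |C_fnr - C_nnr|:", np.max(np.abs(C_fnr - C_nnr)))   # agrees up to rounding error
print("reconstruction error:", np.linalg.norm(X - X @ C_fnr))  # X = X C holds numerically
```

With noise or sample-specific corruption, the abstract's statement involves an explicit error term in the model; this toy comparison covers only the clean, self-expressive case.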