Dimensionality reduction via kernel sparse representation
Saved in:
Published in: Frontiers of Computer Science, 2014-10, Vol. 8 (5), p. 807-815
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Dimensionality reduction (DR) methods based on sparse representation, one of the most active research topics of recent years, have achieved remarkable performance in many applications. However, existing sparse representation based methods struggle with nonlinear problems because they seek a sparse representation of the data in the original space. Motivated by the kernel trick, we propose a new framework called empirical kernel sparse representation (EKSR) to solve nonlinear problems. In this framework, nonlinearly separable data are mapped into a kernel space in which their nonlinear similarity can be captured, and the data in kernel space are then reconstructed by sparse representation to preserve the sparse structure, which is obtained by minimizing an ℓ1 regularization-related objective function. EKSR provides new insights into dimensionality reduction and yields two models: 1) empirical kernel sparsity preserving projection (EKSPP), a feature extraction method based on sparsity preserving projection (SPP); and 2) empirical kernel sparsity score (EKSS), a feature selection method based on sparsity score (SS). Both methods choose neighborhoods automatically thanks to the natural discriminative power of sparse representation. Compared with several existing approaches, the proposed framework reduces computational complexity and is more convenient in practice.
ISSN: 2095-2228, 2095-2236
DOI: 10.1007/s11704-014-3317-1
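
The abstract describes a two-step recipe: an empirical kernel mapping, followed by ℓ1-regularized sparse reconstruction of each mapped sample over all the others (the SPP-style step). Below is a minimal sketch of that recipe, not the authors' reference implementation: the RBF kernel, its `gamma`, and the Lasso penalty `alpha` are illustrative assumptions, and scikit-learn's `Lasso` stands in for whatever ℓ1 solver the paper uses.

```python
# A minimal sketch of the EKSR idea from the abstract; the kernel choice
# and all parameters are illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.linear_model import Lasso

def empirical_kernel_map(X, gamma=0.5):
    """Map samples into empirical kernel space: Phi = K U diag(w)^(-1/2),
    where K = U diag(w) U^T is the eigendecomposition of the kernel matrix."""
    K = rbf_kernel(X, X, gamma=gamma)         # n x n similarity matrix
    w, U = np.linalg.eigh(K)
    keep = w > 1e-10                          # drop numerically zero eigenvalues
    return K @ U[:, keep] / np.sqrt(w[keep])  # rows are the mapped samples

def sparse_weight_matrix(Phi, alpha=0.05):
    """Reconstruct each mapped sample from all the others via l1-penalized
    least squares; nonzero coefficients pick each sample's neighborhood
    automatically, as the abstract notes."""
    n = Phi.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.delete(np.arange(n), i)      # dictionary = all other samples
        model = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        model.fit(Phi[idx].T, Phi[i])         # sparse code of sample i
        S[i, idx] = model.coef_
    return S

rng = np.random.RandomState(0)
X = rng.randn(60, 10)                         # toy data
Phi = empirical_kernel_map(X)
S = sparse_weight_matrix(Phi)
print(S.shape, (np.abs(S) > 1e-8).sum(axis=1).mean())  # avg neighbors per sample
```

The resulting weight matrix S is the kernel-space analogue of the SPP affinity matrix; the paper's EKSPP and EKSS models would derive projections or feature scores from it, which is beyond this sketch.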