An Out-of-Sample Extension to Manifold Learning via Meta-Modeling

| Published in: | IEEE Transactions on Image Processing, 2019-10, Vol. 28 (10), pp. 5227-5237 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | English |
| ISSN: | 1057-7149; 1941-0042 |
| DOI: | 10.1109/TIP.2019.2915162 |

Abstract: Unsupervised manifold learning has become accepted as an important tool for reducing the dimensionality of a dataset by finding a meaningful low-dimensional representation lying on an unknown nonlinear subspace. Most manifold learning methods only embed an existing dataset but do not provide an explicit mapping function for novel out-of-sample data, thereby potentially resulting in an ineffective tool for classification purposes, particularly for iterative methods such as active learning. To address this issue, out-of-sample extension methods have been introduced to generalize an existing embedding to new samples. In this paper, a novel out-of-sample method is introduced by utilizing high-dimensional model representation (HDMR) as a nonlinear multivariate regression with a Tikhonov regularizer for unsupervised manifold learning algorithms. The proposed method was extensively analyzed using illustrative datasets sampled from known manifolds. Several experiments with 3D synthetic datasets and face recognition datasets were also conducted, and the performance of the proposed method was compared to several well-known out-of-sample methods. The results obtained with locally linear embedding (LLE), Laplacian Eigenmaps (LE), and t-distributed stochastic neighbor embedding (t-SNE) showed that the proposed method achieves competitive or even better performance than the other out-of-sample methods.
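
The abstract above describes the general recipe of regression-based out-of-sample extension: embed a training set with an unsupervised manifold learner, then fit a regularized regression from the high-dimensional inputs to the learned embedding coordinates, so that unseen samples can be projected without recomputing the embedding. The sketch below illustrates only that generic idea with scikit-learn; a plain kernel ridge regression (a Tikhonov-regularized regressor) stands in for the paper's HDMR meta-model, and the swiss-roll data, kernel, and hyperparameters are arbitrary illustrative choices, not settings taken from the paper.

```python
# Illustrative sketch only: kernel ridge regression stands in for the paper's
# HDMR meta-model, which is not reproduced here. Dataset, kernel, and
# hyperparameters are arbitrary choices for demonstration.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.kernel_ridge import KernelRidge

# 1) Sample a known 3D manifold and split it into "seen" and out-of-sample points.
X, _ = make_swiss_roll(n_samples=2000, noise=0.05, random_state=0)
X_train, X_new = X[:1500], X[1500:]

# 2) Embed the training set with an unsupervised manifold learner (LLE here).
#    The LLE algorithm only embeds the points it was trained on; it gives no
#    explicit mapping for new data.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y_train = lle.fit_transform(X_train)

# 3) Learn an explicit mapping f: R^3 -> R^2 by regressing the embedding
#    coordinates on the inputs. Kernel ridge regression is Tikhonov-regularized,
#    mirroring the regularized-regression role HDMR plays in the paper.
f = KernelRidge(kernel="rbf", alpha=1e-3, gamma=0.1)
f.fit(X_train, Y_train)

# 4) Out-of-sample extension: map unseen points with the learned regressor
#    instead of re-running the manifold learner on the enlarged dataset.
Y_new = f.predict(X_new)
print(Y_new.shape)  # (500, 2)
```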