Least Square Approach to Out-of-Sample Extensions of Diffusion Maps
Saved in:
Published in: Frontiers in Applied Mathematics and Statistics, 2019-05, Vol. 5
Author:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Let X̄ = X ∪ Z be a data set in ℝ^D, where X is the training set and Z the testing set. Assume that a kernel method produces a dimensionality reduction (DR) mapping F: X → ℝ^d (d ≪ D) that maps the high-dimensional data X to its low-dimensional representation Y = F(X). The out-of-sample extension problem for dimensionality reduction is to find the DR of X̄ using an extension of F instead of re-training on the whole data set X̄. In this paper, utilizing the framework of reproducing kernel Hilbert space theory, we introduce a least-squares approach to extensions of the popular DR mappings called Diffusion maps (Dmaps). We establish a theoretical analysis for the out-of-sample DR Dmaps. This analysis also provides a uniform treatment of many popular out-of-sample algorithms based on kernel methods. We illustrate the validity of the developed out-of-sample DR algorithms in several examples.
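The extension problem the abstract describes can be illustrated with the classical Nyström-style extension that least-squares approaches of this kind generalize. The sketch below is a minimal NumPy illustration, not the paper's actual algorithm; the function names, the bandwidth `eps`, and the choice of the simplest Gaussian-kernel diffusion maps are all assumptions. It embeds a training set X and then maps new points Z through the learned eigenfunctions without re-training:

```python
import numpy as np

def diffusion_maps(X, eps, d):
    """Basic diffusion-map embedding of a training set X (n x D)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-sq / eps)                                # Gaussian kernel matrix
    P = W / W.sum(axis=1, keepdims=True)                 # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)                        # eigenvalues are real (P ~ symmetric)
    order = np.argsort(-vals.real)                       # sort eigenpairs, largest first
    vals, vecs = vals.real[order], vecs.real[:, order]
    lam, psi = vals[1:d + 1], vecs[:, 1:d + 1]           # drop trivial pair (1, constant)
    Y = psi * lam                                        # embedding: Y[i, k] = lam_k * psi_k(x_i)
    return Y, lam, psi

def nystrom_extend(Z, X, eps, psi):
    """Nystrom-style out-of-sample embedding of new points Z (m x D)."""
    sq = ((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    Wz = np.exp(-sq / eps)
    Pz = Wz / Wz.sum(axis=1, keepdims=True)              # transition probabilities z -> x_j
    # lam_k * psi_k(z) = sum_j P(z, x_j) psi_k(x_j), i.e. one matrix product
    return Pz @ psi
```

As a sanity check, extending a training point reproduces its original embedding, since P ψ = λ ψ holds exactly on the training set.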
ISSN: 2297-4687
DOI: 10.3389/fams.2019.00024