The deep kernelized autoencoder
Saved in:
Published in: | Applied soft computing 2018-10, Vol.71, p.816-825 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Summary: | [Display omitted]
•Learning how to approximate the mapping into a kernel space with a deep neural network.
•An autoencoder that exploits information provided by a user-defined kernel matrix.
•Learning representations that preserve non-linear similarities in the input space.
•Drawing connections between autoencoders and kernel methods.
•Learning implicit mappings from the approximated kernel space back to the input space.
Autoencoders learn data representations (codes) in such a way that the input is reproduced at the output of the network. However, it is not always clear what kind of properties of the input data need to be captured by the codes. Kernel machines have experienced great success by operating via inner products in a theoretically well-defined reproducing kernel Hilbert space, hence capturing topological properties of the input data. In this paper, we enhance the autoencoder's ability to learn effective data representations by aligning inner products between codes with respect to a kernel matrix. By doing so, the proposed kernelized autoencoder allows learning similarity-preserving embeddings of input data, where the notion of similarity is explicitly controlled by the user and encoded in a positive semi-definite kernel matrix. Experiments evaluate both reconstruction and kernel alignment performance in classification tasks and in the visualization of high-dimensional data. Additionally, we show that our method is capable of emulating kernel principal component analysis on a denoising task, obtaining competitive results at a much lower computational cost. |
---|---|
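The alignment idea described in the abstract can be sketched as a loss term that compares the inner products of a batch of codes against a user-defined kernel matrix. The following is a minimal illustrative sketch, not the paper's exact objective: the function name, the normalized Frobenius-norm form of the alignment, and the RBF kernel choice are all assumptions introduced here for demonstration.

```python
import numpy as np

def kernel_alignment_loss(codes, K):
    """Squared Frobenius distance between the (normalized) code
    inner-product matrix and the (normalized) target kernel K.
    Normalization makes the loss compare structure, not scale."""
    C = codes @ codes.T              # inner products between codes
    C = C / np.linalg.norm(C)
    Kn = K / np.linalg.norm(K)
    return float(np.linalg.norm(C - Kn) ** 2)

# Toy batch: 5 samples, 4 input features, 3-dimensional codes.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4))      # inputs
Z = rng.standard_normal((5, 3))      # codes (stand-in for an encoder output)

# User-defined similarity: an RBF kernel on the inputs (illustrative choice).
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq_dists / 2.0)

loss = kernel_alignment_loss(Z, K)
```

In training, a term like this would be added to the usual reconstruction loss with a trade-off weight, so the encoder is pushed to produce codes whose pairwise inner products mirror the user-supplied kernel; if the codes already realize the target kernel exactly, the term vanishes.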
ISSN: | 1568-4946 1872-9681 |
DOI: | 10.1016/j.asoc.2018.07.029 |