Nonlinear Dimension Reduction by PDF Estimation


Full Description

Bibliographic Details
Published in: IEEE Transactions on Signal Processing, 2022, Vol. 70, pp. 1493-1505
Main Authors: Baggenstoss, Paul M.; Kay, Steven
Format: Article
Language: English
Description
Summary: A new information criterion is proposed for nonlinear dimension reduction (NLDR) based on probability density function (PDF) estimation. As the PDF is estimated, the transformation to the latent space is learned and information transfer is maximized. The method (a) maximizes information at the output, (b) makes no assumptions about the data structure, (c) is invariant to invertible transformations, (d) can produce any desired output distribution (such as independent uniform or Gaussian latent variables), and (e) is completely general. In addition to performing dimension reduction, the approach results in a complete statistical model of the data, including a tractable likelihood function and the ability to generate synthetic data. The method specializes to principal component analysis (PCA) for the linear/Gaussian case. When the transformation has limited approximation power, the trade-off between information transfer and approximating the desired output distribution can be controlled using a constant \beta, analogous to the \beta in \beta-VAE. For efficiency, the method can be implemented with a neural network architecture using a variation of PDF projection, called a projected belief network (PBN). In experiments with high-dimensional non-Gaussian input data, the superiority of the PBN is shown relative to PCA, a restricted Boltzmann machine (RBM), and a \beta-variational auto-encoder (\beta-VAE).
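The summary notes that the method specializes to PCA in the linear/Gaussian case. The following is a minimal PCA sketch of that baseline only, not an implementation of the paper's PBN: a linear map to a lower-dimensional latent space that maximizes retained variance (and, for Gaussian data, information transfer). The synthetic data and latent dimension `k` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with low-rank (rank-3) structure plus small noise,
# standing in for the "linear/Gaussian case" mentioned in the summary.
A = rng.normal(size=(10, 3))
X = rng.normal(size=(500, 3)) @ A.T + 0.1 * rng.normal(size=(500, 10))

# Center the data and compute principal directions via SVD.
mu = X.mean(axis=0)
Xc = X - mu
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 3                      # assumed latent dimension
W = Vt[:k].T               # (10, k) projection matrix
Z = Xc @ W                 # latent representation (dimension-reduced data)
X_hat = Z @ W.T + mu       # reconstruction from the latent space

# Fraction of total variance retained by the k-dimensional latent space.
retained = (S[:k] ** 2).sum() / (S ** 2).sum()
print(f"variance retained: {retained:.3f}")
```

For Gaussian data this linear projection is information-optimal; the PBN described in the article generalizes this idea to nonlinear transformations of non-Gaussian data while keeping a tractable likelihood.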
ISSN:1053-587X
1941-0476
DOI:10.1109/TSP.2022.3151317