Recurrent Estimation of Distributions
Format: Article
Language: English
Online access: Order full text
Abstract: This paper presents the recurrent estimation of distributions (RED) for modeling real-valued data in a semiparametric fashion. RED models make two novel uses of recurrent neural networks (RNNs) for density estimation of general real-valued data. First, RNNs are used to transform input covariates into a latent space to better capture conditional dependencies in the inputs. Second, an RNN is used to compute the conditional distributions of the latent covariates. The resulting model is efficient to train, compute, and sample from, while producing normalized pdfs. The effectiveness of RED is demonstrated in several real-world data experiments. Our results show that RED models achieve a lower held-out negative log-likelihood than other neural network approaches across multiple dataset sizes and dimensionalities. Further evidence of RED's efficacy is provided by anomaly detection tasks, where we also observe better performance than alternative models.
DOI: 10.48550/arxiv.1705.10750
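
The abstract describes RED as an RNN-based autoregressive density model: conditional distributions of the (latent) covariates are computed one dimension at a time, so the joint pdf is normalized by the chain rule and sampling proceeds sequentially. Below is a minimal sketch of that general idea in PyTorch; it is not the authors' reference implementation, it omits RED's latent-space transformation of the covariates, and the Gaussian-mixture output head, hidden size, and names such as `RNNDensityEstimator`, `log_prob`, and `sample` are illustrative assumptions.

```python
import torch
import torch.nn as nn


class RNNDensityEstimator(nn.Module):
    """Autoregressive density sketch: p(x) = prod_d p(x_d | x_{<d})."""

    def __init__(self, dim, hidden=64, n_components=5):
        super().__init__()
        self.dim = dim
        # The RNN state summarizes x_{<d}; a linear head maps it to the
        # parameters of a Gaussian mixture for the next dimension.
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 3 * n_components)

    def _mixture(self, h):
        logits, mu, log_sigma = self.head(h).chunk(3, dim=-1)
        comp = torch.distributions.Normal(mu, log_sigma.exp())
        mix = torch.distributions.Categorical(logits=logits)
        return torch.distributions.MixtureSameFamily(mix, comp)

    def log_prob(self, x):
        # x: (batch, dim); shift right so step d conditions only on x_{<d}.
        b = x.size(0)
        shifted = torch.cat([x.new_zeros(b, 1), x[:, :-1]], dim=1)
        h, _ = self.rnn(shifted.unsqueeze(-1))           # (b, dim, hidden)
        # Chain rule: summing per-dimension conditional log-densities
        # yields a normalized joint pdf by construction.
        return self._mixture(h).log_prob(x).sum(dim=-1)  # (b,)

    @torch.no_grad()
    def sample(self, n):
        xs, state = [], None
        prev = torch.zeros(n, 1, 1)
        for _ in range(self.dim):
            out, state = self.rnn(prev, state)
            x_d = self._mixture(out[:, -1]).sample()     # (n,)
            xs.append(x_d)
            prev = x_d.view(n, 1, 1)
        return torch.stack(xs, dim=-1)                   # (n, dim)
```

Under these assumptions, training would minimize the criterion reported in the abstract, the held-out negative log-likelihood, e.g. `loss = -model.log_prob(batch).mean()` inside a standard optimizer loop.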