GumBolt: Extending Gumbel trick to Boltzmann priors
Format: Article
Language: English
Abstract: Boltzmann machines (BMs) are appealing candidates for powerful priors in variational autoencoders (VAEs), as they are capable of capturing nontrivial and multi-modal distributions over discrete variables. However, the non-differentiability of the discrete units prohibits use of the reparameterization trick, which is essential for low-noise backpropagation. The Gumbel trick resolves this problem in a consistent way by relaxing the variables and distributions, but it is incompatible with BM priors. Here, we propose GumBolt, a model that extends the Gumbel trick to BM priors in VAEs. GumBolt is significantly simpler than recently proposed methods with BM priors and outperforms them by a considerable margin. It achieves state-of-the-art performance on the permutation-invariant MNIST and OMNIGLOT datasets among models with only discrete latent variables. Moreover, performance can be further improved by allowing multi-sample (importance-weighted) estimation of the log-likelihood during training, which was not possible with previous models.
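The abstract refers to the Gumbel trick, which replaces discrete latent samples with continuous, temperature-controlled surrogates so that the reparameterization trick can be applied. The sketch below is not the paper's code; it is a minimal illustration, assuming binary (Bernoulli) latent units and NumPy, of the binary Gumbel-softmax ("concrete") relaxation that this family of methods builds on.

```python
import numpy as np

def sample_gumbel(shape, eps=1e-20):
    """Standard Gumbel noise: -log(-log(U)) with U ~ Uniform(0, 1)."""
    u = np.random.uniform(low=eps, high=1.0, size=shape)
    return -np.log(-np.log(u))

def relaxed_bernoulli(logits, tau=1.0):
    """Continuous relaxation of Bernoulli(sigmoid(logits)) samples.

    The difference of two Gumbel variates, pushed through a tempered
    sigmoid, gives a differentiable surrogate for a binary sample.
    """
    g1 = sample_gumbel(logits.shape)
    g0 = sample_gumbel(logits.shape)
    return 1.0 / (1.0 + np.exp(-(logits + g1 - g0) / tau))

# Example: relax four binary latent units with the given logits.
logits = np.array([2.0, -1.0, 0.5, 0.0])
print(relaxed_bernoulli(logits, tau=0.5))  # values in (0, 1), near 0 or 1 for small tau
```

As the temperature tau approaches zero, the relaxed samples approach exact {0, 1} values and the discrete behaviour is recovered; in practice tau is annealed or tuned as a hyperparameter to trade gradient bias against variance.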
DOI: 10.48550/arxiv.1805.07349