\(\beta\)-VAEs can retain label information even at high compression

Bibliographic Details
Published in: arXiv.org 2018-12
Authors: Fertig, Emily; Arbabi, Aryan; Alemi, Alexander A
Format: Article
Language: English
Online access: Full text
Description
Abstract: In this paper, we investigate the degree to which the encoding of a \(\beta\)-VAE captures label information across multiple architectures on Binary Static MNIST and Omniglot. Even though they are trained in a completely unsupervised manner, we demonstrate that a \(\beta\)-VAE can retain a large amount of label information, even when asked to learn a highly compressed representation.
ISSN:2331-8422