\(\beta\)-VAEs can retain label information even at high compression
Saved in:
Published in: | arXiv.org 2018-12 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | English |
Online Access: | Full text |
Summary: | In this paper, we investigate the degree to which the encoding of a \(\beta\)-VAE captures label information across multiple architectures on Binary Static MNIST and Omniglot. Even though they are trained in a completely unsupervised manner, we demonstrate that a \(\beta\)-VAE can retain a large amount of label information, even when asked to learn a highly compressed representation. |
---|---|
ISSN: | 2331-8422 |