$\beta$-VAEs can retain label information even at high compression
| Field | Value |
|---|---|
| Format | Article |
| Language | eng |
| Summary | In this paper, we investigate the degree to which the encoding of a $\beta$-VAE captures label information across multiple architectures on Binary Static MNIST and Omniglot. Even though they are trained in a completely unsupervised manner, we demonstrate that a $\beta$-VAE can retain a large amount of label information, even when asked to learn a highly compressed representation. |
| DOI | 10.48550/arxiv.1812.02682 |
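For context (not part of the record itself): the $\beta$-VAE referred to in the summary is the standard variant of the variational autoencoder whose training objective weights the KL term by a factor $\beta$, so that $\beta > 1$ pushes the model toward more compressed latent representations:

```latex
\mathcal{L}(\theta, \phi; x)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  - \beta \, D_{\mathrm{KL}}\!\left(q_\phi(z \mid x) \,\|\, p(z)\right)
```

Here $q_\phi(z \mid x)$ is the encoder, $p_\theta(x \mid z)$ the decoder, and $p(z)$ the prior; the paper's claim is that label information survives in $q_\phi(z \mid x)$ even at large $\beta$.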