Using latent space regression to analyze and leverage compositionality in GANs
Saved in:
Main Authors: | , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Summary: | In recent years, Generative Adversarial Networks have become ubiquitous in
both research and public perception, but how GANs convert an unstructured
latent code to a high-quality output is still an open question. In this work,
we investigate regression into the latent space as a probe to understand the
compositional properties of GANs. We find that combining the regressor and a
pretrained generator provides a strong image prior, allowing us to create
composite images from a collage of random image parts at inference time while
maintaining global consistency. To compare compositional properties across
different generators, we measure the trade-offs between reconstruction of the
unrealistic input and image quality of the regenerated samples. We find that
the regression approach enables more localized editing of individual image
parts compared to direct editing in the latent space, and we conduct
experiments to quantify this independence effect. Our method is agnostic to the
semantics of edits, and does not require labels or predefined concepts during
training. Beyond image composition, our method extends to a number of related
applications, such as image inpainting or example-based image editing, which we
demonstrate on several GANs and datasets, and because it uses only a single
forward pass, it can operate in real-time. Code is available on our project
page: https://chail.github.io/latent-composition/. |
DOI: | 10.48550/arxiv.2103.10426 |
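
The summary above describes a concrete pipeline: a regressor maps an image, even an unrealistic collage of parts, into the GAN's latent space, and the pretrained generator decodes that code back into a globally consistent image in a single forward pass. Below is a minimal sketch of that idea in PyTorch, under stated assumptions: the encoder `E`, generator `G`, and `make_collage` helper are hypothetical placeholders, not the authors' actual API (see the project page for the real code).

```python
# Minimal sketch of the encode-regenerate idea from the summary above.
# ASSUMPTIONS: `G` (a pretrained GAN generator) and `E` (a latent regressor
# trained to approximately invert G) are hypothetical placeholder modules.
import torch
import torch.nn as nn


def make_collage(parts: list[torch.Tensor], masks: list[torch.Tensor]) -> torch.Tensor:
    """Paste image parts onto a canvas; the result may be globally inconsistent."""
    canvas = torch.zeros_like(parts[0])
    for part, mask in zip(parts, masks):
        # Each binary mask selects where its part is pasted over the canvas.
        canvas = mask * part + (1 - mask) * canvas
    return canvas


@torch.no_grad()
def regenerate(collage: torch.Tensor, E: nn.Module, G: nn.Module) -> torch.Tensor:
    """Single forward pass: collage -> latent code -> coherent image."""
    z = E(collage)  # regress the (unrealistic) input into the latent space
    return G(z)     # the generator acts as an image prior, smoothing seams
```

Because `E` and `G` each run in one forward pass, this structure is consistent with the real-time operation the summary mentions; inpainting can be read as the special case where the masked-out regions of the input are simply left empty.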