Controllable Text Generation via Probability Density Estimation in the Latent Space
Main Authors:
Format: Article
Language: English
Subjects:
Online Access: Order full text
Abstract: Previous work on controllable text generation has explored the idea of control from the latent space, such as optimizing a representation with attribute-related classifiers or sampling a representation from relevant discrete samples. However, these approaches do not model the latent space and the control effectively enough, leaving the controlled text with low quality and diversity. In this work, we propose a novel control framework using probability density estimation in the latent space. Our method utilizes an invertible transformation function, a Normalizing Flow, that maps the complex distributions in the latent space to simple Gaussian distributions in the prior space. Thus, we can perform sophisticated and flexible control in the prior space and feed the control effects back into the latent space owing to the one-to-one mapping property of invertible transformations. Experiments on single-attribute and multi-attribute control show that our method outperforms several strong baselines on attribute relevance and text quality and achieves state-of-the-art results. Further analysis of control-strength adjustment demonstrates the flexibility of our control strategy.
DOI: 10.48550/arxiv.2212.08307
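The abstract describes mapping latent representations onto a Gaussian prior with a normalizing flow, manipulating points in that prior space, and inverting the flow to push the control effect back into the latent space. The sketch below illustrates that general idea only; it is not the authors' implementation. The `AffineCoupling` layer, the synthetic latent vectors, the `attribute_direction`, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's code): train a normalizing flow
# that maps latent vectors z to a standard Gaussian prior, shift samples in the
# prior space along an assumed attribute direction, and invert the flow to get
# controlled latent vectors for a decoder.
import torch
import torch.nn as nn

latent_dim = 8  # assumed dimensionality of the encoder's latent space


class AffineCoupling(nn.Module):
    """RealNVP-style affine coupling layer: invertible by construction.
    Real flows stack several such layers with permutations in between."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z):
        z1, z2 = z[:, : self.half], z[:, self.half:]
        s, t = self.net(z1).chunk(2, dim=-1)
        s = torch.tanh(s)  # bounded scales for numerical stability
        u2 = z2 * torch.exp(s) + t
        log_det = s.sum(dim=-1)  # log|det J| of the affine transform
        return torch.cat([z1, u2], dim=-1), log_det

    def inverse(self, u):
        u1, u2 = u[:, : self.half], u[:, self.half:]
        s, t = self.net(u1).chunk(2, dim=-1)
        s = torch.tanh(s)
        z2 = (u2 - t) * torch.exp(-s)
        return torch.cat([u1, z2], dim=-1)


flow = AffineCoupling(latent_dim)
optim = torch.optim.Adam(flow.parameters(), lr=1e-3)
prior = torch.distributions.Normal(0.0, 1.0)

# Stand-in for latent vectors of attribute-bearing sentences (synthetic data).
latents = torch.randn(512, latent_dim) * 0.5 + 1.0

# Maximum-likelihood training: push the latent distribution onto the Gaussian prior.
for _ in range(200):
    u, log_det = flow(latents)
    nll = -(prior.log_prob(u).sum(dim=-1) + log_det).mean()
    optim.zero_grad()
    nll.backward()
    optim.step()

# Control in the prior space: shift Gaussian samples along a hypothetical
# attribute axis, then map back to the latent space via the exact inverse.
with torch.no_grad():
    attribute_direction = torch.ones(latent_dim)   # hypothetical attribute axis
    u_samples = prior.sample((4, latent_dim))
    controlled_latents = flow.inverse(u_samples + 1.5 * attribute_direction)
    print(controlled_latents.shape)  # torch.Size([4, 8]) -> feed to a decoder
```

In the paper's setting, the controlled latent vectors would be decoded into text, and the magnitude of the shift in the prior space acts as a control-strength knob; the direction and scale used here are placeholders under the stated assumptions.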