Click: Controllable Text Generation with Sequence Likelihood Contrastive Learning
Format: Article
Language: English
Abstract: It has always been an important yet challenging problem to control language models to avoid generating texts with undesirable attributes, such as toxic language and unnatural repetition. We introduce Click for controllable text generation, which needs no modification to the model architecture and facilitates out-of-the-box use of trained models. It employs a contrastive loss on sequence likelihood, which fundamentally decreases the generation probability of negative samples (i.e., generations with undesirable attributes). It also adopts a novel likelihood ranking-based strategy to construct contrastive samples from model generations. On the tasks of language detoxification, sentiment steering, and repetition reduction, we show that Click outperforms strong baselines of controllable text generation and demonstrate the superiority of Click's sample construction strategy.
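The abstract's core idea is a contrastive loss on sequence likelihood: the model's likelihood of a positive sample (a generation with the desired attribute) is pushed above that of a negative sample (one with the undesired attribute). As a rough illustration only, the following is a minimal PyTorch sketch of one common pairwise formulation, a hinge loss with a margin; the function names, the length normalization, and the margin value are illustrative assumptions, not the paper's exact objective.

    import torch
    import torch.nn.functional as F

    def sequence_log_likelihood(logits, labels, pad_id=0):
        # Length-normalized log-likelihood of each sequence in the batch.
        # logits: (batch, seq_len, vocab); labels: (batch, seq_len).
        log_probs = F.log_softmax(logits, dim=-1)
        token_ll = log_probs.gather(-1, labels.unsqueeze(-1)).squeeze(-1)
        mask = (labels != pad_id).float()
        return (token_ll * mask).sum(dim=-1) / mask.sum(dim=-1).clamp(min=1.0)

    def contrastive_likelihood_loss(pos_ll, neg_ll, margin=1.0):
        # Hinge loss: require the positive sample's sequence likelihood
        # to exceed the negative sample's by at least `margin`.
        return F.relu(margin - (pos_ll - neg_ll)).mean()

    # Toy usage with random tensors standing in for model outputs:
    batch, seq_len, vocab = 4, 16, 100
    pos_logits = torch.randn(batch, seq_len, vocab)
    neg_logits = torch.randn(batch, seq_len, vocab)
    pos_labels = torch.randint(1, vocab, (batch, seq_len))
    neg_labels = torch.randint(1, vocab, (batch, seq_len))
    loss = contrastive_likelihood_loss(
        sequence_log_likelihood(pos_logits, pos_labels),
        sequence_log_likelihood(neg_logits, neg_labels),
    )

In practice such a term would be combined with the standard language modeling loss on positive samples, and, as the abstract notes, the positive and negative samples themselves would be constructed from the model's own generations via the likelihood ranking-based strategy.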
DOI: 10.48550/arxiv.2306.03350