Principled Paraphrase Generation with Parallel Corpora
Format: Article
Language: English
Abstract: Round-trip Machine Translation (MT) is a popular choice for paraphrase generation, as it leverages readily available parallel corpora for supervision. In this paper, we formalize the implicit similarity function induced by this approach and show that it is susceptible to non-paraphrase pairs sharing a single ambiguous translation. Based on these insights, we design an alternative similarity metric that mitigates this issue by requiring the entire translation distribution to match, and implement a relaxation of it through the Information Bottleneck method. Our approach incorporates an adversarial term into MT training in order to learn representations that encode as much information about the reference translation as possible while keeping as little information about the input as possible. Paraphrases can be generated by decoding back to the source from this representation, without having to generate pivot translations. In addition to being more principled and efficient than round-trip MT, our approach offers an adjustable parameter to control the fidelity-diversity trade-off, and obtains better results in our experiments.
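The Information Bottleneck relaxation mentioned in the abstract is conventionally written as the following Lagrangian (this is the standard formulation due to Tishby et al.; the notation here is generic and not necessarily the paper's own):

```latex
\min_{p(z \mid x)} \; I(Z; X) \;-\; \beta \, I(Z; Y)
```

where $X$ is the input sentence, $Y$ the reference translation, $Z$ the learned representation, and $\beta > 0$ trades off the two mutual-information terms. Under this reading, maximizing $I(Z;Y)$ keeps the representation informative about the reference translation (fidelity), while minimizing $I(Z;X)$, realized through the adversarial term in MT training, discards input-specific surface information (diversity); $\beta$ would then correspond to the adjustable fidelity-diversity parameter described above.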
DOI: 10.48550/arxiv.2205.12213