Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer
Format: Article
Language: English
Abstract: We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold-aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach to multilingual formality transfer, consisting of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.
DOI: 10.48550/arxiv.2203.08552