Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer

We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, which consists of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.
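The abstract describes fine-tuning the pre-trained multilingual seq2seq model mBART on parallel informal-to-formal sentence pairs. As a rough illustration of that setup, here is a minimal Python sketch using the Hugging Face transformers library; the checkpoint name, the Italian toy pair, and the hyperparameters are illustrative assumptions, not the authors' exact configuration.

import torch
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

# Assumed mBART checkpoint; the paper fine-tunes mBART, not necessarily this variant.
model_name = "facebook/mbart-large-50"
tokenizer = MBart50TokenizerFast.from_pretrained(
    model_name, src_lang="it_IT", tgt_lang="it_IT"
)
model = MBartForConditionalGeneration.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# Hypothetical informal -> formal pair; in the paper this role is played by
# machine-translated data and gold aligned English sentences.
pairs = [("ciao, mi dici che ore sono?", "Buongiorno, potrebbe dirmi che ore sono?")]

batch = tokenizer(
    [src for src, _ in pairs],
    text_target=[tgt for _, tgt in pairs],
    return_tensors="pt",
    padding=True,
)
loss = model(**batch).loss  # standard seq2seq cross-entropy on the formal side
loss.backward()
optimizer.step()

# Inference: rewrite an informal sentence, forcing the target-language BOS token.
inputs = tokenizer("ciao, mi dici che ore sono?", return_tensors="pt")
out = model.generate(
    **inputs, forced_bos_token_id=tokenizer.lang_code_to_id["it_IT"], max_length=48
)
print(tokenizer.decode(out[0], skip_special_tokens=True))

The modular variant proposed in the paper would further split training into separate language-adaptation and task-adaptation stages; the sketch above shows only plain end-to-end fine-tuning.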

Bibliographic Details
Main authors: Lai, Huiyuan; Toral, Antonio; Nissim, Malvina
Format: Article
Language: English
Subjects: Computer Science - Computation and Language
Online access: Order full text
creator Lai, Huiyuan; Toral, Antonio; Nissim, Malvina
doi 10.48550/arxiv.2203.08552
format Article
date 2022-03-16
rights http://arxiv.org/licenses/nonexclusive-distrib/1.0
identifier DOI: 10.48550/arxiv.2203.08552
language eng
recordid cdi_arxiv_primary_2203_08552
source arXiv.org
subjects Computer Science - Computation and Language
title Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer
url https://arxiv.org/abs/2203.08552