Cross-lingual Transfer of Monolingual Models

Recent studies in zero-shot cross-lingual learning using multilingual models have falsified the previous hypothesis that shared vocabulary and joint pre-training are the keys to cross-lingual generalization. Inspired by this advancement, we introduce a cross-lingual transfer method for monolingual models based on domain adaptation. We study the effects of such transfer from four different languages to English. Our experimental results on GLUE show that the transferred models outperform the native English model independently of the source language. After probing the English linguistic knowledge encoded in the representations before and after transfer, we find that semantic information is retained from the source language, while syntactic information is learned during transfer. Additionally, the results of evaluating the transferred models in source language tasks reveal that their performance in the source domain deteriorates after transfer.
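As a rough illustration of what "a cross-lingual transfer method for monolingual models based on domain adaptation" can look like in practice, the sketch below continues masked-language-model pretraining of a monolingual source-language checkpoint on English text using the Hugging Face transformers and datasets libraries. The checkpoint name (bert-base-german-cased), the WikiText-103 corpus, and all hyperparameters are illustrative assumptions rather than the authors' setup; in particular, the paper may treat the tokenizer and embedding layer differently, whereas this sketch simply reuses the source-language vocabulary.

    # Illustrative sketch only: continued masked-language-model (MLM) pretraining
    # of a monolingual source-language checkpoint on English text, one plausible
    # reading of "cross-lingual transfer via domain adaptation". Checkpoint name,
    # corpus, and hyperparameters are assumptions, not the paper's configuration.
    from datasets import load_dataset
    from transformers import (
        AutoModelForMaskedLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    SOURCE_CHECKPOINT = "bert-base-german-cased"  # assumed source-language model

    tokenizer = AutoTokenizer.from_pretrained(SOURCE_CHECKPOINT)
    model = AutoModelForMaskedLM.from_pretrained(SOURCE_CHECKPOINT)

    # English target corpus; a small WikiText-103 slice stands in for the
    # English pretraining data used during transfer.
    english = load_dataset("wikitext", "wikitext-103-raw-v1", split="train[:1%]")

    def tokenize(batch):
        # Reusing the source-language tokenizer is a simplification made here;
        # vocabulary/embedding handling may differ in the actual method.
        return tokenizer(batch["text"], truncation=True, max_length=128)

    tokenized = english.map(tokenize, batched=True, remove_columns=["text"])

    # Standard MLM objective: randomly mask 15% of tokens and predict them.
    collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

    args = TrainingArguments(
        output_dir="transferred-to-english",
        per_device_train_batch_size=16,
        num_train_epochs=1,
        learning_rate=5e-5,
    )

    Trainer(
        model=model,
        args=args,
        train_dataset=tokenized,
        data_collator=collator,
    ).train()

A checkpoint produced this way would then be fine-tuned and evaluated on GLUE tasks and compared against a native English model, mirroring the evaluation described in the abstract.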

Bibliographic Details

Main Authors: Gogoulou, Evangelia; Ekgren, Ariel; Isbister, Tim; Sahlgren, Magnus
Format: Article
Language: English
Date: 2021-09-15
DOI: 10.48550/arxiv.2109.07348
Subjects: Computer Science - Computation and Language; Computer Science - Learning
Source: arXiv.org
Online Access: Order full text