Training Dynamics for Curriculum Learning: A Study on Monolingual and Cross-lingual NLU

Curriculum Learning (CL) is a technique for training models by ranking examples in order of typically increasing difficulty, with the aim of accelerating convergence and improving generalisability. Current approaches for Natural Language Understanding (NLU) tasks use CL to improve in-distribution performance, often via heuristic or task-agnostic difficulty metrics. In this work, instead, we employ CL for NLU by taking advantage of training dynamics as difficulty metrics, i.e., statistics that measure the behaviour of the model at hand on specific task-data instances during training, and we propose modifications of existing CL schedulers based on these statistics. Unlike existing work, we focus on evaluating models on in-distribution (ID), out-of-distribution (OOD), and zero-shot (ZS) cross-lingual transfer datasets. We show across several NLU tasks that CL with training dynamics improves performance, mostly in zero-shot cross-lingual transfer and OOD settings, with gains of up to 8.5% in certain cases. Overall, the experiments indicate that training dynamics can lead to better-performing models and smoother training than other difficulty metrics, while being 20% faster on average. In addition, our analysis sheds light on the correlations between task-specific and task-agnostic metrics.
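The core idea of the abstract, scoring each training instance by statistics of the model's behaviour on it across epochs and then feeding examples to a curriculum scheduler from easy to hard, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the metric names (confidence, variability) follow the dataset-cartography convention the abstract alludes to, and all function names and the toy data are hypothetical.

```python
# Minimal sketch of training-dynamics difficulty metrics for curriculum
# learning. Assumes you have logged, for every epoch, the probability the
# model assigned to each example's gold label.
import numpy as np

def training_dynamics(gold_label_probs: np.ndarray):
    """gold_label_probs: shape (num_epochs, num_examples).

    Returns per-example confidence (mean gold-label probability across
    epochs; high = easy) and variability (std across epochs; high =
    ambiguous for the model)."""
    confidence = gold_label_probs.mean(axis=0)
    variability = gold_label_probs.std(axis=0)
    return confidence, variability

def curriculum_order(gold_label_probs: np.ndarray) -> np.ndarray:
    """Example indices sorted easy -> hard (descending confidence),
    ready to hand to a CL scheduler."""
    confidence, _ = training_dynamics(gold_label_probs)
    return np.argsort(-confidence)

# Toy usage: gold-label probabilities for 5 examples over 3 epochs.
probs = np.array([[0.90, 0.20, 0.60, 0.80, 0.40],
                  [0.95, 0.30, 0.50, 0.85, 0.50],
                  [0.97, 0.25, 0.70, 0.90, 0.45]])
print(curriculum_order(probs))  # -> [0 3 2 4 1], easiest first
```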

Bibliographic details
Main authors: Christopoulou, Fenia; Lampouras, Gerasimos; Iacobacci, Ignacio
Format: Article
Language: English
Subjects: Computer Science - Computation and Language
Online access: Order full text
DOI: 10.48550/arxiv.2210.12499
Source: arXiv.org