Training Dynamics for Curriculum Learning: A Study on Monolingual and Cross-lingual NLU
Saved in:
Main author:
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: Curriculum Learning (CL) is a technique for training models by ranking examples in a typically increasing order of difficulty, with the aim of accelerating convergence and improving generalisability. Current approaches to Natural Language Understanding (NLU) tasks use CL to improve in-distribution performance, often via heuristic-oriented or task-agnostic difficulty measures. In this work, we instead employ CL for NLU by using training dynamics as difficulty metrics, i.e., statistics that measure the behaviour of the model at hand on specific task-data instances during training, and we propose modifications of existing CL schedulers based on these statistics. Unlike existing work, we evaluate models on in-distribution (ID), out-of-distribution (OOD), and zero-shot (ZS) cross-lingual transfer datasets. Across several NLU tasks, we show that CL with training dynamics can yield better performance, mostly in zero-shot cross-lingual transfer and OOD settings, with improvements of up to 8.5% in certain cases. Overall, the experiments indicate that training dynamics can lead to better-performing models with smoother training than other difficulty metrics, while being 20% faster on average. In addition, our analysis sheds light on the correlations between task-specific and task-agnostic metrics.
DOI: | 10.48550/arxiv.2210.12499 |
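The abstract describes using training dynamics, i.e., statistics of the model's behaviour on individual examples across training, as difficulty metrics for a curriculum. The sketch below is illustrative only and not the paper's implementation: it assumes dataset-cartography-style statistics (mean gold-label confidence and its variability across epochs) and orders examples easy-to-hard by descending confidence. The function names and the input layout are assumptions for the example.

```python
import numpy as np


def training_dynamics_difficulty(probs_per_epoch):
    """Compute per-example difficulty statistics from training dynamics.

    probs_per_epoch: array-like of shape (n_epochs, n_examples) holding
    the model's probability for the gold label of each example, recorded
    once per epoch during training.

    Returns (confidence, variability): low confidence and high
    variability are commonly taken to mark harder examples.
    """
    probs = np.asarray(probs_per_epoch, dtype=float)
    confidence = probs.mean(axis=0)   # mean gold-label probability per example
    variability = probs.std(axis=0)   # spread of that probability across epochs
    return confidence, variability


def curriculum_order(probs_per_epoch):
    """Rank example indices easy-to-hard by descending mean confidence."""
    confidence, _ = training_dynamics_difficulty(probs_per_epoch)
    return np.argsort(-confidence)


# Toy usage: two epochs of gold-label probabilities for three examples.
probs = [[0.9, 0.3, 0.6],
         [0.95, 0.4, 0.5]]
order = curriculum_order(probs)  # example 0 is easiest, example 1 hardest
```

A CL scheduler would then feed batches to the model following `order`, e.g. starting with a high-confidence subset and gradually admitting lower-confidence examples; the specific pacing function is a separate design choice.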