Towards Robustness and Diversity: Continual Learning in Dialog Generation with Text-Mixup and Batch Nuclear-Norm Maximization
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: In our dynamic world where data arrives in a continuous stream, continual learning enables us to incrementally add new tasks/domains without the need to retrain from scratch. A major challenge in continual learning of language models is catastrophic forgetting, the tendency of models to forget knowledge from previously trained tasks/domains when training on new ones. This paper studies dialog generation under the continual learning setting. We propose a novel method that 1) uses *Text-Mixup* as data augmentation to keep the model from overfitting on the replay memory and 2) leverages Batch Nuclear-Norm Maximization (BNNM) to alleviate the problem of mode collapse. Experiments on a 37-domain task-oriented dialog dataset and DailyDialog (a 10-domain chitchat dataset) demonstrate that our proposed approach outperforms the state of the art in continual learning.
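The abstract names two components: mixup-style interpolation of text examples drawn from the replay memory, and a batch nuclear-norm term that rewards diverse, confident predictions. The following is a minimal PyTorch sketch of these two ideas in their generic form, not the paper's actual implementation; the embedding-level interpolation, the function names, and the weighting coefficient are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def text_mixup(emb_a, emb_b, alpha=0.2):
    """Sketch of Text-Mixup: interpolate two padded sequences of token embeddings.

    emb_a, emb_b: tensors of shape (batch, seq_len, hidden).
    Returns the mixed embeddings and the mixing coefficient lam, which is
    typically also used to interpolate the two examples' training losses.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    mixed = lam * emb_a + (1.0 - lam) * emb_b
    return mixed, lam

def batch_nuclear_norm(probs):
    """Nuclear norm (sum of singular values) of a batch prediction matrix.

    probs: (batch, num_classes) softmax outputs. Maximizing this norm
    encourages predictions that are both confident and diverse across the
    batch, which is the intuition behind using BNNM against mode collapse.
    """
    return torch.linalg.matrix_norm(probs, ord='nuc')

def total_loss(ce_loss, probs, bnnm_weight=0.1):
    # Hypothetical combination: subtract the scaled nuclear norm so that
    # minimizing the total loss maximizes batch diversity.
    return ce_loss - bnnm_weight * batch_nuclear_norm(probs)
```

In this sketch the mixed embeddings would be fed to the dialog generator in place of the original inputs, and the loss for the mixed example is the lam-weighted sum of the losses against the two original targets; how the paper applies these pieces to the generation model is described in the full text.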
DOI: 10.48550/arxiv.2403.10894