Recent Advances of Multimodal Continual Learning: A Comprehensive Survey
Format: Article
Language: English
Online access: Order full text
Abstract: Continual learning (CL) aims to empower machine learning models to learn continually from new data, building upon previously acquired knowledge without forgetting. As machine learning models have evolved from small to large pre-trained architectures, and from supporting unimodal to multimodal data, multimodal continual learning (MMCL) methods have recently emerged. The primary challenge of MMCL is that it goes beyond a simple stacking of unimodal CL methods: such straightforward approaches often yield unsatisfactory performance. In this work, we present the first comprehensive survey of MMCL. We provide essential background knowledge and MMCL settings, as well as a structured taxonomy of MMCL methods. We categorize existing MMCL methods into four categories, i.e., regularization-based, architecture-based, replay-based, and prompt-based methods, explaining their methodologies and highlighting their key innovations. Additionally, to prompt further research in this field, we summarize open MMCL datasets and benchmarks, and discuss several promising future directions for investigation and development. We have also created a GitHub repository indexing relevant MMCL papers and open resources, available at https://github.com/LucyDYu/Awesome-Multimodal-Continual-Learning.
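Of the four families named in the abstract, regularization-based methods are the most compact to illustrate: they add a penalty that discourages parameters important to earlier tasks from drifting when the model trains on new data. Below is a minimal sketch of one classic unimodal instance, an EWC-style quadratic penalty (Kirkpatrick et al., 2017), in PyTorch; the names fisher, old_params, and lam are illustrative assumptions, not APIs from the survey or its repository.

```python
import torch

def ewc_penalty(model, fisher, old_params, lam=0.4):
    # EWC-style regularizer: weight each parameter's squared drift from its
    # value after the previous task (old_params) by its estimated importance
    # (fisher, a diagonal Fisher information approximation). All three
    # names are hypothetical placeholders for this sketch.
    penalty = torch.zeros(())
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# Usage sketch: add the penalty to the current task's loss during training.
# loss = task_loss(model(x), y) + ewc_penalty(model, fisher, old_params)
```

The survey's point is that such unimodal penalties alone transfer poorly to MMCL, where importance must be estimated across interacting modalities rather than per-parameter in isolation.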
DOI: 10.48550/arxiv.2410.05352