Essentials for Class Incremental Learning
Saved in:
Main authors: | |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Contemporary neural networks are limited in their ability to learn from
evolving streams of training data. When trained sequentially on new or evolving
tasks, their accuracy drops sharply, making them unsuitable for many real-world
applications. In this work, we shed light on the causes of this well-known yet
unsolved phenomenon - often referred to as catastrophic forgetting - in a
class-incremental setup. We show that a combination of simple components and a
loss that balances intra-task and inter-task learning can already resolve
forgetting to the same extent as more complex measures proposed in the literature.
Moreover, we identify poor quality of the learned representation as another
reason for catastrophic forgetting in class-IL. We show that performance is
correlated with secondary class information (dark knowledge) learned by the
model and that it can be improved by an appropriate regularizer. With these lessons
learned, class-incremental learning results on CIFAR-100 and ImageNet improve
over the state-of-the-art by a large margin, while keeping the approach simple. |
---|---|
DOI: | 10.48550/arxiv.2102.09517 |
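
The abstract refers to a loss that balances intra-task and inter-task learning and to a regularizer that preserves secondary class information (dark knowledge). The paper itself is not reproduced here, so the following is only a minimal illustrative sketch of that general idea in PyTorch: a standard cross-entropy term on the current task combined with a distillation term against a frozen copy of the previous model. The class name `ClassILLoss` and the `alpha` and `temperature` parameters are assumptions for illustration, not the authors' exact formulation.

```python
# Illustrative sketch only, not the paper's exact loss:
# combine an intra-task cross-entropy term with an inter-task
# distillation term that preserves the previous model's soft
# predictions ("dark knowledge") on old classes.
import torch
import torch.nn.functional as F


class ClassILLoss(torch.nn.Module):
    def __init__(self, num_old_classes: int, alpha: float = 0.5,
                 temperature: float = 2.0):
        super().__init__()
        self.num_old = num_old_classes  # classes seen in previous tasks
        self.alpha = alpha              # balance between the two terms (assumed)
        self.T = temperature            # softening temperature (assumed)

    def forward(self, logits: torch.Tensor, labels: torch.Tensor,
                old_logits: torch.Tensor) -> torch.Tensor:
        # Intra-task term: cross-entropy over the current model's outputs.
        ce = F.cross_entropy(logits, labels)

        # Inter-task term: KL divergence between softened outputs of the
        # current model and a frozen previous model, on old classes only.
        log_p_new = F.log_softmax(logits[:, : self.num_old] / self.T, dim=1)
        p_old = F.softmax(old_logits[:, : self.num_old] / self.T, dim=1)
        kd = F.kl_div(log_p_new, p_old, reduction="batchmean") * self.T ** 2

        return (1 - self.alpha) * ce + self.alpha * kd
```

In use, `old_logits` would come from a frozen snapshot of the model taken before training on the new task; the distillation term then acts as the kind of regularizer the abstract describes, keeping the secondary class information of earlier tasks intact while the cross-entropy term learns the new classes.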