A Unified Framework for Continual Learning and Unlearning
Saved in:
Main authors:
Format: Article
Language: English
Keywords:
Online access: Order full text
Abstract: Continual learning and machine unlearning are crucial challenges in machine learning, typically addressed separately. Continual learning focuses on adapting to new knowledge while preserving past information, whereas unlearning involves selectively forgetting specific subsets of data. In this paper, we introduce a new framework that jointly tackles both tasks by leveraging controlled knowledge distillation. Our approach enables efficient learning with minimal forgetting and effective targeted unlearning. By incorporating a fixed memory buffer, the system supports learning new concepts while retaining prior knowledge. The distillation process is carefully managed to ensure a balance between acquiring new information and forgetting specific data as needed. Experimental results on benchmark datasets show that our method matches or exceeds the performance of existing approaches in both continual learning and machine unlearning. This unified framework is the first to address both challenges simultaneously, paving the way for adaptable models capable of dynamic learning and forgetting while maintaining strong overall performance.

Source code: https://respailab.github.io/CLMUL
DOI: 10.48550/arxiv.2408.11374
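
The abstract describes the method only at a high level: distillation-based retention over a fixed memory buffer, combined with targeted unlearning. The sketch below is an illustrative reconstruction of that idea, not the authors' code. The function names `unified_step` and `distillation_loss`, the uniform-target forgetting surrogate, and the weights `lam_retain`/`lam_forget` are all assumptions made for exposition; consult the linked repository for the actual objective.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-target distillation: KL divergence between teacher and
    student class distributions, softened by a temperature."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_soft_student, soft_teacher,
                    reduction="batchmean") * (temperature ** 2)

def unified_step(model, teacher, new_batch, buffer_batch, forget_batch,
                 lam_retain=1.0, lam_forget=1.0):
    """One combined update (hypothetical): learn new data, retain old
    knowledge via distillation on replay-buffer samples, and unlearn a
    target subset by pushing its predictions toward uniform (a common
    unlearning surrogate; the paper's actual objective may differ)."""
    x_new, y_new = new_batch
    loss = F.cross_entropy(model(x_new), y_new)      # acquire new concepts

    if buffer_batch is not None:                     # retention term
        x_buf, _ = buffer_batch
        with torch.no_grad():
            teacher_logits = teacher(x_buf)          # frozen copy of the past model
        loss = loss + lam_retain * distillation_loss(model(x_buf),
                                                     teacher_logits)

    if forget_batch is not None:                     # targeted forgetting
        logits = model(forget_batch)
        uniform = torch.full_like(logits, 1.0 / logits.size(-1))
        loss = loss + lam_forget * F.kl_div(
            F.log_softmax(logits, dim=-1), uniform, reduction="batchmean")
    return loss
```

Distilling from a frozen copy of the previous model on replay samples is a standard way to control forgetting in continual learning; driving forget-set outputs toward the uniform distribution is one of several common unlearning surrogates, chosen here purely for concreteness.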