Lifelong Learning and Selective Forgetting via Contrastive Strategy
| Field | Value |
|---|---|
| Format | Article |
| Language | English |
| Online access | Order full text |
Abstract: Lifelong learning aims to train a model that performs well on new tasks
while retaining its capacity on previous tasks. However, some practical
scenarios require the system to forget undesirable knowledge due to privacy
issues, which is called selective forgetting. The joint task of the two is
dubbed Learning with Selective Forgetting (LSF). In this paper, we propose a
new framework based on a contrastive strategy for LSF. Specifically, for the
preserved classes (tasks), we make the features extracted from different samples
within the same class compact. For the deleted classes, we make the
features from different samples of the same class dispersed and irregular, i.e.,
the network produces no regular response to samples from a specific
deleted class, as if it had never been trained on them. By maintaining or
disturbing the feature distribution in this way, the forgetting and memory of
different classes can be made independent of each other. Experiments are
conducted on four benchmark datasets, and our method achieves new
state-of-the-art results.
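The abstract only outlines the two opposing contrastive terms, so a minimal PyTorch sketch of the idea may help. This is an illustration under our own assumptions, not the paper's implementation: the function names `preserve_loss` and `forget_loss`, the centroid-based compactness term, and the pairwise-distance dispersion term are all hypothetical choices standing in for whatever contrastive objective the authors actually use.

```python
import torch


def preserve_loss(features: torch.Tensor) -> torch.Tensor:
    """Compact the features of samples from one *preserved* class.

    `features` has shape (N, D), all rows from the same preserved class.
    Minimizing the mean squared distance to the class centroid pulls the
    intra-class distribution together.
    """
    center = features.mean(dim=0, keepdim=True)          # class centroid, (1, D)
    return ((features - center) ** 2).sum(dim=1).mean()  # mean squared distance


def forget_loss(features: torch.Tensor) -> torch.Tensor:
    """Disperse the features of samples from one *deleted* class.

    Maximizing the mean pairwise distance (i.e., minimizing its negative)
    leaves the network with no regular response to that class, assuming
    the batch contains at least two samples (N >= 2).
    """
    dists = torch.cdist(features, features, p=2)  # (N, N) pairwise distances
    n = features.size(0)
    mean_pairwise = dists.sum() / (n * (n - 1))   # average over off-diagonal pairs
    return -mean_pairwise                          # minimize to spread the class
```

In a training loop, one would presumably add `preserve_loss` on mini-batches drawn from preserved classes and `forget_loss` on mini-batches drawn from deleted classes alongside the usual task loss, so that memory and forgetting act on disjoint sets of classes.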
DOI: 10.48550/arxiv.2405.18663