Can Bad Teaching Induce Forgetting? Unlearning in Deep Networks using an Incompetent Teacher
Saved in:
Main authors: | , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Abstract: | Machine unlearning has become an important area of research due to an
increasing need for machine learning (ML) applications to comply with
emerging data privacy regulations. It enables the removal of a certain set
or class of data from an already trained ML model without retraining from
scratch. Recently, several efforts have been made to make unlearning
effective and efficient. We propose a novel machine unlearning method by
exploring the utility of competent and incompetent teachers in a
student-teacher framework to induce forgetfulness. The knowledge from the
competent and incompetent teachers is selectively transferred to the student to
obtain a model that does not contain any information about the forget data. We
experimentally show that this method generalizes well and is fast and effective.
Furthermore, we introduce the zero retrain forgetting (ZRF) metric to evaluate
any unlearning method. Unlike existing unlearning metrics, the ZRF score
does not depend on the availability of an expensive retrained model, which
also makes it useful for analyzing the unlearned model after deployment.
We present results of experiments on random subset forgetting and
class forgetting, across various deep networks and different application
domains. Source code:
https://github.com/vikram2000b/bad-teaching-unlearning |
DOI: | 10.48550/arxiv.2205.08096 |
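The selective teacher-student transfer and the ZRF metric described in the abstract can be sketched in plain NumPy. This is a minimal illustration, not the paper's exact formulation: the KL-based distillation loss, the use of a randomly initialized network as the incompetent teacher, and the JS-divergence form of the ZRF score are assumptions made for the sketch; all function names are hypothetical.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) computed row-wise for probability vectors.
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def bad_teacher_loss(student_logits, good_logits, bad_logits, forget_mask):
    # Selective distillation: the student is pulled toward the competent
    # teacher on retain samples and toward the incompetent teacher
    # (e.g. a randomly initialized network) on forget samples.
    s = softmax(student_logits)
    good = softmax(good_logits)
    bad = softmax(bad_logits)
    per_sample = np.where(forget_mask, kl_div(bad, s), kl_div(good, s))
    return per_sample.mean()

def zrf_score(unlearned_logits, incompetent_logits, eps=1e-12):
    # Sketch of a zero-retrain forgetting score: 1 minus the mean
    # Jensen-Shannon divergence between the unlearned model and the
    # incompetent teacher on the forget set, so a score near 1.0 means
    # the unlearned model behaves like a model that never saw that data.
    p = softmax(unlearned_logits)
    q = softmax(incompetent_logits)
    m = 0.5 * (p + q)
    js = 0.5 * kl_div(p, m, eps) + 0.5 * kl_div(q, m, eps)
    return 1.0 - js.mean()
```

Note that neither function requires a retrained-from-scratch model: the loss only needs the two teachers' logits, and the score only compares the unlearned model against the incompetent teacher, which matches the abstract's claim that ZRF avoids the expensive retrained baseline.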