Federated Unlearning with Gradient Descent and Conflict Mitigation
Saved in:

| Field | Value |
|---|---|
| Main authors | , , , , , , |
| Format | Article |
| Language | English |
| Subjects | |
| Online access | Order full text |
Abstract: Federated Learning (FL) has received much attention in recent years. However, although clients are not required to share their data in FL, the global model itself can implicitly memorize clients' local data. It is therefore necessary to effectively remove the target client's data from the FL global model to reduce the risk of privacy leakage and implement "the right to be forgotten". Federated Unlearning (FU) has been considered a promising way to remove data without full retraining, but model utility easily suffers a significant reduction during unlearning due to gradient conflicts. Furthermore, when post-training is conducted to recover model utility, the model is prone to move back and revert what has already been unlearned. To address these issues, we propose Federated Unlearning with Orthogonal Steepest Descent (FedOSD). We first design an unlearning Cross-Entropy loss to overcome the convergence issue of gradient ascent. A steepest descent direction for unlearning is then calculated under the constraint of being non-conflicting with the other clients' gradients and closest to the target client's gradient, which enables efficient unlearning while mitigating the reduction in model utility. After unlearning, we recover model utility while maintaining the achievement of unlearning. Finally, extensive experiments in several FL scenarios verify that FedOSD outperforms SOTA FU algorithms in terms of both unlearning effectiveness and model utility.
DOI: 10.48550/arxiv.2412.20200
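
The abstract attributes the convergence issue to plain gradient ascent: cross-entropy is unbounded above, so maximizing it on the target client's data never converges. The paper's unlearning Cross-Entropy loss is not specified here; as an illustrative assumption, the sketch below uses a common bounded surrogate from the unlearning literature, cross-entropy toward the uniform distribution, which has a finite minimum of log K reached once predictions carry no information about the forgotten labels, so ordinary gradient descent converges.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def unlearn_loss(logits):
    """Cross-entropy toward the uniform distribution over K classes
    (an assumed stand-in for the paper's unlearning CE loss). It is
    bounded below by log K, reached when predictions are uniform, so
    gradient descent converges, unlike unbounded gradient ascent on CE."""
    p = softmax(logits)
    return (-np.log(p + 1e-12)).mean(axis=-1).mean()

# Toy check: descending the bounded loss drives a confident prediction
# on to-be-forgotten data toward the uninformative uniform prediction.
logits = np.array([[4.0, 0.0, 0.0]])
for _ in range(200):
    grad = softmax(logits) - 1.0 / logits.shape[-1]  # d(loss)/d(logits) for batch size 1
    logits -= 0.5 * grad
print(softmax(logits).round(3))          # ~[[0.333, 0.333, 0.333]]
print(unlearn_loss(logits), np.log(3))   # loss converges to its floor, log K
```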
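The constrained steepest-descent step can be approximated by projecting the target client's unlearning gradient away from any retained client's gradient it conflicts with, in the spirit of PCGrad (Yu et al., 2020). The sketch below is a minimal stand-in, not FedOSD's exact solver: finding the feasible direction truly closest to the target gradient is a small quadratic program, whereas cyclic half-space projection merely returns a nearby feasible point.

```python
import numpy as np

def nonconflicting_direction(g_u, retained_grads, max_passes=20, eps=1e-12):
    """Cyclic projection of the unlearning gradient g_u onto the half-spaces
    <d, g_i> >= 0 for each retained client's gradient g_i. The global update
    theta <- theta - lr * d then does not, to first order, increase any
    retained client's loss, which mitigates gradient conflicts."""
    d = np.asarray(g_u, dtype=float).copy()
    for _ in range(max_passes):
        clean = True
        for g_i in retained_grads:
            dot = d @ g_i
            if dot < 0:  # conflict: this step would raise client i's loss
                d -= (dot / (g_i @ g_i + eps)) * g_i  # project onto boundary
                clean = False
        if clean:  # all constraints satisfied
            break
    return d

# Toy server-side unlearning step on a flattened global model.
rng = np.random.default_rng(0)
g_u = rng.normal(size=8)                    # target client's unlearning gradient
retained = [rng.normal(size=8) for _ in range(4)]
d = nonconflicting_direction(g_u, retained)
print(min(float(d @ g) for g in retained))  # ~>= 0: non-conflicting with retained clients
theta = rng.normal(size=8)
theta -= 0.1 * d                            # one unlearning update on the global model
```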