Forget-SVGD: Particle-Based Bayesian Federated Unlearning
Saved in:

Main author(s): | , , , |
---|---|
Format: | Article |
Language: | English |
Subject terms: | |
Online access: | Order full text |
Abstract: | Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias affecting more conventional parametric techniques. This paper proposes to leverage the flexibility of non-parametric Bayesian approximate inference to develop a novel Bayesian federated unlearning method, referred to as Forget-Stein Variational Gradient Descent (Forget-SVGD). Forget-SVGD builds on SVGD, a particle-based approximate Bayesian inference scheme using gradient-based deterministic updates, and on its distributed (federated) extension known as Distributed SVGD (DSVGD). Upon the completion of federated learning, as one or more participating agents request that their data be "forgotten", Forget-SVGD carries out local SVGD updates at the agents whose data need to be "unlearned", interleaved with communication rounds with a parameter server. The proposed method is validated via performance comparisons with non-parametric schemes that train from scratch while excluding the data to be forgotten, as well as with existing parametric Bayesian unlearning methods. |
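A minimal sketch of the SVGD update that Forget-SVGD builds on may help make the abstract concrete. The snippet below is a generic single-machine SVGD step with an RBF kernel applied to a toy standard-normal target; the bandwidth `h`, step size `eps`, and particle count are illustrative assumptions, and this is the underlying inference building block, not the paper's federated unlearning algorithm.

```python
import numpy as np

def svgd_step(x, grad_log_p, h=1.0, eps=0.1):
    """One SVGD update: kernel-smoothed score plus a repulsive term.

    x: (n, d) array of particles; grad_log_p: callable returning (n, d) scores.
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]          # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)   # RBF kernel matrix, shape (n, n)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) = (2/h) sum_j K[i, j] (x_i - x_j)
    repulsion = (2.0 / h) * np.einsum("ij,ijd->id", K, diff)
    phi = (K @ grad_log_p(x) + repulsion) / n     # Stein variational direction
    return x + eps * phi

# Toy target: standard normal, so grad log p(x) = -x.
grad_log_p = lambda x: -x

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=(50, 1))  # particles start far from the target
for _ in range(500):
    x = svgd_step(x, grad_log_p)
```

Over the iterations the particle cloud drifts toward the target mean while the repulsive term keeps the particles from collapsing; DSVGD and Forget-SVGD apply deterministic updates of this kind locally at the agents, interleaved with communication rounds with the parameter server.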
DOI: | 10.48550/arxiv.2111.12056 |