Active Federated Learning
Main authors: | , , , , , |
---|---|
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Order full text |
Abstract: Federated Learning allows population-level models to be trained without centralizing client data by transmitting the global model to clients, computing gradients locally, and then averaging the gradients. Downloading models and uploading gradients uses the client's bandwidth, so minimizing these transmission costs is important. The data on each client is highly variable, so the benefit of training on different clients may differ dramatically. To exploit this, we propose Active Federated Learning, where in each round clients are selected not uniformly at random, but with a probability conditioned on the current model and the data on the client, in order to maximize efficiency. We propose a cheap, simple, and intuitive sampling scheme which reduces the number of required training iterations by 20-70% while maintaining the same model accuracy, and which mimics well-known resampling techniques under certain conditions.
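The abstract describes selecting clients each round with a probability that depends on the current model and the client's data, rather than uniformly at random. The record does not give the paper's exact valuation function, so the sketch below only illustrates the general idea under an assumption: a client's value is taken to be its local loss under the current global model, and higher-loss clients are sampled more often via a softmax. The function name `select_clients` and the `temperature` parameter are illustrative, not from the paper.

```python
import numpy as np

def select_clients(client_losses, num_select, temperature=1.0, rng=None):
    """Sample clients with probability increasing in their current loss.

    Illustrative value-based sampler, not the paper's exact scheme:
    client_losses[i] is assumed to be client i's loss on its own data
    under the current global model, reported in the previous round.
    """
    rng = np.random.default_rng() if rng is None else rng
    losses = np.asarray(client_losses, dtype=float)
    # Softmax over scaled losses: high-loss clients are sampled more often.
    logits = losses / temperature
    logits -= logits.max()          # subtract max for numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    # Draw a round's cohort without replacement according to these weights.
    return rng.choice(len(losses), size=num_select, replace=False, p=probs)

# Example: 100 clients, select 10 per round, favouring high-loss clients.
losses = np.random.rand(100)
round_clients = select_clients(losses, num_select=10)
```

A uniform sampler corresponds to the limit of a very large `temperature`; lowering it concentrates selection on the clients whose data the current model fits worst.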
DOI: 10.48550/arxiv.1909.12641