Think Locally, Act Globally: Federated Learning with Local and Global Representations
Format: Article
Language: English
Abstract: Federated learning is a method of training models on private data distributed over multiple devices. To keep device data private, the global model is trained by communicating only parameters and updates, which poses scalability challenges for large models. To this end, we propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices. As a result, the global model can be smaller, since it operates only on local representations, reducing the number of communicated parameters. Theoretically, we provide a generalization analysis showing that a combination of local and global models reduces both variance in the data and variance across device distributions. Empirically, we demonstrate that local models enable communication-efficient training while retaining performance. We also evaluate on the task of personalized mood prediction from real-world mobile data, where privacy is key. Finally, local models handle heterogeneous data from new devices and learn fair representations that obfuscate protected attributes such as race, age, and gender.
DOI: 10.48550/arxiv.2001.01523
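The core idea in the abstract, that each device keeps a private local encoder while only a small global model operating on the compact representations is communicated, can be illustrated with a minimal sketch. This is not the paper's implementation; the dimensions, the linear encoder/head, and the FedAvg-style averaging of the global head are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: raw inputs are high-dimensional, but each
# device's private encoder compresses them to a small representation
# that the shared global model operates on.
INPUT_DIM, REPR_DIM, NUM_DEVICES = 1000, 16, 4

# Each device keeps a private local encoder (never communicated).
local_encoders = [rng.normal(size=(INPUT_DIM, REPR_DIM))
                  for _ in range(NUM_DEVICES)]

# The global head is small because it only sees REPR_DIM features.
global_head = rng.normal(size=(REPR_DIM, 1))

def local_update(encoder, head, x, y, lr=0.01):
    """One gradient step on a linear model y_hat = (x @ encoder) @ head.

    Returns updated copies of the private encoder and the shared head."""
    z = x @ encoder                       # compact local representation
    err = z @ head - y                    # prediction error
    grad_head = z.T @ err / len(x)
    grad_enc = x.T @ (err @ head.T) / len(x)
    return encoder - lr * grad_enc, head - lr * grad_head

# One simulated federated round: every device trains on its own data,
# but only the small global head is sent back and averaged.
updated_heads = []
for i in range(NUM_DEVICES):
    x = rng.normal(size=(32, INPUT_DIM))  # stand-in for private data
    y = rng.normal(size=(32, 1))
    local_encoders[i], new_head = local_update(local_encoders[i],
                                               global_head, x, y)
    updated_heads.append(new_head)

global_head = np.mean(updated_heads, axis=0)

# Communication per device per round: REPR_DIM parameters (16 here),
# versus INPUT_DIM (1000) if the shared model consumed raw inputs.
print(global_head.shape)  # (16, 1)
```

In this sketch the raw data and the local encoders never leave the device, which is what lets the communicated global model stay small.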