Federated Learning over Connected Modes
Format: Article
Language: English
Abstract: Statistical heterogeneity in federated learning poses two major challenges: slow global training due to conflicting gradient signals, and the need for personalization to local distributions. In this work, we tackle both challenges by leveraging recent advances in \emph{linear mode connectivity} -- identifying a linearly connected low-loss region in the parameter space of neural networks, which we call the solution simplex. We propose federated learning over connected modes (\textsc{Floco}), where clients are assigned local subregions in this simplex based on their gradient signals and together learn the shared global solution simplex. This allows the client models to be personalized to their local distributions within the degrees of freedom of the solution simplex, and it homogenizes the update signals for the global simplex training. Our experiments show that \textsc{Floco} accelerates the global training process and significantly improves local accuracy with minimal computational overhead in cross-silo federated learning settings.
DOI: 10.48550/arxiv.2403.03333
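The abstract describes clients training within assigned subregions of a shared solution simplex. As a rough illustration only (not the authors' released code), the sketch below shows one way a layer's weights could be parameterized as a convex combination of simplex endpoints, with each client sampling mixing coefficients near its assigned subregion center; the names `SimplexLinear` and `sample_alpha`, and the Dirichlet-jitter sampling, are assumptions made for this sketch.

```python
# Minimal sketch of a simplex-parameterized layer, assuming a PyTorch setup.
# Names and the sampling scheme are illustrative, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimplexLinear(nn.Module):
    """Linear layer whose effective weights are a convex combination of K endpoint weight sets."""

    def __init__(self, in_features: int, out_features: int, num_endpoints: int = 3):
        super().__init__()
        # One weight/bias set per simplex endpoint; all endpoints are trained jointly.
        self.weights = nn.ParameterList(
            [nn.Parameter(torch.empty(out_features, in_features)) for _ in range(num_endpoints)]
        )
        self.biases = nn.ParameterList(
            [nn.Parameter(torch.zeros(out_features)) for _ in range(num_endpoints)]
        )
        for w in self.weights:
            nn.init.kaiming_uniform_(w, a=5 ** 0.5)

    def forward(self, x: torch.Tensor, alpha: torch.Tensor) -> torch.Tensor:
        # alpha holds non-negative mixing coefficients summing to 1 (a point in the simplex).
        weight = sum(a * w for a, w in zip(alpha, self.weights))
        bias = sum(a * b for a, b in zip(alpha, self.biases))
        return F.linear(x, weight, bias)


def sample_alpha(center: torch.Tensor, radius: float) -> torch.Tensor:
    """Draw mixing coefficients near a client's subregion center (illustrative Dirichlet jitter)."""
    noise = torch.distributions.Dirichlet(torch.ones_like(center)).sample()
    alpha = (1.0 - radius) * center + radius * noise
    return alpha / alpha.sum()


# Example: a client assigned the subregion around (0.6, 0.2, 0.2) trains with
# coefficients sampled near that center, personalizing within the shared simplex.
layer = SimplexLinear(16, 4, num_endpoints=3)
alpha = sample_alpha(torch.tensor([0.6, 0.2, 0.2]), radius=0.1)
out = layer(torch.randn(8, 16), alpha)
```

In a full federated setup, the server would aggregate the endpoint weight sets across clients (FedAvg-style), while each client optimizes only within its own subregion of the simplex; this aggregation step is not shown here.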