Specialized federated learning using a mixture of experts
Saved in:
Main authors: | , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | In federated learning, clients share a global model that has been trained on decentralized local client data. Although federated learning shows significant promise as a key approach when data cannot be shared or centralized, current methods show limited privacy properties and have shortcomings when applied to common real-world scenarios, especially when client data is heterogeneous. In this paper, we propose an alternative method to learn a personalized model for each client in a federated setting, with greater generalization abilities than previous methods. To achieve this personalization we propose a federated learning framework using a mixture of experts to combine the specialist nature of a locally trained model with the generalist knowledge of a global model. We evaluate our method on a variety of datasets with different levels of data heterogeneity, and our results show that the mixture of experts model is better suited as a personalized model for devices in these settings, outperforming both fine-tuned global models and local specialists. |
DOI: | 10.48550/arxiv.2010.02056 |
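
The summary describes combining a locally trained specialist with the shared global model through a mixture of experts, with a personalized model learned per client. The Python sketch below shows one way such a per-client mixture could be structured; the gating architecture, the choice to freeze the global expert during local training, and all hyperparameters are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn


class ClientMixtureOfExperts(nn.Module):
    """Blend a client's local specialist with the shared global model (sketch)."""

    def __init__(self, global_expert: nn.Module, local_expert: nn.Module, input_dim: int):
        super().__init__()
        self.global_expert = global_expert   # model received from the federated server
        self.local_expert = local_expert     # model trained only on this client's data
        # Hypothetical gating network: one per-example mixing weight in [0, 1].
        self.gate = nn.Sequential(nn.Linear(input_dim, 1), nn.Sigmoid())

    def forward(self, x):
        g = self.gate(x)
        # Weighted combination of the two experts' outputs.
        return g * self.local_expert(x) + (1.0 - g) * self.global_expert(x)


def personalize(moe: ClientMixtureOfExperts, loader, epochs: int = 1):
    """Client-side training; keeping the global expert frozen is an assumption here."""
    for p in moe.global_expert.parameters():
        p.requires_grad_(False)
    optimizer = torch.optim.SGD(
        list(moe.local_expert.parameters()) + list(moe.gate.parameters()), lr=0.01
    )
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(moe(x), y)   # mixed logits scored against the client's labels
            loss.backward()
            optimizer.step()
    return moe
```

In this reading, the gate lets each client fall back on the generalist global model for inputs its local specialist handles poorly, which is one plausible mechanism for the reported gains under heterogeneous client data.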