Client-Side Optimization Strategies for Communication-Efficient Federated Learning
Published in: | IEEE communications magazine 2022-07, Vol.60 (7), p.60-66 |
---|---|
Main authors: | , , |
Format: | Magazine article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | Federated learning (FL) is a swiftly evolving field within machine learning for collaboratively training models at the network edge in a privacy-preserving fashion, without training data leaving the devices where it was generated. The privacy-preserving nature of FL shows great promise for applications with sensitive data such as healthcare, finance, and social media. However, there are barriers to real-world FL at the wireless network edge, stemming from massive wireless parallelism and the high communication costs of model transmission. The communication cost of FL is heavily impacted by the heterogeneous distribution of data across clients, and some cutting-edge works attempt to address this problem using novel client-side optimization strategies. In this article, we provide a tutorial on model training in FL, and survey the recent developments in client-side optimization and how they relate to the communication properties of FL. We then perform a set of comparison experiments on a representative subset of these strategies, gaining insights into their communication-convergence trade-offs. Finally, we highlight challenges to client-side optimization and provide suggestions for future developments in FL at the wireless edge. |
---|---|
ISSN: | 0163-6804 1558-1896 |
DOI: | 10.1109/MCOM.005.210108 |