Federated Learning Under Intermittent Client Availability and Time-Varying Communication Constraints


Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE journal of selected topics in signal processing 2023-01, Vol.17 (1), p.1-17
Main authors: Ribero, Monica, Vikalo, Haris, Veciana, Gustavo de
Format: Article
Language: English
Subjects:
Online access: Order full text
Description
Abstract: Federated learning systems facilitate the training of global models across large numbers of distributed edge devices with potentially heterogeneous data. Such systems operate in resource-constrained settings with intermittent client availability and/or time-varying communication constraints. As a result, the global models trained by federated learning systems may be biased towards clients with higher availability. We propose Federated Averaging Aided by an Adaptive Sampling Technique (F3AST), an unbiased algorithm that dynamically learns an availability-dependent client selection strategy which asymptotically minimizes the impact of client-sampling variance on the global model's convergence, enhancing the performance of federated learning. The proposed algorithm is tested in a variety of settings for intermittently available clients operating under communication constraints, and its efficacy is demonstrated on synthetic data and realistic federated benchmarking experiments using the CIFAR100 and Shakespeare datasets. We report up to 186% and 8% accuracy improvements over FedAvg, and 8% and 7% over FedAdam, on CIFAR100 and Shakespeare, respectively.
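The abstract's core claim is that client selection can be made unbiased despite unequal availability. A minimal sketch of that idea (not the paper's F3AST algorithm itself, whose selection strategy is learned and adaptive) is inverse-probability weighting: if client i participates in a round with probability p_i, reweighting its contribution by 1/p_i makes the aggregate an unbiased estimate of the full-participation average. All names and the scalar-update simplification below are illustrative assumptions.

```python
import random

def unbiased_aggregate(updates, availability, rng):
    """One round of availability-aware aggregation (illustrative sketch).

    updates:      dict client_id -> model update (a scalar here for simplicity)
    availability: dict client_id -> probability p_i that the client is online
    rng:          random.Random instance simulating which clients show up

    Each participating client's update is weighted by 1 / p_i
    (a Horvitz-Thompson estimator), so the expected aggregate equals
    the plain average over ALL clients, removing availability bias.
    """
    n = len(updates)
    total = 0.0
    for cid, delta in updates.items():
        if rng.random() < availability[cid]:  # client happened to be available
            total += delta / availability[cid]
    return total / n
```

Averaging this estimator over many simulated rounds converges to the true mean of all client updates, whereas naively averaging only the clients that showed up would drift towards the highly available ones; the price of unbiasedness is extra variance from rarely available clients, which is exactly the quantity the paper's adaptive selection strategy is designed to minimize.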
ISSN: 1932-4553
1941-0484
DOI: 10.1109/JSTSP.2022.3224590