FedSTS: A Stratified Client Selection Framework for Consistently Fast Federated Learning

Bibliographic details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-09, Vol. PP, pp. 1-15
Authors: Gao, Dehong; Song, Duanxiao; Shen, Guangyuan; Cai, Xiaoyan; Yang, Libin; Liu, Gongshen; Li, Xiaoyong; Wang, Zhen
Format: Article
Language: English
Description
Abstract: In this article, we investigate random client selection in the context of horizontal federated learning (FL), whereby only a randomly selected subset of clients transmit their model updates to the server rather than involving all clients. Many researchers have demonstrated that clustering-based client selection constitutes a simple yet efficacious approach to identifying clients with representative gradient information. Despite the extensive body of research on modified selection methodologies, the majority of prior work is predicated upon the assumption of consistently effective clustering. However, raw gradient-based clustering methods face several challenges: 1) poor effectiveness: the raw high-dimensional gradient of a client is too complex to serve as an appropriate feature for grouping, resulting in large intra-cluster distances; and 2) fluctuating effectiveness: due to inherent limitations of clustering, its effectiveness can vary significantly, yielding clusters with diverse levels of heterogeneity. In practice, suboptimal and inconsistent clustering can produce clusters with low intra-cluster similarity among clients, and selecting clients from such clusters may impede the overall convergence of training. In this article, we propose FedSTS, a novel client selection scheme that accelerates FL convergence via variance reduction. The main idea of FedSTS is to stratify a compressed model update to ensure an excellent grouping effect, while reducing cross-client variance by re-allocating the sampling chance among different groups according to their diverse heterogeneity. It achieves this convergence acceleration by paying more attention to those client groups with relatively low similarity, thereby improving the representativeness of the selected subset as much as possible.
Theoretically, we demonstrate the critical improvement of the proposed scheme in variance reduction and present equivalence conditions among different client selection methods. We also establish a tighter convergence guarantee for the proposed method thanks to the variance reduction. Experimental results confirm the superior efficiency of our approach compared to alternatives.
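The stratify-then-reallocate idea in the abstract can be sketched in a few lines of Python. This is only an illustrative sketch, not the paper's exact algorithm: the norm-based grouping stand-in, the function name `stratified_select`, and the spread-proportional budget allocation are all assumptions filling in for the paper's compressed-update stratification and heterogeneity-aware sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def stratified_select(updates, n_strata, budget, rng):
    """Illustrative stratified client selection (not the paper's algorithm).

    updates: (n_clients, d) array of (compressed) model updates.
    Clients are split into strata, and the sampling budget is allocated
    proportionally to each stratum's internal spread, so more
    heterogeneous (less similar) groups contribute more selected clients.
    """
    # Stand-in grouping: sort clients by update norm and split into
    # equal-size strata. The paper instead stratifies compressed updates.
    order = np.argsort(np.linalg.norm(updates, axis=1))
    strata = np.array_split(order, n_strata)

    # Heterogeneity of a stratum = mean distance to its centroid.
    spreads = np.array([
        np.linalg.norm(updates[s] - updates[s].mean(axis=0), axis=1).mean()
        for s in strata
    ])

    # Allocate the budget proportionally to spread, with at least one
    # client picked from every stratum.
    alloc = np.maximum(1, np.round(budget * spreads / spreads.sum())).astype(int)

    selected = []
    for s, k in zip(strata, alloc):
        selected.extend(rng.choice(s, size=min(k, len(s)), replace=False))
    return np.array(selected)

updates = rng.normal(size=(100, 16))   # toy stand-in for compressed updates
chosen = stratified_select(updates, n_strata=5, budget=10, rng=rng)
print(len(chosen))
```

Because the strata are disjoint, the selected subset contains no duplicates, and the spread-proportional allocation is what biases selection toward groups with low intra-cluster similarity.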
ISSN: 2162-237X
2162-2388
DOI:10.1109/TNNLS.2024.3438843