Adaptive Federated Learning With Non-IID Data

Bibliographic Details
Published in: Computer Journal, 2023-11, Vol. 66 (11), pp. 2758-2772
Main authors: Zeng, Yan; Mu, Yuankai; Yuan, Junfeng; Teng, Siyuan; Zhang, Jilin; Wan, Jian; Ren, Yongjian; Zhang, Yunquan
Format: Article
Language: English
Online access: Full text
Description
Abstract: With the widespread use of Internet of Things (IoT) devices, an enormous volume of data is generated, and it is a challenge to mine the value of IoT data while ensuring security and privacy. Federated learning is a decentralized approach for training on data located on edge devices, such as mobile phones and IoT devices, while preserving privacy, efficiency, and security. However, non-IID (non-independent and identically distributed) data greatly degrades the performance of the global model. In this paper, we propose the FedDynamic algorithm to address the statistical challenge of federated learning caused by non-IID data. Because non-IID data can lead to significant differences in model parameters between edge devices, we assign different weights to different devices during model aggregation in order to obtain a high-performance global model. We analyze and extract key indices that reflect the quality of a model (local model accuracy, local data quality, and the difference between local models and the global model) and calculate the aggregation weight of each edge device from these indices. Furthermore, we dynamically adjust the aggregation weights according to changes in accuracy to avoid weight staleness during training. Experiments on the MNIST, FMNIST, EMNIST, CINIC-10 and CIFAR-10 datasets show that the FedDynamic algorithm achieves better accuracy and convergence than the FedAvg, FedProx and Scaffold algorithms.
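The abstract only outlines how FedDynamic turns the three key indices into per-device aggregation weights; the exact formulas are in the full paper. The Python sketch below is a minimal, hypothetical illustration of that idea, assuming NumPy parameter dictionaries. The function names (aggregation_weights, aggregate, update_weights), the exponents alpha/beta/gamma, and the exponential penalty on the model difference are assumptions for illustration, not the paper's definitions.

```python
import numpy as np

def aggregation_weights(accuracies, data_sizes, model_diffs,
                        alpha=1.0, beta=1.0, gamma=1.0):
    """Combine the three indices named in the abstract into per-device weights.

    accuracies  : local validation accuracy of each device's model
    data_sizes  : number of local training samples per device (proxy for data quality)
    model_diffs : L2 distance between each local model and the current global model

    The functional form below (power terms plus an exponential penalty on the
    model difference) is an assumption, not the paper's exact formula.
    """
    acc = np.asarray(accuracies, dtype=float)
    size = np.asarray(data_sizes, dtype=float)
    diff = np.asarray(model_diffs, dtype=float)

    score = (acc ** alpha) * (size / size.sum()) ** beta * np.exp(-gamma * diff)
    return score / score.sum()

def aggregate(global_params, local_params_list, weights):
    """Weighted average of local model parameters (FedAvg-style aggregation)."""
    new_params = {}
    for name in global_params:
        stacked = np.stack([p[name] for p in local_params_list])  # (num_devices, ...)
        new_params[name] = np.tensordot(weights, stacked, axes=1)
    return new_params

def update_weights(prev_weights, prev_acc, new_acc, lr=0.5):
    """Adjust weights when a device's accuracy changes between rounds,
    counteracting weight staleness (again, a hypothetical update rule)."""
    delta = np.asarray(new_acc, dtype=float) - np.asarray(prev_acc, dtype=float)
    adjusted = np.asarray(prev_weights, dtype=float) * (1.0 + lr * delta)
    adjusted = np.clip(adjusted, 1e-8, None)
    return adjusted / adjusted.sum()
```

A server loop would call aggregation_weights once the local updates arrive, apply aggregate to form the new global model, and then use update_weights in later rounds so that devices whose accuracy improves gain influence while stale weights decay.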
ISSN: 0010-4620, 1460-2067
DOI: 10.1093/comjnl/bxac118