Local differentially private federated learning with homomorphic encryption



Bibliographic details
Published in: The Journal of supercomputing, 2023-11, Vol. 79 (17), p. 19365-19395
Authors: Zhao, Jianzhe; Huang, Chenxi; Wang, Wenji; Xie, Rulin; Dong, Rongrong; Matwin, Stan
Format: Article
Language: English
Abstract: Federated learning (FL) is an emerging distributed machine learning paradigm that trains models collaboratively without revealing private local data, thereby preserving privacy. However, limitations remain. On one hand, users' private information can be deduced from local outputs. On the other hand, privacy, efficiency, and accuracy are difficult to satisfy simultaneously because they are conflicting goals. To tackle these problems, we propose a novel privacy-preserving FL algorithm (HEFL-LDP), which integrates semi-homomorphic encryption and local differential privacy. While reducing the computational and communication burden, HEFL-LDP resists model inversion attacks and membership inference attacks from the server or a malicious client. Moreover, a new utility optimization strategy with accuracy-oriented privacy-parameter adjustment and model shuffling is proposed to address the resulting decline in accuracy. The security and cost of the algorithm are verified through theoretical analysis and proof. Comprehensive experimental evaluations on the MNIST and CIFAR-10 datasets demonstrate that HEFL-LDP significantly reduces the privacy budget and outperforms existing algorithms in computational cost and accuracy.
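The abstract combines two standard building blocks: an additively homomorphic cipher so the server can aggregate encrypted client updates, and local differential privacy so each client's update is noised before it leaves the device. A minimal toy sketch of that combination follows. It is not the paper's actual HEFL-LDP implementation: the tiny Paillier key, the clipping bound, the Gaussian-mechanism parameters, and the fixed-point scale are all illustrative assumptions.

```python
import math
import random

random.seed(0)  # deterministic demo run

# --- Toy Paillier (additively homomorphic) encryption ---
# Hypothetical tiny primes for illustration only; real deployments
# use ~2048-bit keys and a vetted cryptographic library.
P, Q = 104729, 1299709
N = P * Q
N2 = N * N
LAM = (P - 1) * (Q - 1) // math.gcd(P - 1, Q - 1)  # lcm(P-1, Q-1)
MU = pow(LAM, -1, N)                               # valid since g = N + 1

def encrypt(m):
    r = random.randrange(2, N)
    while math.gcd(r, N) != 1:
        r = random.randrange(2, N)
    return (pow(N + 1, m % N, N2) * pow(r, N, N2)) % N2

def decrypt(c):
    m = ((pow(c, LAM, N2) - 1) // N) * MU % N
    return m - N if m > N // 2 else m  # decode signed plaintexts

# --- Local DP on the client: clip the update, add Gaussian noise ---
def privatize(update, clip=1.0, eps=50.0, delta=1e-5):
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip / norm) if norm > 0 else 1.0
    sigma = clip * math.sqrt(2 * math.log(1.25 / delta)) / eps
    return [x * scale + random.gauss(0, sigma) for x in update]

SCALE = 1000  # fixed-point scaling: Paillier plaintexts are integers

def client_round(update):
    return [encrypt(round(x * SCALE)) for x in privatize(update)]

def server_aggregate(ciphertext_lists):
    # Homomorphic addition: multiplying ciphertexts adds plaintexts.
    sums = [1] * len(ciphertext_lists[0])
    for cts in ciphertext_lists:
        for i, c in enumerate(cts):
            sums[i] = sums[i] * c % N2
    n_clients = len(ciphertext_lists)
    return [decrypt(c) / SCALE / n_clients for c in sums]

# Three clients send encrypted, noised updates; the server sees only
# ciphertexts and recovers the noisy average after decryption.
updates = [[0.2, -0.1], [0.3, 0.1], [0.1, 0.0]]
avg = server_aggregate([client_round(u) for u in updates])
print(avg)  # close to the true mean [0.2, 0.0], up to the DP noise
```

Note the division of labor: differential privacy bounds what the decrypted aggregate reveals about any single client, while the encryption hides individual updates from the aggregating server entirely, which is the tension between privacy, efficiency, and accuracy that the abstract describes.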
ISSN: 0920-8542, 1573-0484
DOI: 10.1007/s11227-023-05378-x