Securing Federated Learning: Approaches, Mechanisms and Opportunities
Saved in:
Published in: Electronics (Basel), 2024-09, Vol. 13 (18), p. 3675
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: With the ability to analyze data, artificial intelligence technology and its offshoots have made difficult tasks easier. The tools of these technologies are now used in almost every aspect of life. For example, Machine Learning (ML), an offshoot of artificial intelligence, has become the focus of interest for researchers in industry, education, healthcare and other disciplines, and has proven to be as efficient as, and in some cases better than, experts at solving a variety of problems. However, the obstacles to ML's progress are still being explored, and Federated Learning (FL) has been presented as a solution to the problems of privacy and confidentiality. In the FL approach, users do not disclose their data throughout the learning process, which improves privacy and security. In this article, we look at the security and privacy concepts of FL and the threats and attacks it faces. We also address the security measures used in FL aggregation procedures. In addition, we examine and discuss the use of homomorphic encryption to protect FL data exchange, as well as other security strategies. Finally, we discuss open security and privacy issues in FL and what additional improvements could be made in this context to increase the efficiency of FL algorithms.
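The abstract's core claim — that FL clients never disclose their raw data, and only locally computed model updates are aggregated — can be illustrated with a minimal federated averaging (FedAvg) sketch. The linear model, learning rate, round count, and client datasets below are illustrative assumptions, not taken from the article:

```python
# Minimal FedAvg sketch (illustrative, not the article's method): each
# client trains a one-parameter linear model y = w * x on its own data,
# and only the trained weight -- never the data -- is sent for averaging.

def local_update(w, data, lr=0.01, epochs=5):
    """One client's local training; the raw `data` never leaves this call."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_datasets):
    """Server step: average the clients' locally trained weights."""
    updates = [local_update(global_w, d) for d in client_datasets]
    return sum(updates) / len(updates)

# Two hypothetical clients whose data stays on-device.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],   # client A: y ≈ 2x
    [(1.0, 2.2), (3.0, 6.0)],   # client B: y ≈ 2x (noisy)
]
w = 0.0
for _ in range(50):
    w = fedavg_round(w, clients)
print(round(w, 2))  # converges near the shared slope w ≈ 2
```

The server only ever sees the scalar weights returned by `local_update`, which is the privacy property the abstract highlights; real FL systems exchange full model parameter vectors or gradients in the same pattern.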
ISSN: 2079-9292
DOI: 10.3390/electronics13183675
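The abstract also mentions security measures used in FL aggregation procedures. One classic idea behind secure aggregation protocols — pairwise masking, where each pair of clients shares a random mask that one adds and the other subtracts, so masks cancel in the server's sum — can be sketched as follows. The modulus, fixed-point updates, and function names are illustrative assumptions, not the article's protocol:

```python
import random

M = 2 ** 32  # modulus for mask arithmetic (illustrative choice)

def masked_updates(updates, seed=0):
    """For each client pair (i, j), draw a shared random mask: client i
    adds it and client j subtracts it (mod M), so every mask cancels in
    the aggregate while individual updates stay hidden from the server."""
    rng = random.Random(seed)
    n = len(updates)
    masked = [u % M for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            m = rng.randrange(M)
            masked[i] = (masked[i] + m) % M
            masked[j] = (masked[j] - m) % M
    return masked

# Hypothetical fixed-point client updates (e.g. gradients scaled by 1000).
updates = [500, 1200, -300]
masked = masked_updates(updates)
total = sum(masked) % M  # the server aggregates only masked values
print(total)  # → 1400, the true sum, recovered without seeing any update
```

Production secure-aggregation protocols derive the pairwise masks from key agreement and handle client dropout; this sketch shows only the cancellation property that lets the server learn the sum and nothing else.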