Privacy-Preserving and Communication-Efficient Energy Prediction Scheme Based on Federated Learning for Smart Grids

Bibliographic Details
Published in: IEEE Internet of Things Journal, 2023-05, Vol. 10 (9), p. 1-1
Authors: Badr, Mahmoud M.; Mahmoud, Mohamed; Fang, Yuguang; Abdulaal, Mohammed; Aljohani, Abdulah J.; Alasmary, Waleed; Ibrahem, Mohamed I.
Format: Article
Language: English
Abstract: Energy forecasting is important because it enables infrastructure planning and power dispatching while reducing power outages and equipment failures. It is well known that federated learning (FL) can be used to build a global energy predictor for smart grids without revealing the customers' raw data, thereby preserving privacy. However, FL still exposes the local models' parameters during training, which may leak the customers' private data. In addition, the global model requires multiple training rounds to converge, which must be carried out in a communication-efficient way. Moreover, most existing works focus only on load forecasting while neglecting energy forecasting in net-metering systems. To address these limitations, in this paper we propose a privacy-preserving and communication-efficient FL-based energy predictor for net-metering systems. Based on a dataset of real power consumption/generation readings, we first propose a multi-data-source hybrid deep learning (DL)-based predictor that accurately forecasts future readings. Then, we repurpose an efficient inner-product functional encryption (IPFE) scheme to implement secure data aggregation, preserving the customers' privacy by encrypting their models' parameters during FL training. To address communication efficiency, we use a change-and-transmit (CAT) approach to update the local models' parameters, where only the parameters with sufficient changes are transmitted. Our extensive studies demonstrate that our approach accurately predicts future readings while providing privacy protection and high communication efficiency.
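
To illustrate the communication-efficiency idea summarized in the abstract, the following is a minimal Python sketch of a change-and-transmit (CAT) style parameter selection, assuming a simple per-parameter absolute-change threshold. The paper's actual CAT criterion, threshold values, and the IPFE-based encryption of the transmitted values are not detailed in this record, so the function name, dictionary layout, and `threshold` parameter here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cat_select_updates(current, last_sent, threshold=1e-3):
    """Change-and-transmit (CAT) style selection (illustrative sketch):
    return only the parameters whose change since the last transmitted
    version exceeds `threshold`; everything else is skipped this round.
    `current` and `last_sent` map parameter names to NumPy arrays."""
    updates = {}
    for name, value in current.items():
        delta = value - last_sent[name]
        mask = np.abs(delta) >= threshold        # "sufficient change" test (assumed absolute threshold)
        if np.any(mask):
            updates[name] = (mask, value[mask])  # indices + new values to transmit
            last_sent[name][mask] = value[mask]  # track what the aggregator now holds
    return updates

# Hypothetical usage: only one entry of "w" changed enough to be sent.
last = {"w": np.zeros((4, 4))}
new = {"w": np.zeros((4, 4))}
new["w"][0, 0] = 0.5
print(cat_select_updates(new, last))
```

In the scheme described by the abstract, the selected parameter values would additionally be encrypted with the IPFE scheme before being sent to the aggregator; that cryptographic layer is omitted from this sketch.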
ISSN: 2327-4662
DOI: 10.1109/JIOT.2022.3230586