Federated Learning-Based Multi-Energy Load Forecasting Method Using CNN-Attention-LSTM Model

Integrated Energy Microgrid (IEM) has emerged as a critical energy utilization mechanism for alleviating environmental and economic pressures. As a part of demand-side energy prediction, multi-energy load forecasting is a vital precondition for the planning and operation scheduling of IEM. In order to increase data diversity and improve model generalization while protecting data privacy, this paper proposes a federated learning-based CNN-Attention-LSTM method to forecast the multi-energy load of IEMs.

Detailed Description

Saved in:
Bibliographic Details
Published in: Sustainability 2022-10, Vol.14 (19), p.12843
Main Authors: Zhang, Ge; Zhu, Songyang; Bai, Xiaoqing
Format: Article
Language: English
Subjects:
Online Access: Full text
Description: Integrated Energy Microgrid (IEM) has emerged as a critical energy utilization mechanism for alleviating environmental and economic pressures. As a part of demand-side energy prediction, multi-energy load forecasting is a vital precondition for the planning and operation scheduling of IEM. In order to increase data diversity and improve model generalization while protecting data privacy, this paper proposes a method that uses the CNN-Attention-LSTM model based on federated learning to forecast the multi-energy load of IEMs. CNN-Attention-LSTM is the global model for extracting features. Federated learning (FL) helps IEMs to train a forecasting model in a distributed manner without sharing local data. This paper examines the individual, central, and federated models with four federated learning strategies (FedAvg, FedAdagrad, FedYogi, and FedAdam). Moreover, considering that FL uses communication technology, the impact of false data injection attacks (FDIA) is also investigated. The results show that federated models can achieve an accuracy comparable to the central model while having a higher precision than individual models, and FedAdagrad has the best prediction performance. Furthermore, FedAdagrad can maintain stability when attacked by false data injection.
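This record does not detail the CNN-Attention-LSTM architecture itself. A common pattern the name suggests is an attention layer that pools per-step LSTM outputs (computed over CNN-extracted features) into a single context vector for the forecast head. A minimal numpy sketch of that pooling step; the scoring parameterization, sequence length, and dimensions are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_pool(hidden, w):
    """Attention pooling over a sequence of hidden states.

    hidden: (T, d) array of per-step LSTM outputs.
    w:      (d,) learned scoring vector (hypothetical parameterization).
    Returns the attention-weighted context vector of shape (d,).
    """
    scores = hidden @ w      # (T,) unnormalized relevance score per time step
    alpha = softmax(scores)  # (T,) attention weights, summing to 1
    return alpha @ hidden    # (d,) weighted sum of hidden states

rng = np.random.default_rng(0)
h = rng.normal(size=(24, 8))  # e.g. 24 hourly steps, 8 hidden features
w = rng.normal(size=8)
context = attention_pool(h, w)
print(context.shape)          # -> (8,)
```

In a full model, `context` would feed a dense output layer that emits the multi-energy load forecast; the attention weights make the time steps the model relied on inspectable.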
DOI: 10.3390/su141912843
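The abstract compares four FL aggregation strategies (FedAvg, FedAdagrad, FedYogi, FedAdam). A hedged numpy sketch of one server round for the two the results single out: FedAvg (size-weighted averaging of client weights) and FedAdagrad (the averaged update treated as a pseudo-gradient with an Adagrad-style per-coordinate step). Hyperparameters, shapes, and the toy client data are illustrative assumptions, not values from the paper:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg: dataset-size-weighted average of client model weights."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

def fedadagrad_step(global_w, client_weights, client_sizes, v,
                    eta=0.1, tau=1e-3):
    """One FedAdagrad server round (adaptive federated optimization).

    The averaged client update acts as a pseudo-gradient; v accumulates
    its square, giving each coordinate its own effective learning rate.
    eta and tau are illustrative hyperparameters.
    """
    delta = fedavg(client_weights, client_sizes) - global_w  # pseudo-gradient
    v = v + delta ** 2                                       # accumulate squares
    new_w = global_w + eta * delta / (np.sqrt(v) + tau)
    return new_w, v

# toy round: three IEM clients with unequal local dataset sizes
global_w = np.zeros(4)
clients = [np.full(4, 1.0), np.full(4, 2.0), np.full(4, 4.0)]
sizes = [10, 20, 10]
avg = fedavg(clients, sizes)  # (1*10 + 2*20 + 4*10) / 40 -> 2.25 each
new_w, v = fedadagrad_step(global_w, clients, sizes, np.zeros(4))
```

The per-coordinate damping in `fedadagrad_step` is one plausible reason the paper finds FedAdagrad more stable under false data injection: a poisoned client inflates `v`, which shrinks subsequent steps along the attacked coordinates.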
Publisher: MDPI AG (Basel)
Rights: COPYRIGHT 2022 MDPI AG. 2022 by the authors; licensee MDPI, Basel, Switzerland. This article is open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
ISSN: 2071-1050
EISSN: 2071-1050
Source: Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals; MDPI - Multidisciplinary Digital Publishing Institute
Subjects:
Accuracy
Algorithms
Analysis
Artificial intelligence
Computational linguistics
Data security
Distributed generation
Economic forecasting
Energy industry
Energy utilization
Forecasting
Injection
Language processing
Learning
Learning strategies
Machine learning
Methods
Natural language interfaces
Neural networks
Operation scheduling
Optimization
Privacy
Sustainability