Optimized Deep Stacked Long Short-Term Memory Network for Long-Term Load Forecasting


Bibliographic Details
Published in: IEEE Access, 2021, Vol. 9, pp. 68511-68522
Authors: Farrag, Tamer Ahmed; Elattar, Ehab E.
Format: Article
Language: English
Online access: Full text
Description
Abstract: Long-term load forecasting (LTLF) is an essential process for the strategic planning of future extensions of any country's power system. Deep learning has become the heart of the machine learning paradigm; it is widely used in many fields and drives the current revolution in Artificial Intelligence (AI). In this paper, an optimized deep learning model based on a Stacked Long Short-Term Memory Network (SLSTMN) is proposed. The architecture of the model is optimized to find the best configuration using a Genetic Algorithm (GA), and the hyperparameters of the network are tuned using several deep learning techniques. During the optimization process, hundreds of model configurations are tested. The accuracy of the model is compared with that of many deep learning models and with related work in the field of LTLF. The dataset of the South Australia (SA) power system is used to test the compared models; it includes the maximum daily load, daily maximum temperature, daily minimum temperature, weekday, month, and holidays over 12 years, from 2005 to 2016. SLSTMN achieves excellent accuracy and the lowest error (almost 1%) when compared with other models on the same dataset and with related-work models on different datasets.
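The full paper is not reproduced in this record, so the following is only a minimal sketch of the kind of stacked LSTM forecaster the abstract describes, assuming a TensorFlow/Keras implementation; the layer count, unit size, and 30-day input window are hypothetical examples of the configuration choices a genetic algorithm would search over, not the authors' published settings.

```python
# Minimal sketch (not the authors' implementation): a stacked LSTM
# regressor for next-day peak-load forecasting, assuming TensorFlow/Keras.
import numpy as np
import tensorflow as tf


def build_stacked_lstm(n_layers: int, units: int, window: int, n_features: int) -> tf.keras.Model:
    """Map a window of daily features to the next day's maximum load."""
    model = tf.keras.Sequential()
    model.add(tf.keras.Input(shape=(window, n_features)))
    for i in range(n_layers):
        # All but the last LSTM layer return full sequences so layers can be stacked.
        model.add(tf.keras.layers.LSTM(units, return_sequences=(i < n_layers - 1)))
    model.add(tf.keras.layers.Dense(1))  # next-day maximum load
    model.compile(optimizer="adam", loss="mse")
    return model


# Example: 6 input features per day (max load, max/min temperature,
# weekday, month, holiday flag) over an assumed 30-day window.
model = build_stacked_lstm(n_layers=3, units=64, window=30, n_features=6)
x = np.random.rand(8, 30, 6).astype("float32")  # dummy batch for illustration
y = np.random.rand(8, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```

In the approach the abstract outlines, a GA would evaluate many such (depth, width) configurations and retain the one with the lowest validation error; the sketch above shows only a single fixed configuration.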
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3077275