Probabilistic load forecasting with a non-crossing sparse-group Lasso-quantile regression deep neural network


Bibliographic Details
Published in: Energy (Oxford), 2022-03, Vol. 242, p. 122955, Article 122955
Authors: Lu, Shixiang; Xu, Qifa; Jiang, Cuixia; Liu, Yezheng; Kusiak, Andrew
Format: Article
Language: English
Abstract: In this paper, a non-crossing sparse-group Lasso-quantile regression deep neural network (SGLQRDNN) model is proposed for electricity load forecasting. Unlike traditional deep learning models for point forecasting, the SGLQRDNN model produces probability density forecasts of the load. The model integrates two strategies to address quantile crossing and structural complexity: it mitigates quantile crossing by jointly estimating all quantiles under non-crossing constraints, and it shrinks the network and selects critical features with the sparse-group Lasso algorithm. The proposed model is trained and tested on residential daily electricity consumption data. The experimental results show that SGLQRDNN has advantages in interpretability, sparsity, and performance criteria. Specifically, the monotonicity of its internal quantiles is 4.18%–9.96% higher than that of the unconstrained model. Compared with three sparse regularization networks, SGLQRDNN shrinks 88.47% of the connection weights and 19.32% of the neurons, while improving performance by 5.76% to 18.28%. It also trains 2.73–7.01 times faster than models fitted to individual quantiles. Finally, two non-parametric tests verify that SGLQRDNN significantly outperforms the comparison models at the 10% level.

Highlights:
• The SGLQRDNN model is developed for deep learning under the framework of QR.
• SGLQRDNN mitigates the deficiency of quantile crossing with non-crossing constraints.
• SGLQRDNN shrinks the network with the sparse-group Lasso algorithm.
• SGLQRDNN has been applied to probabilistic forecasting for real industrial data.
• The superiority of SGLQRDNN is illustrated through extensive experiments.
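The record itself contains no code. The fragment below is a minimal PyTorch sketch of the two ideas named in the abstract: jointly estimating several quantiles with an output parameterization that cannot cross, and penalizing the input-layer weights with a sparse-group Lasso term. It is not the authors' SGLQRDNN implementation; the names NonCrossingQuantileNet, pinball_loss and sparse_group_lasso, the quantile levels, and the cumulative-softplus construction are illustrative assumptions.

    # Illustrative sketch only, not the paper's SGLQRDNN code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    QUANTILES = torch.tensor([0.1, 0.25, 0.5, 0.75, 0.9])  # assumed quantile levels

    class NonCrossingQuantileNet(nn.Module):  # hypothetical name
        def __init__(self, n_features, n_hidden, n_quantiles):
            super().__init__()
            self.input_layer = nn.Linear(n_features, n_hidden)
            self.hidden = nn.Linear(n_hidden, n_hidden)
            # One head predicts the lowest quantile; the other predicts
            # non-negative increments, so the stacked quantiles cannot cross.
            self.base = nn.Linear(n_hidden, 1)
            self.deltas = nn.Linear(n_hidden, n_quantiles - 1)

        def forward(self, x):
            h = torch.relu(self.hidden(torch.relu(self.input_layer(x))))
            base = self.base(h)                      # lowest quantile
            increments = F.softplus(self.deltas(h))  # forced >= 0
            upper = base + torch.cumsum(increments, dim=1)
            return torch.cat([base, upper], dim=1)   # monotone across quantiles

    def pinball_loss(pred, target, quantiles):
        # Average check (pinball) loss over all jointly estimated quantiles.
        err = target.unsqueeze(1) - pred             # shape (batch, n_quantiles)
        return torch.mean(torch.maximum(quantiles * err, (quantiles - 1) * err))

    def sparse_group_lasso(weight, lam=1e-3, alpha=0.5):
        # Sparse-group Lasso on the input-layer weight matrix, grouped by
        # input feature: the L1 term prunes single connections, the group
        # (L2) term can drop a whole feature's column of weights.
        l1 = weight.abs().sum()
        group = torch.sqrt((weight ** 2).sum(dim=0)).sum()
        return lam * (alpha * l1 + (1 - alpha) * group)

In a training loop one would minimize pinball_loss(model(x), y, QUANTILES) + sparse_group_lasso(model.input_layer.weight), so the check loss fits all quantiles jointly while the penalty prunes individual connections (L1 part) and entire input features (group part).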
ISSN: 0360-5442, 1873-6785
DOI: 10.1016/j.energy.2021.122955