Reinforcement Learning-Enhanced Adaptive Scheduling of Battery Energy Storage Systems in Energy Markets

Bibliographic Details
Published in: Energies (Basel), 2024-11, Vol. 17 (21), p. 5425
Main authors: Liu, Yang; Lu, Qiuyu; Yu, Zhenfan; Chen, Yue; Yang, Yinguo
Format: Article
Language: English
Online access: Full text

Description
Summary: Battery Energy Storage Systems (BESSs) play a vital role in modern power grids by optimally dispatching energy according to the price signal. This paper proposes a reinforcement learning-based model that optimizes BESS scheduling using a Q-learning algorithm combined with an epsilon-greedy strategy. The epsilon-greedy Q-learning algorithm can efficiently manage energy dispatching under uncertain price signals and across multi-day operation without retraining. Simulations are conducted under different scenarios that account for electricity price fluctuations and battery aging. Results show that the proposed algorithm achieves higher economic returns and better adaptability than traditional methods, providing a practical solution for intelligent BESS scheduling that supports grid stability and the efficient use of renewable energy.
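
The abstract does not include code, but for readers who want a concrete picture of the technique it names, the following is a minimal sketch of tabular Q-learning with an epsilon-greedy policy applied to a toy BESS price-arbitrage task. It is not the authors' model: the state discretization, price process, battery parameters, and reward definition below are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's implementation): tabular Q-learning with an
# epsilon-greedy policy for a simplified BESS arbitrage problem. States are
# (price level, state of charge); actions are charge / idle / discharge.
import numpy as np

rng = np.random.default_rng(0)

N_PRICE, N_SOC = 5, 11          # assumed discretization of price and state of charge
ACTIONS = (-1, 0, 1)            # -1 = discharge one bin, 0 = idle, +1 = charge one bin
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1
STEP_MWH, EFFICIENCY = 1.0, 0.9  # placeholder battery parameters

Q = np.zeros((N_PRICE, N_SOC, len(ACTIONS)))

def step(price_level, soc, action):
    """Apply one charge/discharge decision; return reward and next state (toy dynamics)."""
    price = 20.0 + 15.0 * price_level                     # assumed price mapping, $/MWh
    new_soc = int(np.clip(soc + action, 0, N_SOC - 1))
    delta = new_soc - soc                                 # energy actually moved (bins)
    # Revenue when discharging (with efficiency losses), cost when charging.
    reward = -delta * STEP_MWH * price * (EFFICIENCY if delta < 0 else 1.0)
    next_price = int(np.clip(price_level + rng.integers(-1, 2), 0, N_PRICE - 1))
    return reward, next_price, new_soc

for episode in range(2000):                               # repeated "days" of training
    price_level, soc = int(rng.integers(N_PRICE)), N_SOC // 2
    for t in range(24):                                   # hourly decisions over one day
        # Epsilon-greedy: explore with probability EPSILON, otherwise exploit.
        if rng.random() < EPSILON:
            a_idx = int(rng.integers(len(ACTIONS)))
        else:
            a_idx = int(np.argmax(Q[price_level, soc]))
        reward, next_price, next_soc = step(price_level, soc, ACTIONS[a_idx])
        # Standard Q-learning update toward the bootstrapped target.
        target = reward + GAMMA * np.max(Q[next_price, next_soc])
        Q[price_level, soc, a_idx] += ALPHA * (target - Q[price_level, soc, a_idx])
        price_level, soc = next_price, next_soc
```

Under these placeholder assumptions, the learned Q-table tends toward charging at low price levels and discharging at high ones; the paper itself evaluates its algorithm under electricity price fluctuations and battery aging conditions.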
ISSN: 1996-1073
DOI: 10.3390/en17215425