Learning a Multi-Agent Controller for Shared Energy Storage System
Format: Article
Language: English
Summary: Deployment of a shared energy storage system (SESS) allows users to draw on stored energy to meet their own demand and reduce energy costs without installing private storage equipment. In this paper, we consider a group of building users in a community with a SESS; each user can schedule power injection from both the grid and the SESS according to their demand and the real-time electricity price, minimizing energy cost while meeting energy demand. The SESS is encouraged to charge when the price is low, providing as much energy as possible to users while achieving cost savings. However, due to the complex dynamics of buildings and real-time external signals, finding high-performance power dispatch decisions in real time is challenging. By designing a multi-agent reinforcement learning framework with state-aware reward functions, the SESS and the users can schedule power so as to meet the users' energy demand and maintain the SESS's charging/discharging balance without additional communication, thereby optimizing energy use. Compared with a baseline approach without the SESS, energy costs are reduced by around 2.37% to 21.58%.
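The summary describes each user minimizing energy cost under a real-time price while covering its demand, with state-aware rewards guiding the agents. A minimal sketch of such a per-user objective is shown below; the function names, the flat `sess_price`, and the shortfall-penalty form are illustrative assumptions, not the paper's actual formulation.

```python
def step_cost(grid_kw, sess_kw, price, sess_price=0.0, dt=1.0):
    """Energy cost over one interval of dt hours.

    grid_kw: power drawn from the grid (kW)
    sess_kw: power drawn from the shared storage (kW)
    price: real-time grid electricity price ($/kWh)
    sess_price: assumed per-kWh charge for SESS energy (hypothetical)
    """
    return (grid_kw * price + sess_kw * sess_price) * dt

def user_reward(grid_kw, sess_kw, price, demand_kw, penalty=10.0):
    """State-aware reward sketch for one user agent: negative energy cost,
    minus a penalty proportional to any unmet demand (assuming demand must
    be covered by grid power plus SESS discharge)."""
    shortfall = max(0.0, demand_kw - (grid_kw + sess_kw))
    return -step_cost(grid_kw, sess_kw, price) - penalty * shortfall
```

For example, a user drawing 2 kW from the grid at $0.5/kWh with demand fully met receives reward -1.0; shifting part of that draw to the SESS when its effective price is lower raises the reward, which is the cost-saving incentive the summary describes.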
DOI: 10.48550/arxiv.2302.08328