Data-Driven Energy Management of an Electric Vehicle Charging Station Using Deep Reinforcement Learning

A charging station that integrates renewable energy sources is a promising solution to address the increasing demand for electric vehicle (EV) charging without expanding the distribution network. An efficient and flexible energy management strategy is essential for effectively integrating various energy sources and EVs.

Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, p. 65956-65966
Main Authors: Asha Rani, G. S., Lal Priya, P. S., Jayan, Jino, Satheesh, Rahul, Kolhe, Mohan Lal
Format: Article
Language: English
Subjects:
Online Access: Full Text
Description: A charging station that integrates renewable energy sources is a promising solution to address the increasing demand for electric vehicle (EV) charging without expanding the distribution network. An efficient and flexible energy management strategy is essential for effectively integrating various energy sources and EVs. This research work aims to develop an Energy Management System (EMS) for an EV charging station (EVCS) that minimizes the operating cost of the EVCS operator while meeting the energy demands of connected EVs. The proposed approach employs a model-free method leveraging Deep Reinforcement Learning (DRL) to identify optimal schedules of connected EVs in real time. A Markov Decision Process (MDP) model is constructed from the perspective of the EVCS operator. The real-world scenarios are formulated by considering the stochastic nature of renewable energy and the commuting behavior of EVs. Various DRL algorithms for addressing MDPs are examined, and their performances are empirically compared. Notably, the Truncated Quantile Critics (TQC) algorithm emerges as the superior choice, yielding enhanced model performance. The simulation findings show that the proposed EMS can offer an enhanced control strategy, reducing the charging cost for EVCS operators compared to other benchmark methods.
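The abstract describes an MDP built from the EVCS operator's perspective: the state captures battery charge, electricity price, and stochastic renewable output; the action is the charging power; and the reward is the negative operating cost, so that reward maximization minimizes charging cost. The paper's actual formulation is not reproduced here, but a minimal toy sketch of such an environment, with illustrative names and numbers, could look like this:

```python
import random

class ToyEVCSEnv:
    """Toy MDP for an EV charging station, sketched from the abstract's
    description. State = (EV state of charge, electricity price, PV
    output); action = charging power; reward = negative operating cost,
    so maximizing reward minimizes charging cost. All quantities and
    dynamics are illustrative, not taken from the paper."""

    def __init__(self, capacity_kwh=40.0, max_power_kw=10.0, horizon=24, seed=0):
        self.capacity = capacity_kwh
        self.max_power = max_power_kw
        self.horizon = horizon
        self.rng = random.Random(seed)

    def reset(self):
        self.t = 0
        self.soc = 0.2  # EV arrives at 20% state of charge
        return self._obs()

    def _obs(self):
        # Random price and PV output stand in for the stochasticity the
        # abstract mentions (renewable generation, commuting behavior).
        self.price = 0.10 + 0.10 * self.rng.random()        # $/kWh
        self.pv = max(0.0, 5.0 * self.rng.gauss(0.5, 0.3))  # kW of free solar
        return (self.soc, self.price, self.pv)

    def step(self, action_kw):
        """action_kw in [0, max_power]: charging power for this hour."""
        power = min(max(action_kw, 0.0), self.max_power)
        grid_kw = max(0.0, power - self.pv)  # PV offsets grid purchases
        cost = grid_kw * self.price          # one-hour time step
        self.soc = min(1.0, self.soc + power / self.capacity)
        self.t += 1
        done = self.t >= self.horizon
        # Penalize leaving the EV under-charged at departure.
        penalty = 10.0 * (1.0 - self.soc) if done else 0.0
        reward = -(cost + penalty)
        return self._obs(), reward, done

# Naive benchmark policy: always charge at maximum power.
env = ToyEVCSEnv()
obs = env.reset()
total_reward, done = 0.0, False
while not done:
    obs, r, done = env.step(10.0)
    total_reward += r
```

An environment with this reset/step interface is the shape that off-the-shelf DRL libraries expect; for instance, the `sb3-contrib` package provides a TQC implementation that can be trained against Gymnasium-style environments.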
DOI: 10.1109/ACCESS.2024.3398059
ISSN: 2169-3536
Source: IEEE Open Access Journals; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals
Subjects:
Algorithms
Alternative energy sources
Charging stations
Costs
Decision making
Deep learning
Deep reinforcement learning
electric vehicle
Electric vehicle charging
Electric vehicle charging stations
Electric vehicles
Energy management
energy management strategy
Energy resources
Markov decision process
Markov processes
Operating costs
renewable energy
Renewable energy sources
Renewable resources
truncated quantile critics
Uncertainty
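One of the keywords above, "truncated quantile critics", names the algorithm the abstract reports as best performing. Its central mechanism is to pool the quantile estimates of an ensemble of distributional critics, then discard the largest quantiles before forming the value target, which curbs the overestimation bias common in off-policy actor-critic methods. A minimal sketch of that truncation step, with illustrative numbers and not the paper's implementation:

```python
def truncated_quantile_target(critic_quantiles, drop_per_critic=2):
    """Sketch of the TQC truncation step: pool quantile estimates from
    all critics, sort them, drop the top `drop_per_critic` atoms per
    critic, and average what remains as the value estimate. Dropping the
    largest quantiles counters overestimation of the Q-value target.

    critic_quantiles: list of per-critic quantile lists (one list of
    floats per critic in the ensemble)."""
    pooled = sorted(q for critic in critic_quantiles for q in critic)
    keep = len(pooled) - drop_per_critic * len(critic_quantiles)
    truncated = pooled[:keep]  # keep only the smallest quantiles
    return sum(truncated) / len(truncated)

# Two critics with five quantile atoms each; dropping the top two atoms
# per critic keeps the six smallest of the ten pooled atoms.
estimate = truncated_quantile_target([[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]],
                                     drop_per_critic=2)
```

In the full algorithm the retained atoms serve as targets for quantile regression during critic training; averaging them here simply illustrates how truncation pulls the estimate below the plain ensemble mean.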