Power management in hybrid electric vehicles using deep recurrent reinforcement learning

Abstract: A power management framework for hybrid electric vehicles (HEVs) is proposed based on deep reinforcement learning (DRL) with a Long Short-Term Memory (LSTM) network to minimize fuel consumption by determining the power distribution between the two propulsion sources: the internal combustion engine (ICE) and the electric motor (EM). DRL is effective for handling the high-dimensional state and action spaces in the HEV power management problem, and the LSTM structure leverages temporal dependencies of the input information, providing internal state predictions automatically without introducing extra state variables. This technique is entirely online, meaning that the framework is constructed in real time during the training phase, independent of prior knowledge of driving cycles. The learned information stored in the LSTM network is utilized efficiently, and computational speed is enhanced by making multiple predictions simultaneously in each step. Simulation over various driving cycles demonstrates the efficacy of the proposed framework in fuel economy improvement.
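The abstract describes a recurrent Q-learning setup but gives no implementation details, so the following is a minimal illustrative sketch in PyTorch of an LSTM-based Q-network of the kind described. The observation variables, action discretization, network sizes, and all names below are assumptions chosen for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the paper's code): an LSTM Q-network mapping a
# short history of HEV observations to Q-values over a discretized ICE/EM
# power split. Dimensions, state variables, and action set are assumptions.
import torch
import torch.nn as nn

class RecurrentQNet(nn.Module):
    """LSTM Q-network: observation history -> Q-values per split action."""

    def __init__(self, obs_dim=3, hidden_dim=64, n_actions=11):
        super().__init__()
        # The LSTM carries temporal context (e.g. recent speed and power
        # demand), so no hand-crafted extra state variables are needed.
        self.lstm = nn.LSTM(obs_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, n_actions)

    def forward(self, obs_seq, hidden=None):
        # obs_seq: (batch, time, obs_dim). The LSTM emits an output at
        # every step, so one forward pass yields a Q-value prediction for
        # every step of the window at once.
        out, hidden = self.lstm(obs_seq, hidden)
        return self.head(out), hidden

# Assumed observation vector: [vehicle speed, power demand, battery SoC].
net = RecurrentQNet()
obs_seq = torch.randn(1, 20, 3)           # 20 most recent time steps
q_values, _ = net(obs_seq)                # shape: (1, 20, 11)
action = q_values[0, -1].argmax().item()  # greedy action at current step
ice_fraction = action / 10.0              # share of demand sent to the ICE
print(f"ICE share: {ice_fraction:.1f}, EM share: {1 - ice_fraction:.1f}")
```

Discretizing the ICE/EM split into a small action set keeps the Q-value head simple, and producing an output at every sequence step in a single pass loosely mirrors the abstract's point about making multiple predictions simultaneously in each step.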

Bibliographic details
Published in: Electrical engineering, 2022-06, Vol. 104 (3), pp. 1459-1471
Authors: Sun, Mengshu; Zhao, Pu; Lin, Xue
Format: Article
Language: English
Subjects: Deep learning; Economics and Management; Electric motors; Electric power distribution; Electric vehicles; Electrical Engineering; Electrical Machines and Networks; Energy Policy; Engineering; Fuel consumption; Fuel economy; Hybrid electric vehicles; Internal combustion engines; Original Paper; Power consumption; Power Electronics; Power management
Online access: Full text
DOI: 10.1007/s00202-021-01401-7
Publisher: Springer Berlin Heidelberg (Berlin/Heidelberg)
Rights: The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2021
ORCID: https://orcid.org/0000-0003-3540-1464
ISSN: 0948-7921
EISSN: 1432-0487
Source: SpringerLink Journals - AutoHoldings
subjects Deep learning
Economics and Management
Electric motors
Electric power distribution
Electric vehicles
Electrical Engineering
Electrical Machines and Networks
Energy Policy
Engineering
Fuel consumption
Fuel economy
Hybrid electric vehicles
Internal combustion engines
Original Paper
Power consumption
Power Electronics
Power management