EA-LSTM: Evolutionary attention-based LSTM for time series prediction
Saved in:
Published in: | Knowledge-based systems 2019-10, Vol.181, p.104785, Article 104785 |
---|---|
Main authors: | Li, Youru; Zhu, Zhenfeng; Kong, Deqiang; Han, Hua; Zhao, Yao |
Format: | Article |
Language: | eng |
Subjects: | Deep neural network; Evolutionary computation; Machine learning; Neural networks; Optimization; Parameters; Random search method; Time series; Time series prediction |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | |
container_start_page | 104785 |
container_title | Knowledge-based systems |
container_volume | 181 |
creator | Li, Youru; Zhu, Zhenfeng; Kong, Deqiang; Han, Hua; Zhao, Yao |
description | Time series prediction with deep learning methods, especially the Long Short-Term Memory (LSTM) neural network, has achieved significant results in recent years. Although LSTM can help capture long-term dependencies, its ability to pay different degrees of attention to sub-window features within multiple time steps is insufficient. To address this issue, an evolutionary attention-based LSTM trained with competitive random search is proposed for multivariate time series prediction. By transferring shared parameters, an evolutionary attention learning approach is introduced to LSTM. Thus, as in biological evolution, the pattern for importance-based attention sampling can be confirmed during temporal relationship mining. To avoid being trapped in local optima, as with traditional gradient-based methods, a competitive random search method inspired by evolutionary computation is proposed, which can effectively configure the parameters of the attention layer. Experimental results illustrate that the proposed model achieves competitive prediction performance compared with other baseline methods. |
doi_str_mv | 10.1016/j.knosys.2019.05.028 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0950-7051 |
ispartof | Knowledge-based systems, 2019-10, Vol.181, p.104785, Article 104785 |
issn | 0950-7051; 1872-7409 |
language | eng |
recordid | cdi_proquest_journals_2292058349 |
source | Elsevier ScienceDirect Journals |
subjects | Deep neural network; Evolutionary computation; Machine learning; Neural networks; Optimization; Parameters; Random search method; Time series; Time series prediction |
title | EA-LSTM: Evolutionary attention-based LSTM for time series prediction |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-01T20%3A47%3A52IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=EA-LSTM:%20Evolutionary%20attention-based%20LSTM%20for%20time%20series%20prediction&rft.jtitle=Knowledge-based%20systems&rft.au=Li,%20Youru&rft.date=2019-10-01&rft.volume=181&rft.spage=104785&rft.pages=104785-&rft.artnum=104785&rft.issn=0950-7051&rft.eissn=1872-7409&rft_id=info:doi/10.1016/j.knosys.2019.05.028&rft_dat=%3Cproquest_cross%3E2292058349%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2292058349&rft_id=info:pmid/&rft_els_id=S0950705119302400&rfr_iscdi=true |
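The competitive random search that the abstract describes for configuring the attention-layer parameters can be sketched as follows. This is a hypothetical simplification, not the authors' implementation: the function name, population size, perturbation scale, and the toy loss standing in for a validation loss are all illustrative assumptions.

```python
import random

def competitive_random_search(loss_fn, dim, pop_size=20, iters=100, seed=0):
    """Keep the lower-loss half of a random population of weight vectors
    and refill the rest with Gaussian perturbations of the winners."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=loss_fn)
    for _ in range(iters):
        winners = sorted(pop, key=loss_fn)[: pop_size // 2]
        # Losers are replaced by perturbed copies of randomly chosen winners,
        # so candidates compete each round rather than following a gradient.
        pop = winners + [
            [w + rng.gauss(0.0, 0.1) for w in rng.choice(winners)]
            for _ in range(pop_size - len(winners))
        ]
        cand = min(pop, key=loss_fn)
        if loss_fn(cand) < loss_fn(best):
            best = cand
    return best

# Toy stand-in for a validation loss over attention-layer weights
TARGET = [0.2, 0.5, 0.3]

def toy_loss(w):
    return sum((wi - ti) ** 2 for wi, ti in zip(w, TARGET))

weights = competitive_random_search(toy_loss, dim=3)
```

Because selection depends only on relative loss values, such a search needs no gradients, which is the property the abstract cites for escaping the local optima that gradient-based training can get stuck in.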