Time series prediction based on LSTM-attention-LSTM model

Time series forecasting uses data from past periods to predict future information, and is of great significance in many applications. Existing time series forecasting methods still suffer from low accuracy when dealing with some non-stationary, multivariate time series data. To address these shortcomings, this paper proposes a new time series forecasting model, LSTM-attention-LSTM. The model uses two LSTM networks as encoder and decoder, and introduces an attention mechanism between them. The model has two distinctive features: first, by using the attention mechanism to compute the interrelationships within the sequence data, it overcomes the drawback of the encoder-decoder model that the decoder cannot draw on sufficiently long input sequences; second, it is suitable for sequence forecasting over long time steps. The proposed model is validated on several real data sets, and the results show that LSTM-attention-LSTM predicts more accurately than several currently dominant models. The experiments also assess the effect of the attention mechanism at different time steps by varying the time step.
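The abstract describes, but does not reproduce, the attention step between the two LSTMs: each encoder hidden state is scored against the current decoder state, and a weighted context vector is formed from those scores. As a rough, generic illustration of that mechanism (not the paper's exact formulation — the scoring function, dimensions, and all names here are assumptions), a minimal dot-product attention sketch in plain Python:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def attention_context(decoder_state, encoder_states):
    """Score each encoder hidden state against the decoder state,
    normalize the scores with softmax, and return the weighted sum
    of encoder states (the context vector) plus the weights."""
    scores = [dot(decoder_state, h) for h in encoder_states]
    weights = softmax(scores)
    context = [
        sum(w * h[i] for w, h in zip(weights, encoder_states))
        for i in range(len(decoder_state))
    ]
    return context, weights

# Toy example: three encoder time steps with 2-dimensional hidden states.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
ctx, w = attention_context(dec, enc)
```

Because the decoder re-attends over all encoder states at every output step, the context vector is not limited by a single fixed-length summary — which is the limitation of the plain encoder-decoder model that the abstract says the attention mechanism overcomes.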


Saved in:
Bibliographic Details
Published in: IEEE Access, 2023-01, Vol. 11, p. 1-1
Main Authors: Wen, Xianyun; Li, Weibang
Format: Article
Language: English
Subjects:
Online Access: Full text
coden IAECCG
creator Wen, Xianyun
Li, Weibang
doi_str_mv 10.1109/ACCESS.2023.3276628
format Article
publisher Piscataway: IEEE
rights Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
orcidid 0000-0002-0947-4806
0009-0005-1719-5793
identifier ISSN: 2169-3536
source IEEE Open Access Journals; DOAJ Directory of Open Access Journals; EZB-FREE-00999 freely available EZB journals
subjects attention mechanisms
Autoregressive processes
Coders
Data models
Decoding
encoder and decoder model
Forecasting
Logic gates
long short-term memory networks
Mathematical models
Multivariate analysis
Predictive models
Sequences
Time series
Time series analysis
Time series forecasting