SageFormer: Series-Aware Framework for Long-term Multivariate Time Series Forecasting

In the burgeoning ecosystem of Internet of Things, multivariate time series (MTS) data has become ubiquitous, highlighting the fundamental role of time series forecasting across numerous applications. The crucial challenge of long-term MTS forecasting requires adept models capable of capturing both intra- and inter-series dependencies. Recent advancements in deep learning, notably Transformers, have shown promise. However, many prevailing methods either marginalize inter-series dependencies or overlook them entirely. To bridge this gap, this paper introduces a novel series-aware framework, explicitly designed to emphasize the significance of such dependencies. At the heart of this framework lies our specific implementation: the SageFormer. As a Series-aware Graph-enhanced Transformer model, SageFormer proficiently discerns and models the intricate relationships between series using graph structures. Beyond capturing diverse temporal patterns, it also curtails redundant information across series. Notably, the series-aware framework seamlessly integrates with existing Transformer-based models, enriching their ability to comprehend inter-series relationships. Extensive experiments on real-world and synthetic datasets validate the superior performance of SageFormer against contemporary state-of-the-art approaches.

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2024-11
Main Authors: Zhang, Zhenwei, Meng, Linghang, Gu, Yuantao
Format: Article
Language: eng
Subjects:
Online Access: Full text
container_title arXiv.org
creator Zhang, Zhenwei
Meng, Linghang
Gu, Yuantao
description In the burgeoning ecosystem of Internet of Things, multivariate time series (MTS) data has become ubiquitous, highlighting the fundamental role of time series forecasting across numerous applications. The crucial challenge of long-term MTS forecasting requires adept models capable of capturing both intra- and inter-series dependencies. Recent advancements in deep learning, notably Transformers, have shown promise. However, many prevailing methods either marginalize inter-series dependencies or overlook them entirely. To bridge this gap, this paper introduces a novel series-aware framework, explicitly designed to emphasize the significance of such dependencies. At the heart of this framework lies our specific implementation: the SageFormer. As a Series-aware Graph-enhanced Transformer model, SageFormer proficiently discerns and models the intricate relationships between series using graph structures. Beyond capturing diverse temporal patterns, it also curtails redundant information across series. Notably, the series-aware framework seamlessly integrates with existing Transformer-based models, enriching their ability to comprehend inter-series relationships. Extensive experiments on real-world and synthetic datasets validate the superior performance of SageFormer against contemporary state-of-the-art approaches.
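The abstract describes SageFormer as modeling inter-series relationships through a learned graph structure layered onto a Transformer backbone. The snippet below is a minimal illustrative sketch of that general idea, not the paper's actual implementation: the function name `inter_series_mixing`, the node-embedding similarity adjacency, and all dimensions are assumptions chosen for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for row-normalizing the adjacency.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def inter_series_mixing(H, E):
    """Sketch of graph-enhanced mixing across series (hypothetical API).

    H: (num_series, d) per-series representations, e.g. pooled tokens
       produced by a Transformer encoder applied to each series.
    E: (num_series, k) learnable node embeddings; the adjacency is
       inferred from their pairwise similarity, a common device in
       graph-learning forecasters.
    """
    A = softmax(E @ E.T, axis=-1)   # learned, row-normalized adjacency
    return A @ H                     # each series aggregates related series

rng = np.random.default_rng(0)
num_series, d, k = 7, 16, 4
H = rng.normal(size=(num_series, d))
E = rng.normal(size=(num_series, k))
out = inter_series_mixing(H, E)      # shape (7, 16)
```

In this sketch the graph step is orthogonal to the temporal model, which mirrors the abstract's claim that the series-aware framework can be layered onto existing Transformer-based forecasters.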
doi_str_mv 10.48550/arxiv.2307.01616
format Article
fullrecord Published version DOI: 10.1109/JIOT.2024.3363451
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-11
issn 2331-8422
language eng
recordid cdi_arxiv_primary_2307_01616
source arXiv.org; Free E-Journals
subjects Computer Science - Artificial Intelligence
Computer Science - Learning
Forecasting
Graphical representations
Mathematical models
Multivariate analysis
Synthetic data
Time series
title SageFormer: Series-Aware Framework for Long-term Multivariate Time Series Forecasting