DBAFormer: A Double-Branch Attention Transformer for Long-Term Time Series Forecasting

Bibliographic Details
Published in: Human-Centric Intelligent Systems 2023-07, Vol. 3 (3), p. 263-274
Main authors: Huang, Ji, Ma, Minbo, Dai, Yongsheng, Hu, Jie, Du, Shengdong
Format: Article
Language: English
Online access: Full text
Description
Abstract: Transformer-based approaches excel at long-term time series forecasting. These models leverage stacked structures and self-attention mechanisms, enabling them to effectively model dependencies in series data. While some approaches use sparse attention to tackle the quadratic time complexity of self-attention, doing so can limit information utilization. We introduce a novel double-branch attention mechanism that simultaneously captures intricate dependencies from both temporal and variable perspectives. Moreover, we propose query-independent attention, motivated by the observation that self-attention allocates near-identical attention to different query positions; this improves efficiency and reduces the impact of redundant information. We integrate the double-branch query-independent attention into popular transformer-based methods such as Informer, Autoformer, and the Non-stationary Transformer. Experiments on six practical benchmarks consistently validate that our attention mechanism substantially improves long-term series forecasting performance compared with the baseline approaches.
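
As an illustrative aid, the following minimal PyTorch sketch shows one way the two ideas in the abstract could be realized. It assumes "query-independent attention" means a single input-derived attention distribution shared by all query positions, and that the two branches attend along the temporal and variable axes before being fused by summation. The class names, projections, and fusion choice are hypothetical; the abstract does not give the paper's exact formulation.

import torch
import torch.nn as nn


class QueryIndependentAttention(nn.Module):
    # Sketch: one attention distribution, derived from the inputs alone,
    # is shared by every query position, so the quadratic score matrix of
    # standard self-attention collapses to a single softmax per sequence.
    def __init__(self, d_model):
        super().__init__()
        self.score = nn.Linear(d_model, 1)        # per-token importance score
        self.value = nn.Linear(d_model, d_model)  # value projection
        self.out = nn.Linear(d_model, d_model)    # output projection

    def forward(self, x):                          # x: (batch, tokens, d_model)
        w = torch.softmax(self.score(x), dim=1)    # (B, N, 1), same for all queries
        ctx = (w * self.value(x)).sum(dim=1, keepdim=True)  # (B, 1, D) shared context
        return self.out(x + ctx)                   # broadcast context to every position


class DoubleBranchAttention(nn.Module):
    # Sketch of the double-branch idea: one branch treats time steps as
    # tokens (temporal dependencies), the other treats variables as
    # tokens (cross-variable dependencies); the two views are summed.
    def __init__(self, seq_len, n_vars):
        super().__init__()
        self.temporal = QueryIndependentAttention(n_vars)
        self.variable = QueryIndependentAttention(seq_len)

    def forward(self, x):                          # x: (batch, seq_len, n_vars)
        t = self.temporal(x)                                   # attend over time steps
        v = self.variable(x.transpose(1, 2)).transpose(1, 2)   # attend over variables
        return t + v                                           # fuse both perspectives


x = torch.randn(8, 96, 7)                  # 8 series, 96 time steps, 7 variables
y = DoubleBranchAttention(seq_len=96, n_vars=7)(x)
print(y.shape)                             # torch.Size([8, 96, 7])

Note that the shared context vector makes each branch linear in sequence length, which is the efficiency argument the abstract makes against per-query attention scores.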
ISSN: 2667-1336
DOI: 10.1007/s44230-023-00037-z