Adversarial self-attentive time-variant neural networks for multi-step time series forecasting
Published in: | Expert systems with applications 2023-11, Vol.231, p.120722, Article 120722 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
Abstract: | Accurate forecasting of time series mitigates the uncertainty of future outlooks and is a great help in reducing errors in decisions. Despite years of research, there are still some challenges to accurate forecasting of time series, including the difficulty of dynamic modeling, the problem of capturing short-term correlations, and the conundrum of long-term forecasting. This paper offers an Adversarial Truncated Cauchy Self-Attentive Time-Variant Neural Network (ASATVN) for multi-step-ahead time series forecasting. Specifically, the proposed model builds on Generative Adversarial Networks, in which the generator is composed of a novel time-variant model. The time-variant model contributes to learning dynamic time-series changes with its time-variant architecture and employs a newly proposed Truncated Cauchy Self-Attention block to better capture local sequential dependencies. For the discriminator, two self-attentive discriminators are presented to regularize predictions with fidelity and continuity, which is beneficial for predicting sequences over longer time horizons. Our proposed ASATVN model outperforms state-of-the-art predictive models on eleven real-world benchmark datasets, demonstrating its effectiveness.
•A time-variant network learns various dynamics across multiple time steps. •A new self-attention block is more sensitive to the local context of time series. •Two discriminators regularize the predictor to offer realistic and continuous forecasts. |
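The record describes the Truncated Cauchy Self-Attention block only at a high level. The following Python sketch illustrates one plausible reading of the idea, not the paper's actual formulation: standard scaled dot-product self-attention whose scores are modulated by a Cauchy-shaped locality prior over relative positions, truncated to zero outside a local window. The function names and the `gamma` and `window` parameters are illustrative assumptions.

```python
import numpy as np

def truncated_cauchy_weights(seq_len, gamma=2.0, window=4):
    # Hypothetical locality prior: a Cauchy kernel over the relative
    # distance |i - j|, truncated (set to 0) beyond a local window.
    idx = np.arange(seq_len)
    dist = np.abs(idx[:, None] - idx[None, :])
    w = 1.0 / (1.0 + (dist / gamma) ** 2)  # Cauchy-shaped decay with distance
    w[dist > window] = 0.0                 # truncation outside the window
    return w

def local_self_attention(x, Wq, Wk, Wv, gamma=2.0, window=4):
    # Standard scaled dot-product self-attention, with scores biased by the
    # (log of the) truncated Cauchy weights so attention stays local.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d)
    scores = scores + np.log(truncated_cauchy_weights(len(x), gamma, window) + 1e-9)
    attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
    attn /= attn.sum(axis=-1, keepdims=True)  # softmax over each row
    return attn @ v

# Example: positions more than `window` steps apart receive (near-)zero weight,
# so each output mixes only nearby time steps, emphasizing short-term context.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
W = rng.normal(size=(4, 4))
out = local_self_attention(x, W, W, W, window=3)
```

Truncating the kernel is what distinguishes this from a plain distance-decay bias: beyond the window the contribution is cut off entirely rather than merely down-weighted.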
ISSN: | 0957-4174 1873-6793 |
DOI: | 10.1016/j.eswa.2023.120722 |