GDFormer: A Graph Diffusing Attention based approach for Traffic Flow Prediction


Detailed description

Bibliographic details
Published in: Pattern Recognition Letters, 2022-04, Vol. 156, pp. 126-132
Authors: Su, Jie; Jin, Zhongfu; Ren, Jie; Yang, Jiandang; Liu, Yong
Format: Article
Language: English
Online access: Full text
Description
Abstract: Highlights: • Traffic flow prediction. • Transformer-based approach. • Integration of diffusion processes and attention mechanisms. • Applicable to other spatial-temporal data sets. In this paper, we propose a novel traffic flow prediction approach called the Graph Diffusing trans-Former (GDFormer). GDFormer adopts the transformer architecture, consisting of an encoder sequence and a decoder sequence. Both the encoder and the decoder are built from the newly designed Graph Diffusing Attention (GDA) module together with auxiliary components. The GDA module uses query-key-value attention to learn the diffusion parameters for each diffusion step and dynamically updates the adjacency transition matrix, which reflects the continually changing traffic flow between traffic monitors. To verify the efficiency of our approach, we conduct extensive experiments on two real-world data sets. Compared with the benchmarks, our approach achieves state-of-the-art performance. Ablation experiments illustrate the effectiveness of the model's key components. For ease of reproducibility, the code, the processed real-world data sets, and the evaluation results are available at https://github.com/dublinsky/GDFormer.
ISSN:0167-8655
1872-7344
DOI:10.1016/j.patrec.2022.03.005