The Prediction of Multistep Traffic Flow Based on AST-GCN-LSTM

Detailed Description

Bibliographic Details
Published in: Journal of Advanced Transportation 2021-12, Vol. 2021, p. 1-10
Main authors: Hou, Fan; Zhang, Yue; Fu, Xinli; Jiao, Lele; Zheng, Wen
Format: Article
Language: English
Online access: Full text
Description
Abstract: Addressing the problem of traffic flow prediction on a road network, this paper proposes a multistep traffic flow prediction model based on an attention-based spatial-temporal graph convolutional network combined with a long short-term memory network (AST-GCN-LSTM). The model captures the complex spatial dependence of road nodes on the road network, using local spectral graph convolution (LSGC) to extract spatial correlation features from the K-order local neighbors of the road-segment nodes. Replacing the single-hop adjacency matrix with the K-order local neighborhood expands the receptive field of the graph convolution and extracts neighbor-node information more accurately, since the high-order neighborhood of each road node is fully considered rather than features being drawn only from first-order neighbors. In addition, an external attribute enhancement unit is designed to incorporate external factors that affect traffic flow (weather, points of interest, time, etc.) and thereby improve the model's prediction accuracy. The experimental results show that when static attributes, dynamic attributes, and their combination are considered, the model performs well: RMSE of 4.0406, 4.0362, and 4.0234; MAE of 2.7184, 2.7044, and 2.7030; and accuracy of 0.7132, 0.7190, and 0.7223, respectively.
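The K-order neighborhood construction described in the abstract can be sketched briefly. The snippet below is an illustration, not the authors' implementation: the LSGC internals, the choice K = 2, the feature dimensions, and the helper names (k_order_neighborhood, lsgc_layer) are assumptions made for this example. It builds the K-order local neighborhood matrix as the binarized union of 1- to K-hop reachability (with self-loops) and applies one symmetrically normalized graph-convolution step with it.

```python
import numpy as np

def k_order_neighborhood(A: np.ndarray, K: int) -> np.ndarray:
    """Binary matrix whose (i, j) entry is 1 if node j is reachable
    from node i in at most K hops (self-loops included)."""
    n = A.shape[0]
    reach = np.eye(n)  # 0 hops: every node reaches itself
    power = np.eye(n)
    for _ in range(K):
        power = (power @ A > 0).astype(float)      # exactly one more hop
        reach = ((reach + power) > 0).astype(float)  # union of 0..t hops
    return reach

def lsgc_layer(A_k: np.ndarray, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One graph-convolution step with the symmetrically normalized
    K-order matrix: ReLU(D^-1/2 A_k D^-1/2 X W)."""
    d = A_k.sum(axis=1)                 # degrees are >= 1 (self-loops)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return np.maximum(D_inv_sqrt @ A_k @ D_inv_sqrt @ X @ W, 0.0)

# Toy road network: 4 segments in a line (0-1-2-3). With K=2, node 0
# already aggregates features from node 2, which a single-hop GCN misses.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.random((4, 3))  # 3 traffic features per road node (illustrative)
W = rng.random((3, 8))  # layer weights, random stand-ins for learned ones
H = lsgc_layer(k_order_neighborhood(A, K=2), X, W)
print(H.shape)          # (4, 8)
```

On the toy line graph, a single-hop convolution at node 0 would see only node 1, while the K = 2 matrix lets it also aggregate node 2's features, which is exactly the wider receptive field the abstract describes.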
ISSN: 0197-6729; 2042-3195
DOI: 10.1155/2021/9513170