TrafficGAN: Network-Scale Deep Traffic Prediction With Generative Adversarial Nets

Bibliographic Details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2021-01, Vol. 22 (1), pp. 219-230
Main authors: Zhang, Yuxuan; Wang, Senzhang; Chen, Bing; Cao, Jiannong; Huang, Zhiqiu
Format: Article
Language: English
Description
Abstract: Traffic flow prediction has received increasing research interest recently, since it is a key step in preventing and relieving traffic congestion in urban areas. Existing methods mostly focus on road-level or region-level traffic prediction, and fail to deeply capture the high-order spatial-temporal correlations among road links needed to perform road network-level prediction. In this paper, we propose a network-scale deep traffic prediction model called TrafficGAN, in which Generative Adversarial Nets (GAN) are utilized to predict traffic flows under an adversarial learning framework. To capture the spatial-temporal correlations among the road links of a road network, both Convolutional Neural Nets (CNN) and Long Short-Term Memory (LSTM) models are embedded into TrafficGAN. In addition, we design a deformable convolution kernel for the CNN so that it better handles the input road network data. We extensively evaluate our proposal on two large GPS probe datasets covering the arterial road network of downtown Chicago and the Bay Area of California. The results show that TrafficGAN significantly outperforms both traditional statistical models and state-of-the-art deep learning models in network-scale short-term traffic flow prediction.
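The abstract's key architectural idea is a deformable convolution kernel, which samples the input at learned per-position offsets rather than on a fixed grid, so the receptive field can follow the irregular shape of a road network. The sketch below is an illustration of that general mechanism only, not the paper's implementation; all names, array shapes, and the single-channel 3x3 setting are assumptions made for clarity.

```python
import numpy as np

def bilinear_sample(feat, y, x):
    """Bilinearly interpolate a 2-D feature map at fractional coords (y, x)."""
    H, W = feat.shape
    y = np.clip(y, 0, H - 1)
    x = np.clip(x, 0, W - 1)
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, H - 1), min(x0 + 1, W - 1)
    wy, wx = y - y0, x - x0
    return ((1 - wy) * (1 - wx) * feat[y0, x0]
            + (1 - wy) * wx * feat[y0, x1]
            + wy * (1 - wx) * feat[y1, x0]
            + wy * wx * feat[y1, x1])

def deformable_conv2d(feat, kernel, offsets):
    """3x3 deformable convolution on a single-channel map (hypothetical API).

    feat:    (H, W) input feature map
    kernel:  (3, 3) weights
    offsets: (H, W, 9, 2) learned (dy, dx) shifts added to the regular
             3x3 sampling grid at each output position
    """
    H, W = feat.shape
    out = np.zeros((H, W))
    grid = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    for i in range(H):
        for j in range(W):
            acc = 0.0
            for k, (dy, dx) in enumerate(grid):
                oy, ox = offsets[i, j, k]
                # Sample at the regular grid point plus its learned offset.
                acc += kernel[dy + 1, dx + 1] * bilinear_sample(
                    feat, i + dy + oy, j + dx + ox)
            out[i, j] = acc
    return out
```

With all offsets set to zero this reduces to an ordinary 3x3 convolution; nonzero offsets let each output position pull information from off-grid locations, which is the property the abstract credits for handling road network data. In practice the offsets would themselves be produced by a small learned network.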
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2019.2955794