Recurrent Neural Network for Estimating Speed Using Probe Vehicle Data in an Urban Area


Bibliographic Details
Published in: Transportation Research Record, 2022-01, Vol. 2676 (1), p. 518-531
Main Authors: Yang, Jae Hwan; Kim, Dong-Kyu; Kho, Seung-Young
Format: Article
Language: English
Online Access: Full text
Description
Summary: Urban traffic networks consist of many interconnected links and numerous intersections, so analytical approaches and parametric models designed for arterial or highway roads estimate driving speeds in them with low accuracy. In this study, a model is developed to estimate link speed from speed data collected by probe vehicles driven across urban traffic links with interrupted flows. We find multimodal distributions of travel speeds in each link's probe vehicle data and use them to separate the vehicle groups and calculate the mean speed of each link. This strategy yields more detailed data, which are used to determine the traffic state and increase the model's accuracy. The developed nonlinear model, suited to the low correlations between consecutive links' speed data, is built on a recurrent neural network. Moreover, this study merges three machine-learning techniques to handle the low correlations between link properties and speed states. The developed model lowered the mean absolute error by 35.9% on average compared with a long short-term memory network trained on raw data: 46.8% for the slow state, 55.7% for state changes, and 48.0% for sudden changes of more than 10 km/h.
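The abstract does not specify how the multimodal speed distributions are separated into vehicle groups. Purely as an illustration of the idea, the sketch below fits a two-component 1D Gaussian mixture by expectation-maximization to split a bimodal link-speed sample (e.g., queued vs. free-flowing probes) and recover each group's mean speed; the function name, initialization, and toy data are assumptions, not the paper's method.

```python
import math

def fit_gmm_1d(speeds, n_iter=200):
    """Illustrative two-component 1D Gaussian mixture fitted by EM.

    Splits a bimodal sample of link speeds (km/h) into two groups,
    e.g. interrupted (queued) probes vs. free-flowing probes.
    Returns (means, variances, weights) for the two components.
    """
    lo, hi = min(speeds), max(speeds)
    mu = [lo, hi]            # initialize means at the sample extremes
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each speed
        resp = []
        for x in speeds:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(p) or 1e-12   # guard against numerical underflow
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(speeds)
            mu[k] = sum(r[k] * x for r, x in zip(resp, speeds)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, speeds)) / nk, 1e-6)
    return mu, var, w

# Toy bimodal sample: queued probes near 5 km/h, free flow near 40 km/h
speeds = [4, 5, 6, 5, 3, 38, 41, 40, 39, 42]
mu, var, w = fit_gmm_1d(speeds)
slow, fast = sorted(mu)   # mean speed of each separated group
```

Once the groups are separated this way, a per-group mean (rather than one mean over the mixed sample) gives the more detailed traffic-state label the abstract refers to.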
ISSN:0361-1981
2169-4052
DOI:10.1177/03611981211036371