Real-Time Vehicle Trajectory Prediction for Traffic Conflict Detection at Unsignalized Intersections
Published in: | Journal of Advanced Transportation, 2021-12, Vol. 2021, p. 1-15 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Full text |
Abstract: | Real-time prediction of vehicle trajectories at unsignalized intersections is important for traffic conflict detection and early warning, and thus for improving traffic safety at such intersections. In this study, we propose a robust real-time prediction method for turning movements and vehicle trajectories based on deep neural networks. First, a vision-based vehicle trajectory extraction system is developed to collect vehicle trajectories with left-turn, straight-through, and right-turn labels, which are used to train turning recognition models and multilayer LSTM deep neural networks for the prediction task. Then, for trajectory prediction, we propose a vehicle heading-angle change trend method that recognizes whether the target vehicle will turn left, go straight, or turn right from the characteristics of its trajectory before it passes the stop line. Finally, the trained multilayer LSTM models for left-turn, straight-through, and right-turn movements are used to predict the trajectory of the target vehicle through the intersection. On the TensorFlow-GPU platform, Yolov5-DeepSort is used to automatically extract vehicle trajectory data at unsignalized intersections. The experimental results show that the proposed method performs well in terms of both speed and accuracy. |
ISSN: | 0197-6729, 2042-3195 |
DOI: | 10.1155/2021/8453726 |
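
The abstract describes a two-stage pipeline: a heading-angle change trend test that recognizes the upcoming turning movement before the vehicle passes the stop line, followed by a movement-specific multilayer LSTM that predicts the trajectory through the intersection. The Python/TensorFlow sketch below illustrates one possible realization of that idea; the function names, the 20° threshold, the sequence lengths, and the layer sizes are illustrative assumptions, not the parameters reported in the paper.

```python
# A minimal sketch of the pipeline described in the abstract.
# Thresholds, layer sizes, and sequence lengths are assumptions for illustration.
import numpy as np
import tensorflow as tf


def heading_change_trend(track_xy):
    """Net heading-angle change (degrees) over a trajectory segment.

    track_xy: array of shape (T, 2) with (x, y) positions sampled
    before the vehicle crosses the stop line.
    """
    dx = np.diff(track_xy[:, 0])
    dy = np.diff(track_xy[:, 1])
    headings = np.unwrap(np.arctan2(dy, dx))       # radians, without 2*pi jumps
    return np.degrees(headings[-1] - headings[0])  # net change in heading


def recognize_turn(track_xy, threshold_deg=20.0):
    """Classify the upcoming movement from the pre-stop-line heading trend.

    The sign convention (left = positive) depends on the image coordinate
    frame and is assumed here; the 20-degree threshold is also assumed.
    """
    change = heading_change_trend(track_xy)
    if change > threshold_deg:
        return "left"
    if change < -threshold_deg:
        return "right"
    return "straight"


def build_multilayer_lstm(obs_len=20, pred_len=30, features=2):
    """Stacked (multilayer) LSTM mapping an observed trajectory segment
    to a predicted segment through the intersection."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(obs_len, features)),
        tf.keras.layers.LSTM(128, return_sequences=True),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(pred_len * features),
        tf.keras.layers.Reshape((pred_len, features)),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model


# One model per movement class; the recognized turn selects which model
# is queried, mirroring the per-movement LSTMs mentioned in the abstract.
models = {m: build_multilayer_lstm() for m in ("left", "straight", "right")}
```

In this layout, `recognize_turn` only gates model selection, so the trajectory predictor itself stays unchanged regardless of how the turning movement is recognized.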