An integrated framework for accurate trajectory prediction based on deep learning

Bibliographic Details
Published in: Applied intelligence (Dordrecht, Netherlands), 2024-10, Vol. 54 (20), p. 10161-10175
Authors: Zhao, Shuo, Li, Zhaozhi, Zhu, Zikun, Chang, Charles, Li, Xin, Chen, Ying-Chi, Yang, Bo
Format: Article
Language: English
Online access: Full text
Description
Abstract: Trajectory prediction for moving objects is a critical task in intelligent transportation, with numerous applications such as route planning, traffic management, and congestion alleviation. In this paper, we propose a novel framework that integrates sequence modeling, trajectory clustering, and topology extraction to improve the accuracy of trajectory prediction. By incorporating self-attention for sequence modeling, we effectively capture the temporal dependencies in trajectory data. Additionally, by taking into account clustering information via a variational auto-encoder and topological information from a graph neural network (GNN), we further improve prediction accuracy. Moreover, the GNN enables our framework to handle diverse characteristics of road networks, such as road distance and traffic status, making the proposed approach adaptive to different practical scenarios. As demonstrated by experimental results on two publicly available datasets, our method improves accuracy by up to 0.5% and 3.8% for 1-step and 15-step predictions, respectively, compared to the state-of-the-art method.
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-024-05724-3
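
The abstract names three components: a self-attention sequence encoder, a variational auto-encoder (VAE) for trajectory clustering, and a GNN over the road network. The PyTorch sketch below shows one minimal way such components could be wired together; the class name, layer choices, dimensions, and the late-fusion prediction head are illustrative assumptions and not the authors' published implementation.

```python
# Minimal sketch (assumed design, not the paper's architecture): a self-attention
# trajectory encoder, a VAE-style latent "cluster" code, and a one-layer GNN over
# the road graph, fused to predict the next coordinate.
import torch
import torch.nn as nn


class TrajectoryPredictor(nn.Module):
    def __init__(self, coord_dim=2, hidden=64, latent=16):
        super().__init__()
        # Self-attention over the observed trajectory (temporal dependencies).
        self.embed = nn.Linear(coord_dim, hidden)
        enc_layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.seq_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)

        # VAE branch: map the trajectory summary to a latent cluster code.
        self.mu = nn.Linear(hidden, latent)
        self.logvar = nn.Linear(hidden, latent)

        # One-layer GNN over the road network: relu(A_hat @ X @ W).
        self.gnn_weight = nn.Linear(hidden, hidden, bias=False)

        # Fuse sequence, cluster, and topology features into the next coordinate.
        self.head = nn.Linear(hidden + latent + hidden, coord_dim)

    def forward(self, traj, node_feats, adj, node_idx):
        # traj: (B, T, coord_dim)  observed trajectory
        # node_feats: (N, hidden)  road-network node features
        # adj: (N, N)              normalized adjacency of the road graph
        # node_idx: (B,)           current road-network node of each object
        h = self.seq_encoder(self.embed(traj))              # (B, T, hidden)
        summary = h[:, -1]                                   # last step as summary

        mu, logvar = self.mu(summary), self.logvar(summary)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization

        graph_h = torch.relu(adj @ self.gnn_weight(node_feats))  # (N, hidden)
        topo = graph_h[node_idx]                                  # (B, hidden)

        return self.head(torch.cat([summary, z, topo], dim=-1))  # next coordinate


# Example with made-up shapes: 8 objects, 12 observed steps, a 50-node road graph.
# model = TrajectoryPredictor()
# pred = model(torch.randn(8, 12, 2), torch.randn(50, 64),
#              torch.eye(50), torch.zeros(8, dtype=torch.long))
```

In this sketch the VAE latent code and the GNN node embedding are simply concatenated with the sequence summary before a linear head; the paper's actual fusion, losses, and multi-step decoding are not specified in the abstract and are not reproduced here.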