Multitask Hypergraph Convolutional Networks: A Heterogeneous Traffic Prediction Framework

Bibliographic details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2022-10, Vol. 23 (10), pp. 18557-18567
Main authors: Wang, Jingcheng; Zhang, Yong; Wang, Lixun; Hu, Yongli; Piao, Xinglin; Yin, Baocai
Format: Article
Language: English
Description
Summary: Traffic prediction methods based on single-source data have achieved excellent results in recent years, especially Graph Convolutional Network (GCN) based models that capture spatio-temporal dependencies. In reality, various modes of urban transportation operate simultaneously; they influence and complement each other in shared space-time settings and together constitute a dynamic transportation system. Traffic data from multiple sources is therefore ostensibly heterogeneous but internally correlated. Typical models driven by a single data source, however, are not universally applicable to heterogeneous traffic data. To address this issue, we propose a Multi-task Hypergraph Convolutional Neural Network (MT-HGCN) for the multi-source traffic prediction problem. The framework consists of a main task and a related task, both based on Hypergraph Convolutional Neural Networks (HGCN) and devoted to two prediction problems. The tasks are bridged by a feature compress unit, which models their correlation and shares latent features to improve the performance of the main task. Node-level forecasting has been evaluated on historical datasets of Beijing to verify the effectiveness of the proposed method. Compared with state-of-the-art methods, the proposed method achieves superior performance.
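
The abstract outlines the architecture only at a high level, so the following is a minimal PyTorch sketch rather than the authors' implementation. The HypergraphConv layer follows the widely used HGNN formulation X' = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} X Theta (with W = I); the multi-task wiring in MTHGCNSketch, including the branch sizes, the "compress" bottleneck, and the concatenation into the main branch, is an assumption intended only to illustrate how a related task could share latent features with a main task. All class, function, and parameter names here are hypothetical.

import torch
import torch.nn as nn


class HypergraphConv(nn.Module):
    # HGNN-style hypergraph convolution:
    # X' = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} X Theta, with W = I here.
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, incidence):
        # x: (num_nodes, in_dim); incidence: binary (num_nodes, num_hyperedges)
        d_v = incidence.sum(dim=1).clamp(min=1.0)             # node degrees
        d_e = incidence.sum(dim=0).clamp(min=1.0)             # hyperedge degrees
        h_norm = incidence / d_v.sqrt().unsqueeze(1)          # D_v^{-1/2} H
        out = (h_norm / d_e) @ (h_norm.t() @ self.theta(x))   # node -> hyperedge -> node
        return torch.relu(out)


class MTHGCNSketch(nn.Module):
    # Hypothetical two-task layout: one HGCN branch per task; the related task's
    # latent features pass through a small "feature compress" bottleneck and are
    # concatenated into the main task's representation before prediction.
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.main_conv = HypergraphConv(in_dim, hidden_dim)
        self.aux_conv = HypergraphConv(in_dim, hidden_dim)
        self.compress = nn.Linear(hidden_dim, hidden_dim // 2)            # assumed bottleneck
        self.main_head = nn.Linear(hidden_dim + hidden_dim // 2, out_dim)
        self.aux_head = nn.Linear(hidden_dim, out_dim)

    def forward(self, x_main, x_aux, h_main, h_aux):
        z_main = self.main_conv(x_main, h_main)
        z_aux = self.aux_conv(x_aux, h_aux)
        shared = self.compress(z_aux)                          # latent feature shared with main task
        y_main = self.main_head(torch.cat([z_main, shared], dim=-1))
        y_aux = self.aux_head(z_aux)
        return y_main, y_aux


# Toy usage: 8 nodes, 4 hyperedges, 12 input features per node, 3-step horizon.
x_main = torch.randn(8, 12)
x_aux = torch.randn(8, 12)
h = (torch.rand(8, 4) > 0.5).float()
model = MTHGCNSketch(in_dim=12, hidden_dim=16, out_dim=3)
y_main, y_aux = model(x_main, x_aux, h, h)
print(y_main.shape, y_aux.shape)   # torch.Size([8, 3]) torch.Size([8, 3])

In this sketch only the main head sees the compressed auxiliary features, matching the abstract's statement that the shared latent feature is meant to improve the main task; how the paper actually fuses the two branches is not specified here.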
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2022.3168879