MVCV-Traffic: multiview road traffic state estimation via cross-view learning

Bibliographic Details
Published in: International journal of geographical information science : IJGIS, 2023-10, Vol. 37 (10), p. 2205-2237
Main authors: Deng, Min; Chen, Kaiqi; Lei, Kaiyuan; Chen, Yuanfang; Shi, Yan
Format: Article
Language: English
Online access: Full text
Description
Abstract: Fine-grained urban traffic data are often incomplete owing to limitations in sensor technology and economic cost. However, data-driven traffic analysis methods in intelligent transportation systems (ITSs) heavily rely on the quality of input data. Thus, accurately estimating missing traffic observations is an essential data engineering task in ITSs. The complexity of underlying node-wise correlation structures and various missing scenarios presents a significant challenge in achieving high-precision estimation. This study proposes a novel multiview neural network termed MVCV-Traffic, equipped with a cross-view learning mechanism, to improve traffic estimation. The contributions of this model can be summarized in two parts: multiview learning and cross-view fusing. For multiview learning, several specialized neural networks are adopted to fit diverse correlation structures from different views. For cross-view fusing, a new information fusion strategy merges multiview messages at both the feature and output levels to enhance the learning of joint correlations. Experiments on two real-world datasets demonstrate that the proposed model significantly outperforms existing traffic speed estimation methods across different types and rates of missing data.
ISSN: 1365-8816, 1362-3087, 1365-8824
DOI: 10.1080/13658816.2023.2249968