Prediction of Queue Dissipation Time for Mixed Traffic Flows With Deep Learning
Published in: IEEE Open Journal of Intelligent Transportation Systems, 2022, Vol. 3, p. 267-277
Authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Queue dissipation has been extensively studied in the context of traffic signalization, work zone operations, and ramp metering. Various methods for estimating an intersection's queue length and dissipation time have been reported in the literature, including car-following models with simulation, vehicle trajectories from GPS, shock-wave theory, statistical estimation from traffic flow patterns, and artificial neural networks (ANN). However, most such methods cannot account for the impacts of interactions between different vehicle types, and of their spatial distributions within the queue, on the initial discharge time and the resulting total dissipation duration. As such, this study presents a system, named TrafficTalk, that applies a deep learning-based method to reliably capture the queue characteristics of mixed traffic flows and produce a robust estimate of the dissipation duration for the design of an optimal signal plan. The proposed TrafficTalk, which transforms video-imaged traffic conditions into vehicle density maps, has demonstrated its performance in extensive field evaluations. For instance, compared with XGBoost, the benchmark model from the literature, it reduced the MAPE from 25.8% to 10.4%, and from 31.3% to 10.4% when the queue discharging stream comprises motorcycles.
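The sketch below is only an illustration of two concepts named in the abstract, not the authors' TrafficTalk implementation: a simplified one-dimensional vehicle density map built from hypothetical per-lane detections, and the MAPE metric used for the reported comparison against XGBoost. All function names, inputs, and numbers are assumptions made for the example.

```python
# Illustrative sketch only -- assumed inputs and a simplified 1-D density map,
# not the video-based method described in the paper.
import numpy as np

def vehicle_density_map(detections, lane_length_m=100.0, n_cells=20,
                        types=("car", "motorcycle")):
    """Rasterize detected vehicles into a per-type count map along the lane.

    `detections` is a list of (position_m, vehicle_type) tuples; the result is
    a (len(types), n_cells) array of vehicle counts per cell.
    """
    density = np.zeros((len(types), n_cells))
    cell_size = lane_length_m / n_cells
    for pos, vtype in detections:
        if vtype in types and 0.0 <= pos < lane_length_m:
            density[types.index(vtype), int(pos // cell_size)] += 1
    return density

def mape(y_true, y_pred):
    """Mean absolute percentage error, the metric reported in the abstract."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Toy usage with made-up dissipation times (seconds).
observed = [32.0, 41.5, 28.0, 55.0]
predicted = [30.1, 44.0, 29.5, 52.3]
print(f"MAPE: {mape(observed, predicted):.1f}%")

queue = [(2.0, "car"), (4.5, "motorcycle"), (9.0, "car"), (12.5, "car")]
print(vehicle_density_map(queue).sum(axis=1))  # total vehicles per type
```

In the paper's setting, such density maps would be derived from video imagery and distinguish vehicle types and their positions in the queue, which is the information a dissipation-time model needs to capture mixed-traffic effects.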
ISSN: 2687-7813
DOI: 10.1109/OJITS.2022.3162526