Forecasting Pedestrian Movements Using Recurrent Neural Networks: An Application of Crowd Monitoring Data
Saved in:
Published in: | Sensors (Basel, Switzerland), 2019-01, Vol. 19 (2), p. 382 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Full Text |
Summary: | Currently, effective crowd management based on the information provided by crowd monitoring systems is difficult, as this information only arrives once adverse crowd movements are already occurring. To date, very few forecasting techniques have been developed that predict crowd flows over a longer time period ahead. Moreover, most contemporary state estimation methods require demanding pre-processing steps, such as map-matching. The objective of this paper is to design, train, and benchmark a data-driven procedure that forecasts crowd movements in real time. This procedure entails two steps. The first step comprises a cell sequence derivation method that represents spatially continuous GPS traces as discrete cell sequences. The second step entails the training of a Recurrent Neural Network (RNN) with a Gated Recurrent Unit (GRU), alongside six benchmark models, to forecast the next location of pedestrians. The RNN-GRU is found to outperform the other tested models. Additional tests show that the RNN-GRU preserves its predictive power when only a limited amount of data from the first few hours of a multi-day event is used and when temporal information is incorporated in the cell sequences. |
---|---|
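The first step of the procedure described in the abstract, deriving discrete cell sequences from continuous GPS traces, can be sketched roughly as follows. This is a minimal illustration assuming a uniform square grid and row-major cell IDs; the function name, parameters, and grid layout are all illustrative assumptions, and the paper's actual derivation method may differ.

```python
def derive_cell_sequence(trace, origin, cell_size, n_cols):
    """Map a continuous GPS trace (list of (x, y) metric coordinates)
    to a sequence of discrete grid-cell IDs, collapsing consecutive
    repeats so the sequence records cell transitions only.

    Hypothetical sketch: assumes a uniform square grid anchored at
    `origin` with `n_cols` columns and row-major cell numbering.
    """
    ox, oy = origin
    sequence = []
    for x, y in trace:
        col = int((x - ox) // cell_size)   # column index on the grid
        row = int((y - oy) // cell_size)   # row index on the grid
        cell = row * n_cols + col          # flatten the 2-D index to one ID
        if not sequence or sequence[-1] != cell:
            sequence.append(cell)          # keep transitions, drop repeats
    return sequence


# Example: four GPS fixes on a 1 m grid with 10 columns; the middle two
# fixes fall in the same cell, so the sequence collapses them.
trace = [(0.5, 0.5), (1.2, 0.4), (1.8, 0.6), (2.5, 1.6)]
print(derive_cell_sequence(trace, (0.0, 0.0), 1.0, 10))  # → [0, 1, 12]
```

The resulting integer sequences are what a next-location model such as the paper's RNN-GRU would then be trained on, treating each cell ID as a token in a sequence-prediction task.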
ISSN: | 1424-8220 |
DOI: | 10.3390/s19020382 |