TRec: an efficient recommendation system for hunting passengers with deep neural networks
Published in: Neural computing & applications, 2019-01, Vol. 31 (Suppl 1), p. 209-222
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Summary: Discovering hidden knowledge patterns in trajectory data can help taxi drivers hunt for passengers, an important issue in the intelligent transportation domain. However, existing approaches are inaccurate in real applications. In this paper, using large-scale GPS trajectory data from taxis, we present an efficient and effective recommendation system (TRec) for hunting passengers with deep neural structures. The system is based on the wide & deep model, which jointly trains a wide linear model and a deep neural network and thereby combines the benefits of memorization and generalization. To further improve accuracy, the system uses experienced taxi drivers as learning objects and simultaneously considers passenger-hunting prediction, road-condition prediction, and earnings evaluation. A performance study on a real GPS trajectory dataset shows that the proposed system is both efficient and effective. This work takes a first step toward building a passenger-hunting recommendation system based on the wide & deep model.
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-018-3728-2
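
The summary describes TRec as built on the wide & deep architecture, in which a wide linear component (memorization of sparse feature crosses) and a deep feed-forward component (generalization over dense features) are trained jointly under a single loss. The record does not give the paper's actual features or dimensions, so the sketch below is a minimal, generic wide & deep binary classifier in Keras; all feature names and sizes are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of a jointly trained wide & deep binary classifier
# (Cheng et al., 2016), the architecture the abstract says TRec builds on.
# Feature names and dimensions are illustrative assumptions only.
import tensorflow as tf

N_WIDE = 100  # assumed: sparse cross-product features (memorization)
N_DEEP = 32   # assumed: dense trajectory features (generalization)

wide_in = tf.keras.Input(shape=(N_WIDE,), name="wide_features")
deep_in = tf.keras.Input(shape=(N_DEEP,), name="deep_features")

# Deep component: a small feed-forward stack over the dense features.
h = tf.keras.layers.Dense(64, activation="relu")(deep_in)
h = tf.keras.layers.Dense(32, activation="relu")(h)

# Wide and deep parts feed one logistic output unit, so both are
# trained together under a single binary cross-entropy loss.
merged = tf.keras.layers.concatenate([wide_in, h])
pickup_prob = tf.keras.layers.Dense(
    1, activation="sigmoid", name="pickup_prob")(merged)

model = tf.keras.Model(inputs=[wide_in, deep_in], outputs=pickup_prob)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

In the original wide & deep formulation the wide part is optimized with FTRL (with L1 regularization) and the deep part with AdaGrad; a single Adam optimizer is used here only to keep the sketch short.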