CEM: A Convolutional Embedding Model for Predicting Next Locations
Published in: IEEE Transactions on Intelligent Transportation Systems, June 2021, Vol. 22, No. 6, pp. 3349-3358
Format: Article
Language: English
Abstract: The widespread use of positioning devices and cameras has given rise to a deluge of trajectory data (e.g., vehicle passage records and check-in data), offering great opportunities for location prediction. One problem that has received much attention recently is predicting the next location of an object given its previous locations. Several location prediction methods based on embedding learning have been proposed to tackle this issue. They usually focus on check-in trajectories and model sequential locations as an average of their embedding vectors. In this paper, we propose a Convolutional Embedding Model (CEM) that predicts next locations from traffic trajectory data by modeling the relative ordering of locations with a one-dimensional convolution. CEM is further augmented by exploiting the constraints posed by road networks in traffic trajectory data: it learns a double-prototype representation for each location to eliminate infeasible location transitions, and it models the combination of factors (sequential, personal, and temporal) that shape human mobility patterns, thus offering more accurate predictions than accounting for sequential patterns alone. Experimental results on two real-world trajectory datasets show that CEM is effective and outperforms state-of-the-art methods.
ISSN: 1524-9050, 1558-0016
DOI: 10.1109/TITS.2020.2983647
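
The abstract sketches the core idea: embed each location, capture the relative ordering of recently visited locations with a one-dimensional convolution, and combine that sequential signal with personal and temporal factors before scoring candidate next locations against a second, output-side representation per location. The PyTorch sketch below illustrates that structure only; the class name, layer sizes, kernel width, and the additive way the factors are combined are assumptions for illustration, not the authors' implementation, and the road-network constraint on admissible transitions is omitted.

```python
import torch
import torch.nn as nn

class ConvNextLocationModel(nn.Module):
    """Illustrative sketch of a convolutional embedding model for next-location
    prediction (hypothetical names and sizes, not the paper's released code)."""

    def __init__(self, num_locations, num_users, num_time_slots,
                 embed_dim=64, kernel_size=3):
        super().__init__()
        # Input-side location, user, and time-slot embeddings.
        self.loc_embed = nn.Embedding(num_locations, embed_dim)
        self.user_embed = nn.Embedding(num_users, embed_dim)
        self.time_embed = nn.Embedding(num_time_slots, embed_dim)
        # 1-D convolution over the embedded location sequence models the
        # relative ordering of recent locations (instead of averaging them).
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size,
                              padding=kernel_size // 2)
        # Output-side ("target") embedding per location, echoing the
        # double-prototype idea of distinct input and output representations.
        self.target_embed = nn.Embedding(num_locations, embed_dim)

    def forward(self, loc_seq, user_id, time_slot):
        # loc_seq: (batch, seq_len) ids of recently visited locations.
        x = self.loc_embed(loc_seq).transpose(1, 2)   # (batch, dim, seq_len)
        seq_repr = self.conv(x).mean(dim=2)           # (batch, dim)
        # Combine sequential, personal, and temporal factors (here: a sum).
        context = seq_repr + self.user_embed(user_id) + self.time_embed(time_slot)
        # Score every candidate next location against the context vector.
        return context @ self.target_embed.weight.t() # (batch, num_locations)
```

Scoring against a separate target embedding is one simple way to give each location distinct input- and output-side prototypes; in the paper this mechanism is additionally used to rule out transitions that the road network does not allow.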