State Transition Graph-Based Spatial-Temporal Attention Network for Cell-Level Mobile Traffic Prediction
Published in: Sensors (Basel, Switzerland), 2023-11, Vol. 23 (23), p. 9308
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Mobile traffic prediction enables the efficient utilization of network resources and enhances user experience. In this paper, we propose a state transition graph-based spatial-temporal attention network (STG-STAN) for cell-level mobile traffic prediction, which is designed to exploit the underlying spatial-temporal dynamic information hidden in the historical mobile traffic data. Specifically, we first identify the semantic context information over different segments of the historical data by constructing the state transition graphs, which may reveal different patterns of random fluctuation. Then, based on the state transition graphs, a spatial attention extraction module using graph convolutional networks (GCNs) is designed to aggregate the spatial information of different nodes in the state transition graph. Moreover, a temporal extraction module is employed to capture the dynamic evolution and temporal correlation of the state transition graphs over time. Such a spatial-temporal attention network can be further integrated with a parallel long short-term memory (LSTM) module to improve the accuracy of mobile traffic prediction. Extensive experiments demonstrate that STG-STAN can better exploit the spatial-temporal information hidden in the state transition graphs, achieving superior performance compared with several baselines. (A sketch of this architecture follows the record below.)
ISSN: 1424-8220
DOI: 10.3390/s23239308
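
The following is a minimal sketch, in PyTorch, of the kind of architecture the abstract describes: a graph convolution over a state transition graph, attention over graph nodes (spatial) and over graph snapshots (temporal), and a parallel LSTM over the raw traffic series whose output is fused with the graph branch. All class names, layer sizes, the normalized-adjacency graph convolution, and the concatenation-based fusion are illustrative assumptions based only on the abstract, not the paper's exact design.

```python
# Hypothetical STG-STAN-style sketch; details are assumptions, not the published model.
import torch
import torch.nn as nn


class GraphConv(nn.Module):
    """Simple graph convolution: aggregate node features through a
    row-normalized adjacency matrix of the state transition graph."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (batch, num_nodes, in_dim); adj: (num_nodes, num_nodes), row-normalized
        return torch.relu(self.linear(adj @ x))


class STGSTANSketch(nn.Module):
    """GCN-based spatial attention over state transition graph nodes,
    temporal attention over graph snapshots, and a parallel LSTM branch."""
    def __init__(self, node_dim, hidden_dim):
        super().__init__()
        self.gcn = GraphConv(node_dim, hidden_dim)
        self.spatial_attn = nn.MultiheadAttention(hidden_dim, num_heads=2, batch_first=True)
        self.temporal_attn = nn.MultiheadAttention(hidden_dim, num_heads=2, batch_first=True)
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_dim, batch_first=True)
        self.head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, graph_feats, adj, traffic_series):
        # graph_feats: (batch, T, num_nodes, node_dim) -- node features per graph snapshot
        # adj: (num_nodes, num_nodes) -- normalized state transition adjacency
        # traffic_series: (batch, T, 1) -- raw historical traffic of one cell
        b, t, n, d = graph_feats.shape
        h = self.gcn(graph_feats.reshape(b * t, n, d), adj)   # spatial aggregation
        h, _ = self.spatial_attn(h, h, h)                     # attention over nodes
        h = h.mean(dim=1).reshape(b, t, -1)                   # pool nodes -> (b, T, hidden)
        h, _ = self.temporal_attn(h, h, h)                    # attention over snapshots
        graph_repr = h[:, -1]                                 # last-step graph summary
        lstm_out, _ = self.lstm(traffic_series)               # parallel LSTM branch
        fused = torch.cat([graph_repr, lstm_out[:, -1]], dim=-1)
        return self.head(fused)                               # next-step traffic estimate


# Example with illustrative shapes: batch of 2 cells, 12 time steps,
# 8 state-graph nodes with 4-dim features, hidden size 16.
model = STGSTANSketch(node_dim=4, hidden_dim=16)
graph_feats = torch.randn(2, 12, 8, 4)
adj = torch.softmax(torch.randn(8, 8), dim=-1)   # stand-in for a normalized transition matrix
traffic = torch.randn(2, 12, 1)
print(model(graph_feats, adj, traffic).shape)    # torch.Size([2, 1])
```

How the state transition graphs are built from historical traffic segments, and how the attention and LSTM outputs are actually combined, are specified in the paper itself; the concatenation-plus-linear head above is only one plausible choice.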