Deep Stacked Autoencoder Based Long-Term Spectrum Prediction Using Real-World Data
Published in: IEEE Transactions on Cognitive Communications and Networking, 2023-06, Vol. 9 (3), p. 1-1
Format: Article
Language: English
Online access: Order full text
Abstract: Spectrum prediction is challenging due to the multi-dimensionality, complex inherent dependencies, and heterogeneity of spectrum data. In this paper, we first propose a stacked autoencoder (SAE) and bi-directional long short-term memory (Bi-LSTM) based spectrum prediction method (SAEL-SP). Specifically, an SAE is designed to extract the hidden features (semantic coding) of spectrum data in an unsupervised manner. The output of the SAE is then fed to a predictor (Bi-LSTM), which performs long-term prediction by learning from these hidden features. The main advantage of SAEL-SP is that the underlying features of spectrum data are extracted automatically, layer by layer, rather than designed manually. To further improve the prediction accuracy of SAEL-SP and achieve wider-bandwidth prediction, we propose an SAE-based spectrum prediction method using temporal-spectral-spatial features of the data (SAE-TSS). Unlike SAEL-SP, SAE-TSS takes its input in image format. Using features extracted from the time, frequency, and space dimensions, SAE-TSS achieves higher prediction accuracy than SAEL-SP. We use a real-world spectrum dataset to validate the effectiveness of the two prediction frameworks. Experimental results show that both SAEL-SP and SAE-TSS outperform existing spectrum prediction approaches.
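The abstract describes a two-stage pipeline: an SAE is first pretrained unsupervised to compress each spectrum snapshot into a hidden code, and a Bi-LSTM then forecasts future spectrum from a window of those codes. The sketch below illustrates this idea in PyTorch; all dimensions (`input_dim`, `code_dim`, `horizon`) and the frozen-encoder setup are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch of the SAEL-SP idea (SAE feature extractor + Bi-LSTM
# predictor). Dimensions are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

class StackedAutoencoder(nn.Module):
    """Unsupervised feature extractor: encodes a spectrum snapshot
    into a lower-dimensional hidden code, layer by layer."""
    def __init__(self, input_dim=256, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, code_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 64), nn.ReLU(),
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code  # reconstruction loss uses the first output

class BiLSTMPredictor(nn.Module):
    """Long-term predictor operating on the SAE codes of a history window."""
    def __init__(self, code_dim=32, hidden=64, horizon=24, input_dim=256):
        super().__init__()
        self.lstm = nn.LSTM(code_dim, hidden, batch_first=True,
                            bidirectional=True)
        self.head = nn.Linear(2 * hidden, horizon * input_dim)
        self.horizon, self.input_dim = horizon, input_dim

    def forward(self, codes):             # codes: (batch, seq, code_dim)
        out, _ = self.lstm(codes)
        y = self.head(out[:, -1])         # last step summarizes the window
        return y.view(-1, self.horizon, self.input_dim)

# Stage 1 (not shown): pretrain the SAE on a reconstruction loss.
# Stage 2: encode the history window and train the Bi-LSTM on the codes.
sae = StackedAutoencoder()
predictor = BiLSTMPredictor()
history = torch.randn(8, 48, 256)         # (batch, past steps, channels)
with torch.no_grad():                      # encoder frozen here, only to show data flow
    codes = sae.encoder(history)
forecast = predictor(codes)                # (8, horizon, channels)
```

Splitting training into an unsupervised reconstruction stage and a supervised forecasting stage mirrors the abstract's claim that features are learned layer by layer rather than designed manually; SAE-TSS would differ mainly in feeding image-format temporal-spectral-spatial inputs to the encoder.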
ISSN: 2332-7731
DOI: 10.1109/TCCN.2023.3254524