Robust Automatic Modulation Classification in Low Signal to Noise Ratio
| Published in: | IEEE Access, 2023, Vol. 11, pp. 7860-7872 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Full text |
Abstract: In a non-cooperative communication environment, automatic modulation classification (AMC) is an essential technology for analyzing signals and classifying different kinds of signal modulation before the signals are demodulated. Deep learning (DL)-based AMC has been proposed as an efficient way to achieve high classification performance. However, most current DL-AMC methods generalize poorly under varying noise conditions, especially at low signal-to-noise ratios (SNRs), and therefore cannot be applied directly in practical systems. In this paper, we propose a threshold autoencoder denoiser convolutional neural network (TADCNN), which consists of a threshold autoencoder denoiser (TAD) and a convolutional neural network (CNN). The TAD reduces the noise power and cleans the input signal, which is then passed to the CNN for classification. The TAD network consists of three components: a batch normalization layer, an autoencoder, and a threshold denoising module. The threshold denoising module uses an auto-learning threshold sub-network to compute thresholds automatically. According to our experiments, AMC with a TAD improves classification accuracy by 70% at low SNR compared with a model without a denoiser. Additionally, our model achieves an average accuracy of 66.64% on the RML2016.10A dataset, which is 6% to 18% higher than current AMC models.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3238995
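
The abstract sketches the TADCNN pipeline at a high level: a threshold autoencoder denoiser (batch normalization, a small autoencoder, and a soft-thresholding step whose threshold is produced by a learned sub-network) feeding a CNN classifier. The snippet below is a minimal sketch of that idea, assuming PyTorch and RML2016.10A-shaped I/Q frames (2 × 128 samples, 11 classes); the layer widths, kernel sizes, and the exact form of the learned threshold are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch (not the authors' exact architecture) of the TADCNN idea:
# a threshold autoencoder denoiser (TAD) that normalizes the I/Q input,
# encodes/decodes it with a small autoencoder, and applies soft-thresholding
# with a threshold learned by a sub-network, followed by a CNN classifier.
import torch
import torch.nn as nn


class ThresholdAutoencoderDenoiser(nn.Module):
    def __init__(self, in_ch: int = 2, hidden: int = 32):
        super().__init__()
        self.bn = nn.BatchNorm1d(in_ch)                       # batch normalization stage
        self.encoder = nn.Sequential(                         # small conv autoencoder (assumed sizes)
            nn.Conv1d(in_ch, hidden, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Conv1d(hidden, in_ch, kernel_size=3, padding=1)
        self.threshold_net = nn.Sequential(                   # auto-learning threshold sub-network
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),                # per-frame scale in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.bn(x)
        z = self.encoder(x)
        # Threshold = learned fraction of the mean feature magnitude of this frame.
        tau = self.threshold_net(z) * z.abs().mean(dim=(1, 2)).unsqueeze(1)
        tau = tau.unsqueeze(-1)                                # (B, 1, 1) for broadcasting
        z = torch.sign(z) * torch.clamp(z.abs() - tau, min=0.0)  # soft-thresholding (shrinkage)
        return self.decoder(z)                                  # denoised I/Q estimate


class TADCNN(nn.Module):
    def __init__(self, num_classes: int = 11):
        super().__init__()
        self.tad = ThresholdAutoencoderDenoiser()
        self.cnn = nn.Sequential(                               # plain CNN classifier head
            nn.Conv1d(2, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(64, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Flatten(), nn.Linear(64 * 32, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.cnn(self.tad(x))


if __name__ == "__main__":
    model = TADCNN()
    iq = torch.randn(8, 2, 128)        # batch of noisy I/Q frames (RML2016.10A shape)
    print(model(iq).shape)             # -> torch.Size([8, 11])
```

The point the sketch tries to capture is that the shrinkage threshold is not fixed by hand but is computed per input frame by a small sub-network from the encoded features, which is what allows the denoiser to adapt to different noise levels before classification.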