A lightweight and efficient neural network for modulation recognition


Bibliographic Details
Published in: Digital Signal Processing, 2022-04, Vol. 123, p. 103444, Article 103444
Authors: Shi, Fengyuan; Yue, Chunsheng; Han, Chao
Format: Article
Language: English
Online access: Full text
Description
Abstract: This paper proposes a lightweight neural network for automatic modulation recognition, named "combined", comprising a lightweight backbone network and an optional network. The backbone network recognizes 24 types of signal modulation and relies on a multi-stage process that applies multiple convolution kernels of different characteristics in the depth direction to generate spliced feature maps. The network parameters are reduced by adjusting the number of convolution kernels and by superimposing large and small convolution kernels after adjusting the feature-map size through up- and downsampling. The optional network effectively increases the recognition accuracy for amplitude modulation-single-sideband-suppressed carrier (AM-SSB-SC) and amplitude modulation-single-sideband-with carrier (AM-SSB-WC) signals, compensating for the backbone network's low recognition accuracy on AM-SSB-SC. Trials on the DeepSig dataset demonstrate a recognition accuracy of 98.9% at 16 dB signal-to-noise ratio (SNR) for the combined network and 98.2% at 26 dB SNR for the backbone network alone, the highest combined-network and single-model accuracies reported to date. Experiments with the backbone network on the HisarMod dataset show a recognition accuracy of 99.8% across five modulation families while requiring the fewest trainable parameters among similar networks.
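As a rough illustration of the multi-kernel idea summarized above, the minimal PyTorch sketch below runs parallel 1-D convolutions with different kernel sizes over an I/Q signal and concatenates ("splices") their feature maps along the channel (depth) axis. The kernel sizes, channel widths, 1x1 fusion layer, and input shape are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch of a multi-kernel "splicing" block (assumptions, not the paper's code):
# parallel branches with different kernel sizes, outputs concatenated along channels.
import torch
import torch.nn as nn

class MultiKernelBlock(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes; feature maps are
    concatenated along the channel dimension, then fused by a 1x1 convolution
    to keep the parameter count small."""
    def __init__(self, in_ch: int, branch_ch: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(in_ch, branch_ch, k, padding=k // 2, bias=False),
                nn.BatchNorm1d(branch_ch),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])
        self.fuse = nn.Conv1d(branch_ch * len(kernel_sizes), branch_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        spliced = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.fuse(spliced)

# Example input: a batch of I/Q signals shaped (batch, 2 channels, 1024 samples),
# a format commonly used with the DeepSig RadioML datasets (assumed here).
x = torch.randn(8, 2, 1024)
block = MultiKernelBlock(in_ch=2, branch_ch=32)
print(block(x).shape)  # torch.Size([8, 32, 1024])
```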
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2022.103444