Window Attention Convolution Network (WACN): A Local Self-Attention Automatic Modulation Recognition Method

Bibliographic Details
Published in: IEEE Transactions on Cognitive Communications and Networking, 2024-09, pp. 1-1
Main Authors: Feng, Yuan; Peng, Kexiao; Wei, Jiaolong; Tang, Zuping
Format: Article
Language: English
Description
Abstract: In addressing the limitations of the self-attention mechanism, particularly in handling local features and channel adaptivity for modulation recognition tasks, this paper presents the Window Attention Convolution Network (WACN). The proposed approach partitions the input In-phase and Quadrature (IQ) signals into multiple local windows. Within these windows, an attention computation constructed from depthwise convolution, depthwise dilated convolution, and pointwise convolution analyzes local features and extracts the important information. Comprehensive evaluations on four publicly available datasets demonstrate the superiority of WACN in classification accuracy compared with seven other state-of-the-art models based on CNN, RNN, Transformer, and hybrid networks. Notably, on the HisarMod2019.1 dataset, WACN's recognition accuracy exceeds 85% at the lowest SNR, and its overall recognition accuracy exceeds 95%. At 0 dB, WACN significantly improves accuracy over traditional modulation recognition methods. The implementation details and source code are available at github1.
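
The abstract describes the mechanism but not its implementation; the following is a minimal PyTorch sketch of how such window-partitioned convolutional attention might look. All names and hyperparameters here (WindowConvAttention, window_size, the 5-tap kernels, the multiplicative gating, the stem convolution) are illustrative assumptions, not the authors' code; the paper's actual implementation is available via the source-code link it cites.

# Minimal sketch (assumptions noted above), not the authors' implementation.
import torch
import torch.nn as nn


class WindowConvAttention(nn.Module):
    """Attention over local windows of an IQ feature sequence, built from a
    depthwise conv, a depthwise dilated conv, and a pointwise conv."""

    def __init__(self, channels: int = 64, window_size: int = 32, dilation: int = 3):
        super().__init__()
        self.window_size = window_size
        # Depthwise conv captures fine-grained local structure per channel.
        self.dw = nn.Conv1d(channels, channels, kernel_size=5,
                            padding=2, groups=channels)
        # Depthwise dilated conv enlarges the receptive field within a window.
        self.dw_dilated = nn.Conv1d(channels, channels, kernel_size=5,
                                    padding=2 * dilation, dilation=dilation,
                                    groups=channels)
        # Pointwise (1x1) conv mixes channels, giving channel adaptivity.
        self.pw = nn.Conv1d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length); length assumed divisible by window_size.
        b, c, length = x.shape
        n = length // self.window_size
        # Partition the sequence into independent local windows.
        windows = x.reshape(b, c, n, self.window_size)
        windows = windows.permute(0, 2, 1, 3).reshape(b * n, c, self.window_size)
        # Convolutional attention map, applied multiplicatively as a gate.
        attn = self.pw(self.dw_dilated(self.dw(windows)))
        out = attn * windows
        # Undo the window partition.
        out = out.reshape(b, n, c, self.window_size).permute(0, 2, 1, 3)
        return out.reshape(b, c, length)


if __name__ == "__main__":
    # Raw IQ input: 2 channels (I and Q), 1024 samples; a hypothetical stem
    # conv lifts it to the attention module's channel width.
    stem = nn.Conv1d(2, 64, kernel_size=3, padding=1)
    attn = WindowConvAttention(channels=64, window_size=32)
    iq = torch.randn(8, 2, 1024)
    print(attn(stem(iq)).shape)  # torch.Size([8, 64, 1024])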
ISSN: 2332-7731
DOI: 10.1109/TCCN.2024.3462905