Frequency-hopping transmitter fingerprint feature recognition with kernel projection and joint representation

Bibliographic Details
Published in: Frontiers of Information Technology & Electronic Engineering, 2019-08, Vol. 20 (8), p. 1133-1146
Main authors: Sui, Ping; Guo, Ying; Zang, Kun-feng; Li, Hong-guang
Format: Article
Language: English
Online access: Full text
Description
Abstract: Frequency hopping (FH) is one of the most commonly used spread-spectrum techniques, finding wide application in communication and radar systems because of its inherently low probability of interception, good confidentiality, and strong anti-interference capability. However, non-cooperative FH transmitter classification is a significant and challenging issue for FH transmitter fingerprint feature recognition, since the FH signal is not only sensitive to noise but also has non-linear, non-Gaussian, and non-stationary characteristics, which make it difficult to guarantee reliable classification in the original signal space. Some existing classifiers, such as the sparse representation classifier (SRC), generally use an individual representation rather than all the samples to classify the test data, which over-emphasizes sparsity but ignores the collaborative relationship among the given set of samples. To address these problems, we propose a novel classifier, called the kernel joint representation classifier (KJRC), for FH transmitter fingerprint feature recognition, by integrating kernel projection, collaborative feature representation, and classifier learning into a joint framework. Extensive experiments on real-world FH signals demonstrate the effectiveness of the proposed method in comparison with several state-of-the-art recognition methods.
ISSN: 2095-9184, 2095-9230
DOI: 10.1631/FITEE.1800025
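
The abstract contrasts the sparse representation classifier (SRC), which represents a test sample with only a few training samples, against a joint/collaborative representation computed over all training samples in a kernel-induced feature space. The sketch below illustrates that general kernel collaborative-representation idea only; it is not the authors' exact KJRC, and the RBF kernel, the ridge regularization, and all parameter values are assumptions chosen for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix between the rows of X and the rows of Y (assumed kernel choice)."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

class KernelCRC:
    """Minimal kernel collaborative-representation classifier (illustrative sketch).

    A test sample is represented over ALL training samples in the kernel
    feature space via ridge-regularized least squares, then assigned to the
    class whose samples give the smallest reconstruction residual.
    """

    def __init__(self, gamma=1.0, lam=1e-2):
        self.gamma = gamma   # kernel width (hypothetical value)
        self.lam = lam       # ridge regularization strength (hypothetical value)

    def fit(self, X, y):
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        self.K_ = rbf_kernel(self.X_, self.X_, self.gamma)        # n x n Gram matrix
        # Ridge solution operator: a = (K + lam*I)^{-1} k(x)
        self.solver_ = np.linalg.inv(self.K_ + self.lam * np.eye(len(self.K_)))
        return self

    def predict(self, X):
        k = rbf_kernel(self.X_, np.asarray(X, dtype=float), self.gamma)  # n x m
        A = self.solver_ @ k                                             # coefficients, n x m
        preds = []
        for j in range(A.shape[1]):
            best_cls, best_res = None, np.inf
            for c in np.unique(self.y_):
                # Keep only the coefficients belonging to class c.
                a_c = np.where(self.y_ == c, A[:, j], 0.0)
                # Kernel-space residual ||phi(x) - Phi a_c||^2, with the
                # class-independent k(x, x) term dropped.
                res = -2.0 * a_c @ k[:, j] + a_c @ self.K_ @ a_c
                if res < best_res:
                    best_cls, best_res = c, res
            preds.append(best_cls)
        return np.array(preds)
```

In the setting described by the abstract, the rows of X would be fingerprint feature vectors extracted from received FH signals, with one class per transmitter. A common refinement in collaborative-representation classifiers is to normalize each class residual by the norm of the class coefficients before comparing classes; that variant is omitted here for brevity.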