HRPnet: High-Dimensional Feature Mapping for Radar Space Target Recognition

Bibliographic Details
Published in: IEEE Sensors Journal, 2024-04, Vol. 24 (7), p. 1-1
Authors: Dong, Jian; She, Qingqing; Hou, Feifei
Format: Article
Language: English
Abstract: Deep learning has made significant progress in the field of radar space target recognition. However, deep neural networks require large amounts of data to train network parameters, which poses challenges for non-cooperative target recognition. It is therefore of practical significance to research fast and accurate target recognition methods that work with limited radar data. In this paper, we propose a novel radar space target recognition network based on high-dimensional feature maps, called HRPnet, which fully utilizes high-resolution range profile (HRRP), radar cross section (RCS), and polarization (POL) data obtained from the radar. First, a sparse autoencoder (SAE) is employed to conduct deep feature extraction on the three types of data. Second, the Gramian angular field (GAF) transformation is applied to obtain two-dimensional representations of the HRRP, RCS, and POL data. These two-dimensional maps are then integrated to construct high-dimensional feature maps. Third, a feature map convolutional neural network (FMCNN) is designed for high-dimensional feature map classification and target recognition. Experimental results indicate that the proposed HRPnet outperforms existing methods in terms of recognition accuracy and noise resistance, particularly in the case of limited sample size.
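To illustrate the second step, the following is a minimal sketch of the standard Gramian angular field (summation-field variant) applied to a 1-D radar sequence, with the three per-modality maps stacked into a multi-channel feature map. This is a generic GAF implementation under assumed min-max preprocessing; the paper's exact normalization, map resolution, and fusion scheme are not specified here, and the input arrays are random placeholders.

```python
import numpy as np

def gaf(x):
    """Gramian angular summation field (GASF) of a 1-D series.

    Generic sketch: rescale to [-1, 1], map to polar angles, and form
    the pairwise cosine-sum matrix cos(phi_i + phi_j).
    """
    x = np.asarray(x, dtype=float)
    # Min-max rescale to [-1, 1] so arccos is defined (assumed preprocessing).
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    phi = np.arccos(np.clip(x, -1.0, 1.0))
    # GASF[i, j] = cos(phi_i + phi_j); symmetric N x N image.
    return np.cos(phi[:, None] + phi[None, :])

# Placeholder stand-ins for HRRP, RCS, and POL sequences of length 64.
rng = np.random.default_rng(0)
hrrp, rcs, pol = (rng.random(64) for _ in range(3))

# Stack the three 2-D maps channel-wise into one high-dimensional
# feature map, e.g. as input to a CNN classifier.
fmap = np.stack([gaf(hrrp), gaf(rcs), gaf(pol)], axis=0)  # shape (3, 64, 64)
```

Encoding each modality as a GAF image preserves the temporal/angular ordering of the sequence in the image's row and column structure, which is what makes the stacked maps amenable to 2-D convolutional processing.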
ISSN: 1530-437X, 1558-1748
DOI: 10.1109/JSEN.2024.3361926