Enhanced Multi-Channel Feature Synthesis for Hand Gesture Recognition Based on CNN With a Channel and Spatial Attention Mechanism
Published in: IEEE Access, 2020, Vol. 8, pp. 144610-144620
Main authors: , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Millimeter-wave (MMW) radar hand gesture recognition technology is becoming important in many electronic device control applications. Currently, most existing approaches utilize the radial and micro-Doppler features from single-channel MMW radar, which ignores the differing importance of the information contained in the micro-Doppler feature background and target areas. In this paper, we propose an algorithm for hand gesture recognition that jointly uses multi-channel signatures. The algorithm blends the information of both micro-Doppler features and instantaneous angles (azimuth and elevation) to accomplish hand gesture recognition with a convolutional neural network (CNN). To achieve better feature fusion and make the CNN focus on the most important target signal regions while suppressing unnecessary noise areas, we apply channel and spatial attention-based feature refinement modules. We also employ gesture movement mechanism-based data augmentation for more effective training to alleviate potential overfitting. Extensive experiments demonstrate the effectiveness and superiority of the proposed algorithm. This method achieves a correct classification rate of 96.61%, approximately 5% higher than that of the single-channel-based recognition strategy as measured on MMW radar datasets.
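The abstract describes channel and spatial attention modules that refine the fused multi-channel feature maps, but this record does not give the architecture. The following is only a minimal NumPy sketch of the general CBAM-style idea (the learned MLP and convolution weights of such modules are omitted, so this is not the paper's actual module): each channel and each spatial location is rescaled by a gate in (0, 1) derived from pooled statistics.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(fmap):
    # fmap: (C, H, W). Global average pooling per channel yields one
    # scalar per channel, squashed to a gate in (0, 1) that rescales
    # that channel. (Real modules pass the pooled vector through a
    # learned MLP first; omitted here.)
    pooled = fmap.mean(axis=(1, 2))            # (C,)
    gate = sigmoid(pooled)                     # (C,)
    return fmap * gate[:, None, None]

def spatial_attention(fmap):
    # Pool across channels to a single (H, W) map, then gate each
    # spatial location, emphasizing target regions over background.
    pooled = fmap.mean(axis=0)                 # (H, W)
    gate = sigmoid(pooled)                     # (H, W)
    return fmap * gate[None, :, :]

# Toy input: three radar channels (e.g. micro-Doppler, azimuth, elevation).
x = np.random.randn(3, 8, 8)
# Channel attention first, then spatial attention, in CBAM order.
refined = spatial_attention(channel_attention(x))
```

Because both gates lie strictly in (0, 1), the refinement only attenuates features; the relative weighting across channels and locations is what directs the CNN toward informative regions.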
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3010063