3D Graph Convolutional Feature Selection and Dense Pre-Estimation for Skeleton Action Recognition

Full Description

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 11733-11742
Main authors: Zhang, Junxian, Yang, Aiping, Miao, Changwu, Li, Xiang, Zhang, Rui, Thanh, Dang N. H.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Action recognition plays an important role in promoting various applications in healthcare and smart education. However, unclear target actions, similar actions, and occluded persons may be encountered in some special scenarios. To address these issues, a 3D Graph Convolutional Feature Selection and Dense Pre-estimation for Skeleton Action Recognition (3D-GSD) method is proposed to analyze and recognize the motion trajectories of the human skeleton. First, 3DSKNet is designed to adaptively learn and select important features in the skeleton sequence, identifying skeleton parts of differing importance more accurately according to the input image resolution. This helps the network focus on key skeletal parts, improving the accuracy and robustness of skeleton recognition. Then, the DensePose algorithm is used to detect the complex key points of the human body posture and to optimize the accuracy and interpretability of action recognition across different key points, key channels, and key frames of the action. The proposed method achieves the best performance on the NTU RGB+D 60, NTU RGB+D 120, and Kinetics Skeleton 400 datasets, with accuracy improvements of 0.02%, 0.06%, and 0.1%, respectively, over state-of-the-art methods.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3353622