Viewpoint-Agnostic Taekwondo Action Recognition Using Synthesized Two-Dimensional Skeletal Datasets


Full Description

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-09, Vol. 23 (19), p. 8049
Authors: Luo, Chenglong; Kim, Sung-Woo; Park, Hun-Young; Lim, Kiwon; Jung, Hoeryong
Format: Article
Language: English
Online access: Full text
Description
Abstract: Issues of fairness and consistency in Taekwondo poomsae evaluation have often occurred due to the lack of an objective evaluation method. This study proposes a three-dimensional (3D) convolutional neural network–based action recognition model for an objective evaluation of Taekwondo poomsae. The model exhibits robust recognition performance regardless of variations in viewpoint by reducing the discrepancy between the training and test images. It uses 3D skeletons of poomsae unit actions, collected with a full-body motion-capture suit, to generate synthesized two-dimensional (2D) skeletons from desired viewpoints. The 2D skeletons obtained from diverse viewpoints form the training dataset, on which the model is trained to ensure consistent recognition performance regardless of the viewpoint. The performance of the model was evaluated against various test datasets, including projected 2D skeletons and RGB images captured from diverse viewpoints. A comparison of the proposed model with previously reported action recognition models demonstrated its superiority, underscoring its effectiveness in recognizing and classifying Taekwondo poomsae actions.
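The viewpoint-synthesis step described in the abstract can be illustrated with a minimal sketch: rotating a motion-captured 3D skeleton about the vertical axis and then projecting it to the image plane yields one 2D skeleton per simulated camera viewpoint. This is only an assumption-laden sketch (the function names, the orthographic camera model, and the toy skeleton below are illustrative; the paper's actual projection pipeline is not reproduced here):

```python
import numpy as np

def rotate_y(points, angle_deg):
    """Rotate 3D joint positions (N, 3) about the vertical (y) axis."""
    t = np.radians(angle_deg)
    R = np.array([[np.cos(t), 0.0, np.sin(t)],
                  [0.0,       1.0, 0.0      ],
                  [-np.sin(t), 0.0, np.cos(t)]])
    return points @ R.T

def project_to_2d(points):
    """Orthographic projection: keep (x, y), drop depth (z)."""
    return points[:, :2]

def synthesize_views(skeleton, angles):
    """Generate one synthesized 2D skeleton per simulated viewpoint."""
    return [project_to_2d(rotate_y(skeleton, a)) for a in angles]

# Toy 3-joint skeleton (x, y, z) in metres -- purely illustrative.
skeleton = np.array([[ 0.0, 1.7, 0.0],   # head
                     [ 0.3, 1.4, 0.1],   # right shoulder
                     [-0.3, 1.4, 0.1]])  # left shoulder

# Eight camera viewpoints spaced 45 degrees apart around the performer.
views = synthesize_views(skeleton, angles=range(0, 360, 45))
```

Each element of `views` is an (N, 2) array of 2D joint coordinates; collecting such arrays over many unit actions and viewpoints would form a viewpoint-diverse training set of the kind the abstract describes.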
ISSN:1424-8220
DOI:10.3390/s23198049