Enhancing squat movement classification performance with a gated long-short term memory with transformer network model
Published in: Sports biomechanics 2024-02, p.1-16
Main authors: , , , , , , ,
Format: Article
Language: English
Online access: Full text
Abstract: The bodyweight squat is one of the basic sports training exercises. Automatic classification of aberrant squat movements can guide safe and effective bodyweight squat exercise in sports training. This study presents a novel gated long-short term memory with transformer network (GLTN) model for the classification of bodyweight squat movements. Twenty-two healthy young male participants took part in an experimental study in which they were instructed to perform bodyweight squats in nine different movement patterns, including one acceptable movement defined according to the National Strength and Conditioning Association and eight aberrant movements. Data were acquired from four customised inertial measurement units placed at the thorax, waist, right thigh, and right shank, with a sampling frequency of 200 Hz. The results show that, compared with state-of-the-art deep learning models, our model enhances squat movement classification performance, achieving 96.34% accuracy, 96.31% precision, 96.45% recall, and a 96.32% F-score. The proposed model provides a feasible wearable solution for monitoring aberrant squat movements that can facilitate performance and injury risk assessment during sports training. However, this model should not serve as a one-size-fits-all solution, and coaches and practitioners should consider an individual's specific needs and training goals when using it.
ISSN: 1476-3141, 1752-6116
DOI: 10.1080/14763141.2024.2315243
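
The abstract above describes a classifier that combines gated long short-term memory layers with a transformer network over multi-IMU time series. The record does not detail the GLTN architecture, so the sketch below is only an illustration of that general idea, assuming PyTorch, 24 input channels (accelerometer and gyroscope from four IMUs), two-second windows sampled at 200 Hz, and nine output classes; the gating scheme, layer sizes, and pooling choice are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of an LSTM + transformer classifier for IMU squat data.
# Channel count (4 IMUs x 6 channels), window length, and the gating scheme
# are illustrative assumptions; the paper's actual GLTN model is not shown here.
import torch
import torch.nn as nn


class SquatClassifier(nn.Module):
    def __init__(self, n_channels=24, hidden=64, n_classes=9, n_heads=4, n_layers=2):
        super().__init__()
        # Bidirectional LSTM encodes the raw IMU time series.
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        # A learned gate blends the LSTM features with a linear projection of
        # the raw input (a simple stand-in for a gating mechanism).
        self.proj = nn.Linear(n_channels, 2 * hidden)
        self.gate = nn.Linear(4 * hidden, 2 * hidden)
        # Transformer encoder captures longer-range temporal dependencies.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=2 * hidden, nhead=n_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, channels)
        h, _ = self.lstm(x)                     # (batch, time, 2*hidden)
        p = self.proj(x)                        # (batch, time, 2*hidden)
        g = torch.sigmoid(self.gate(torch.cat([h, p], dim=-1)))
        z = g * h + (1 - g) * p                 # gated fusion of the two streams
        z = self.transformer(z)
        return self.head(z.mean(dim=1))         # mean-pool over time, then classify


# Example: a batch of 8 two-second windows at 200 Hz from 4 six-channel IMUs.
model = SquatClassifier()
logits = model(torch.randn(8, 400, 24))         # -> (8, 9) class scores
```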