Footprint Extraction and Sports Training Action Recognition Based on Wireless Network Communication

Bibliographic Details
Published in: Wireless Communications and Mobile Computing, 2022-01, Vol. 2022, p. 1-12
Main Author: Jiang, Lu
Format: Article
Language: English
Online Access: Full text
Description
Abstract: The combination of scientific and technological advances with sport has created new opportunities to change people's exercise habits, and sports training occupies a growing share of people's lives. To improve the efficiency of sports training and standardize players' training actions, this article uses different types of recognition methods from the field of action recognition, operating over wireless network communication, to build base classifiers. Iteratively training these classifiers on each other's outputs improves generalization performance, reduces labeling cost, and exploits the complementary strengths of the different recognition methods, thereby raising the accuracy of human action recognition. The resulting algorithm is then applied to recognize human movements; it effectively overcomes the loss of diversity among base classifiers that occurs during iterative collaborative training and further improves recognition accuracy. Experimental results show that the proposed wireless-network-based motion recognition improves the accuracy of athletes' movements by more than 20% compared with traditional methods and, by guiding athletes toward standardized movements, can reduce sports injuries.
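The "iterative mutual training" of heterogeneous base classifiers described in the abstract follows the general pattern of co-training with pseudo-labels. The sketch below illustrates that general pattern only, not the paper's actual algorithm (which the abstract does not specify in detail); all names (co_train, X_lab, X_pool, the confidence threshold, etc.) are hypothetical, and scikit-learn classifiers stand in for whatever recognition methods the paper combines.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

def co_train(X_lab, y_lab, X_pool, rounds=10, k=5, thresh=0.9):
    """Co-training sketch: two heterogeneous classifiers iteratively
    exchange confident pseudo-labels on an unlabeled pool. Hypothetical
    illustration of the abstract's idea, not the paper's procedure."""
    # Separate training pools so each classifier learns from pseudo-labels
    # produced by the *other* one (the complementary-advantage idea).
    Xa, ya = X_lab.copy(), y_lab.copy()
    Xb, yb = X_lab.copy(), y_lab.copy()
    clf_a = RandomForestClassifier(n_estimators=100, random_state=0)
    clf_b = LogisticRegression(max_iter=1000)
    for _ in range(rounds):
        if len(X_pool) == 0:
            break
        clf_a.fit(Xa, ya)
        clf_b.fit(Xb, yb)
        used = []  # pool indices consumed this round
        for src, target_is_b in ((clf_a, True), (clf_b, False)):
            proba = src.predict_proba(X_pool)
            conf = proba.max(axis=1)
            # Up to k pool samples the source classifier is confident about.
            picks = [i for i in np.argsort(conf)[::-1][:k]
                     if conf[i] >= thresh and i not in used]
            if not picks:
                continue
            pseudo_y = src.classes_[proba[picks].argmax(axis=1)]
            if target_is_b:   # A's confident labels augment B's pool
                Xb = np.vstack([Xb, X_pool[picks]])
                yb = np.concatenate([yb, pseudo_y])
            else:             # and B's augment A's
                Xa = np.vstack([Xa, X_pool[picks]])
                ya = np.concatenate([ya, pseudo_y])
            used.extend(picks)
        X_pool = np.delete(X_pool, used, axis=0)
    clf_a.fit(Xa, ya)
    clf_b.fit(Xb, yb)
    return clf_a, clf_b
```

At prediction time the two classifiers' class-probability outputs could simply be averaged. Per the abstract, the paper's specific contribution is a mechanism that keeps the base classifiers from degrading toward each other (losing their diversity) across these iterations; the sketch above does not model that safeguard.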
ISSN: 1530-8669, 1530-8677
DOI: 10.1155/2022/9506418