The use of artificial neural networks to predict the muscle behavior
Bibliographic details
Published in: Central European Journal of Engineering, 2013-09, Vol. 3 (3), p. 410-418
Main authors: Kutilek, Patrik; Viteckova, Slavka; Svoboda, Zdenĕk; Smrcka, Pavel
Format: Article
Language: English
Online access: Full text
Description
Summary: The aim of this article is to introduce methods for predicting the muscle behavior of the lower extremities based on artificial neural networks, which can be used for medical purposes. Our work focuses on predicting muscle-tendon forces and moments during human gait using angle-time diagrams. A group of healthy children and children with cerebral palsy was measured using a Vicon MoCap system. The kinematic data were recorded, and the OpenSim software system was used to identify the joint angles, muscle-tendon forces and joint muscle moments, which are presented graphically with time diagrams. The musculus gastrocnemius medialis, which is often studied in the context of cerebral palsy, was chosen to study the method of prediction. The diagrams of mean muscle-tendon force and mean moment are plotted, and the force-time and moment-time dependencies are used for training neural networks. The new method of predicting muscle-tendon forces and moments based on neural networks was tested. The neural networks predicted the muscle forces and moments of both healthy children and children with cerebral palsy. The designed method of prediction by neural networks could help to identify differences between the muscle behavior of healthy subjects and diseased subjects.
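The summary describes training a neural network on force-time dependencies derived from angle-time data. The sketch below is not the authors' code: it is a minimal illustration of the general idea, fitting a small feed-forward network that maps a point in the gait cycle and a joint angle to a muscle-tendon force. The data, scaling constants, and network size are all invented for illustration.

```python
# Minimal sketch (assumed, not from the article): a tiny MLP trained by
# gradient descent to map (time-in-gait-cycle, joint angle) -> force.
# All data below are synthetic; shapes and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "angle-time" and "force-time" curves over one gait cycle.
t = np.linspace(0.0, 1.0, 100)                  # normalized gait cycle
angle = 20.0 * np.sin(2.0 * np.pi * t)          # joint angle [deg], made up
force = 300.0 + 250.0 * np.maximum(np.sin(2.0 * np.pi * t - 0.5), 0.0)  # [N]

X = np.column_stack([t, angle / 20.0])          # scaled inputs, shape (100, 2)
y = (force / 550.0).reshape(-1, 1)              # scaled target, shape (100, 1)

# One hidden layer of tanh units; plain full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(5000):
    h = np.tanh(X @ W1 + b1)                    # hidden activations
    pred = h @ W2 + b2                          # linear output layer
    err = pred - y
    loss = float(np.mean(err ** 2))
    # Backpropagation of the mean-squared-error loss.
    d_pred = 2.0 * err / len(X)
    dW2 = h.T @ d_pred; db2 = d_pred.sum(axis=0)
    d_h = (d_pred @ W2.T) * (1.0 - h ** 2)      # tanh derivative
    dW1 = X.T @ d_h; db1 = d_h.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final training MSE: {loss:.5f}")
```

In practice the article's networks would be trained on measured OpenSim-derived force-time and moment-time curves rather than synthetic sinusoids, and the same input representation would be reused for the moment predictor.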
ISSN: 1896-1541; 2391-5439; 2081-9927
DOI: 10.2478/s13531-012-0067-4