Development of Distributed Control System for Vision-Based Myoelectric Prosthetic Hand


Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 54542-54549
Main authors: He, Yunan; Shima, Ryusei; Fukuda, Osamu; Bu, Nan; Yamaguchi, Nobuhiko; Okumura, Hiroshi
Format: Article
Language: English
Online Access: Full text
Description
Abstract: A vision-based myoelectric prosthetic hand uses a camera integrated into its body for object detection and environment understanding, and the results provide the information needed for grasp planning. Semi-automatic prosthesis control is expected to be realizable with this method. However, such a control method usually suffers from heavy computation because real-time image processing is required to keep up with the user's arm movements. This paper presents a distributed control system that assigns the heavy processing tasks to one or more processing nodes over the network, which greatly reduces the computational burden on the processor embedded in the prosthetic hand. In this control scheme, the embedded system in the prosthetic hand is used only to gather the data needed for grasp planning, while the processing nodes in the network are responsible for processing and managing the collected data. A test platform is built to verify the proposed control scheme. The test platform streams the user's electromyography (EMG) signals and images simultaneously to a GPU server. The GPU server analyzes the received data and generates the corresponding motor commands in real time. A case study in which a 3-DoF gripper continuously grasps several objects is performed on this test platform.
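To illustrate the distributed scheme summarized above, the following Python sketch shows an embedded-side loop that only gathers sensor data (an EMG window and a camera frame), streams it to a remote processing node, and applies the motor command it receives back. This is not the authors' implementation; the server address, message framing, sensor reads, and actuator call are all hypothetical placeholders.

import json
import socket
import struct
import time

SERVER = ("192.0.2.10", 5000)   # hypothetical address of the GPU processing node


def read_emg_window():
    """Placeholder for reading one window of multi-channel EMG samples."""
    return [0.0] * 8              # e.g. 8 channels


def read_camera_frame():
    """Placeholder for grabbing one encoded frame from the hand camera."""
    return b""                    # encoded image bytes would go here


def send_message(sock, payload):
    """Length-prefixed framing so the receiver can split the byte stream."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)


def recv_message(sock):
    (length,) = struct.unpack(">I", sock.recv(4))
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data


def apply_motor_command(command):
    """Placeholder for driving the 3-DoF gripper motors."""
    print("motor command:", command)


def control_loop():
    with socket.create_connection(SERVER) as sock:
        while True:
            # Embedded side: only collect and forward sensor data.
            send_message(sock, json.dumps({"emg": read_emg_window()}).encode())
            send_message(sock, read_camera_frame())

            # Processing node replies with a motor command for the gripper,
            # e.g. {"grasp": "cylindrical", "dof": [0.2, 0.5, 0.1]}.
            apply_motor_command(json.loads(recv_message(sock).decode()))
            time.sleep(0.05)      # ~20 Hz loop rate, illustrative only


if __name__ == "__main__":
    control_loop()

Under this kind of split, the prosthesis-side code stays small and constant-cost, while object detection and grasp planning can scale with whatever GPU resources the network nodes provide, which is the motivation given in the abstract.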
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2911968