Prediction of Metacarpophalangeal joint angles and Classification of Hand configurations based on Ultrasound Imaging of the Forearm

With the advancement in computing and robotics, it is necessary to develop fluent and intuitive methods for interacting with digital systems, AR/VR interfaces, and physical robotic systems. Hand movement recognition is widely used to enable this interaction. Hand configuration classification and Metacarpophalangeal (MCP) joint angle detection are important for a comprehensive reconstruction of the hand motion. Surface electromyography and other technologies have been used for the detection of hand motions. Ultrasound images of the forearm offer a way to visualize the internal physiology of the hand from a musculoskeletal perspective. Recent work has shown that these images can be classified using machine learning to predict various hand configurations. In this paper, we propose a Convolutional Neural Network (CNN) based deep learning pipeline for predicting the MCP joint angles. We supplement our results by using a Support Vector Classifier (SVC) to classify the ultrasound information into several predefined hand configurations based on activities of daily living (ADL). Ultrasound data from the forearm was obtained from 6 subjects who were instructed to move their hands according to predefined hand configurations relevant to ADLs. Motion capture data was acquired as the ground truth for hand movements at different speeds (0.5 Hz, 1 Hz, & 2 Hz) for the index, middle, ring, and pinky fingers. We were able to get promising SVC classification results on a subset of our collected data set. We demonstrated a correspondence between the predicted MCP joint angles and the actual MCP joint angles for the fingers, with an average root mean square error of 7.35 degrees. We implemented a low latency (6.25 - 9.1 Hz) pipeline for the prediction of both MCP joint angles and hand configuration estimation aimed at real-time control of digital devices, AR/VR interfaces, and physical robots.
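
For readers who want a concrete picture of the approach summarized in the abstract, the sketch below illustrates only the angle-regression component: a small convolutional network that maps a single grayscale forearm ultrasound frame to the four MCP joint angles (index, middle, ring, pinky), scored with the RMSE metric quoted above. This is a minimal illustration written for this record, not the authors' code; the input resolution, layer sizes, and all other details are assumptions.

# Minimal sketch (assumed, not the authors' released code): a small CNN that
# regresses the four MCP joint angles from one grayscale forearm ultrasound
# frame. Input size, layer widths, and training details are illustrative only.
import torch
import torch.nn as nn

class MCPAngleRegressor(nn.Module):
    def __init__(self, n_fingers: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),           # collapse feature maps to a 64-d vector
        )
        self.head = nn.Linear(64, n_fingers)   # one predicted angle (degrees) per finger

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

def rmse_degrees(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    # Root mean square error in degrees, the metric reported in the abstract.
    return torch.sqrt(torch.mean((pred - target) ** 2))

if __name__ == "__main__":
    model = MCPAngleRegressor()
    frames = torch.rand(8, 1, 128, 128)    # placeholder batch of ultrasound frames
    angles = torch.rand(8, 4) * 90.0       # placeholder motion-capture MCP angles
    print("RMSE before training:", rmse_degrees(model(frames), angles).item())

The SVC-based hand-configuration classification described in the abstract could be sketched analogously, for example with scikit-learn's sklearn.svm.SVC applied to flattened frames or CNN-extracted features; it is omitted here for brevity.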

Bibliographic details

Published in: arXiv.org, 2021-09
Main authors: Bimbraw, Keshav; Nycz, Christopher Julius; Schueler, Matt; Zhang, Ziming; Zhang, Haichong K.
Format: Article
Language: English
Online access: Full text
EISSN: 2331-8422
Source: Free E-Journals
Subjects:
Activities of daily living
Artificial neural networks
Augmented reality
Classification
Configurations
Data acquisition
Data collection
Deep learning
Digital systems
Forearm
Image reconstruction
Joints (anatomy)
Machine learning
Motion capture
Robotics
Ultrasonic imaging