Wearable fabric sensor for controlling myoelectric hand prosthesis via classification of foot postures

The degrees of freedom of robotic prosthetic hands have recently increased, but high-level amputees, such as those with shoulder disarticulation or trans-humeral amputation, do not have enough muscular area on their upper limbs from which to acquire surface electromyogram (sEMG) signals. In this paper, a wearable fabric sensor is proposed to measure sEMG on the lower limb and to classify foot postures using a proposed convolutional neural network (CNN), ultimately for application to high-level upper-limb amputees. First, we determined that sEMG signals of the lower limb can be classified for eight postures in a manner similar to those of the upper limb. Second, a multilayer perceptron (MLP) and the proposed CNN were used to compare pattern-recognition accuracy in classifying the eight postures. Finally, the wearable fabric sensor and the proposed CNN were demonstrated with trans-radial amputees. The results showed that the wearable fabric sensor distinguished eight different patterns based on similar motions of both limbs (p < 0.001). In addition, the classification accuracy of the proposed CNN (91.3%) was much higher than that of the MLP (79%) (p < 0.05). The wearable fabric sensor allowed the measurement location to move from the upper limb to the lower limb and increased the number of classifiable patterns thanks to the sixteen-channel sEMG signals acquired from 32 fabric electrodes. The high classification accuracy of the proposed CNN will be useful for the many users who must wear a myoelectric prosthesis every day.

Detailed Description

Saved in:
Bibliographic Details
Published in: Smart Materials and Structures, 2020-03, Vol. 29 (3), p. 35004
Main Authors: Lee, Seulah; Sung, Minchang; Choi, Youngjin
Format: Article
Language: English
Subjects: classification; fabric sensor; sEMG (surface electromyogram); textile electrode; wearable device
Online Access: Full text
description The degrees of freedom of robotic prosthetic hands have recently increased, but high-level amputees, such as those with shoulder disarticulation or trans-humeral amputation, do not have enough muscular area on their upper limbs from which to acquire surface electromyogram (sEMG) signals. In this paper, a wearable fabric sensor is proposed to measure sEMG on the lower limb and to classify foot postures using a proposed convolutional neural network (CNN), ultimately for application to high-level upper-limb amputees. First, we determined that sEMG signals of the lower limb can be classified for eight postures in a manner similar to those of the upper limb. Second, a multilayer perceptron (MLP) and the proposed CNN were used to compare pattern-recognition accuracy in classifying the eight postures. Finally, the wearable fabric sensor and the proposed CNN were demonstrated with trans-radial amputees. The results showed that the wearable fabric sensor distinguished eight different patterns based on similar motions of both limbs (p < 0.001). In addition, the classification accuracy of the proposed CNN (91.3%) was much higher than that of the MLP (79%) (p < 0.05). The wearable fabric sensor allowed the measurement location to move from the upper limb to the lower limb and increased the number of classifiable patterns thanks to the sixteen-channel sEMG signals acquired from 32 fabric electrodes. The high classification accuracy of the proposed CNN will be useful for the many users who must wear a myoelectric prosthesis every day.
doi 10.1088/1361-665X/ab6690
format Article
publisher IOP Publishing
coden SMSTER
orcid 0000-0002-5009-9059; 0000-0001-9059-9149; 0000-0002-7251-9851
identifier ISSN: 0964-1726
ispartof Smart materials and structures, 2020-03, Vol.29 (3), p.35004
issn 0964-1726
1361-665X
language eng
recordid cdi_crossref_primary_10_1088_1361_665X_ab6690
source IOP Publishing Journals; Institute of Physics (IOP) Journals - HEAL-Link
subjects classification
fabric sensor
sEMG (surface electromyogram)
textile electrode
wearable device
title Wearable fabric sensor for controlling myoelectric hand prosthesis via classification of foot postures