DLPR: Deep Learning-Based Enhanced Pattern Recognition Frame-Work for Improved Myoelectric Prosthesis Control

In EMG-based pattern recognition (EMG-PR), deep learning techniques have become increasingly prominent because of their ability to extract discriminant features from large data-sets automatically. Traditional machine learning-based methods, by contrast, struggle to categorize beyond a certain number of classes, and their performance degrades over time. This paper presents an accurate, robust, and fast convolutional neural network-based framework for EMG pattern identification. To assess the performance of the proposed system, five publicly available benchmark data-sets of upper limb activities were used: NinaPro DB1, DB2, and DB3 (containing 49 to 52 upper limb motions), data with force variation, and data with arm position variation, covering both intact and amputated subjects. Classification accuracies of 92.18% (53 classes), 91.56% (49 classes), 84.98% (49 classes, amputees), 95.67% (6 classes with force variation), and 99.12% (8 classes with arm position variation) were observed during testing. The performance of the proposed system is compared with state-of-the-art techniques from the literature, and the findings show that both classification accuracy and time complexity are improved significantly. For signal pre-processing and model building, Keras, a high-level API for TensorFlow, is used to construct the deep learning models. The proposed method was tested on an Intel Core i7 3.5 GHz, 7th Gen CPU with 8 GB of DDR4 RAM.
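
The abstract identifies the classifier as a convolutional neural network built with Keras, the high-level API for TensorFlow. As a purely illustrative, minimal sketch, and not the DLPR architecture reported in the paper, the snippet below shows how a small 1-D CNN for windowed, multi-channel EMG classification might be assembled in Keras; the window length, channel count, class count, and layer sizes are placeholder assumptions.

```python
# Minimal sketch (assumption): a small 1-D CNN for windowed, multi-channel EMG
# classification in Keras. Window length, channel count, class count, and layer
# sizes are illustrative placeholders, not the DLPR architecture from the paper.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_SAMPLES = 200   # assumed samples per EMG analysis window
NUM_CHANNELS = 12      # assumed number of EMG electrodes
NUM_CLASSES = 53       # e.g., a NinaPro DB1-style motion-class count

def build_emg_cnn() -> tf.keras.Model:
    """Build a compact CNN that maps one EMG window to a motion-class distribution."""
    model = models.Sequential([
        layers.Input(shape=(WINDOW_SAMPLES, NUM_CHANNELS)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random stand-in data, only to show the end-to-end training/prediction calls.
    x = np.random.randn(256, WINDOW_SAMPLES, NUM_CHANNELS).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=256)
    model = build_emg_cnn()
    model.fit(x, y, epochs=1, batch_size=32, verbose=0)
    print(model.predict(x[:1]).shape)  # (1, NUM_CLASSES)
```

Sparse categorical cross-entropy keeps the motion labels as integer class indices, which scales naturally to the large class counts (up to 53 motions) quoted above.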

Bibliographic Details
Published in: IEEE Transactions on Medical Robotics and Bionics, 2022-11, Vol. 4 (4), p. 991-999
Main authors: Pancholi, Sidharth; Joshi, Amit M.; Joshi, Deepak
Format: Article
Language: English
Online access: Order full text
DOI: 10.1109/TMRB.2022.3216957
ISSN: 2576-3202
Source: IEEE Electronic Library (IEL)
Subjects:
Amputees
Artificial neural networks
Classification
convolutional neural network
Convolutional neural networks
Datasets
Deep learning
Electromyography
EMG
Feature extraction
Machine learning
Myoelectricity
Pattern classification
Pattern recognition
Prostheses
Prosthetic limbs
Signal processing
upper-limb