Improved Transfer Learning for Detecting Upper-Limb Movement Intention Using Mechanical Sensors in an Exoskeletal Rehabilitation System

The objective of this study was to propose a novel strategy for detecting upper-limb motion intentions from mechanical sensor signals using deep and heterogeneous transfer learning techniques. Three sensor types, surface electromyography (sEMG), force-sensitive resistors (FSRs), and inertial measurement units (IMUs), were combined to capture biometric signals during arm-up, hold, and arm-down movements. To distinguish motion intentions, deep learning models were constructed using the CIFAR-ResNet18 and CIFAR-MobileNetV2 architectures. The input features of the source models were sEMG, FSR, and IMU signals. The target model was trained using only FSR and IMU sensor signals. Optimization techniques determined appropriate layer structures and learning rates of each layer for effective transfer learning. The source model based on CIFAR-ResNet18 exhibited the highest performance, achieving an accuracy of 95% and an F1 score of 0.95. The target model with optimization strategies performed comparably to the source model, achieving an accuracy of 93% and an F1 score of 0.93. The results show that mechanical sensors alone can achieve performance comparable to models including sEMG. The proposed approach can serve as a convenient and precise algorithm for human-robot collaboration in rehabilitation assistant robots.
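As context for the transfer-learning scheme described in the abstract, the following is a minimal PyTorch-style sketch of heterogeneous transfer from a source model trained on sEMG + FSR + IMU to a target model that sees only the mechanical sensors (FSR + IMU), with layer-wise learning rates. The channel counts, learning-rate values, and the use of torchvision's resnet18 as a stand-in for CIFAR-ResNet18 are illustrative assumptions, not the authors' released code.

    # Illustrative sketch only: channel counts, learning rates, and the torchvision
    # resnet18 backbone are assumptions standing in for the paper's CIFAR-ResNet18.
    import torch
    import torch.nn as nn
    from torchvision.models import resnet18

    def build_model(in_channels: int, num_classes: int = 3) -> nn.Module:
        # ResNet18-style classifier whose first convolution accepts `in_channels`
        # stacked sensor channels (three classes: arm-up, hold, arm-down).
        model = resnet18(num_classes=num_classes)
        model.conv1 = nn.Conv2d(in_channels, 64, kernel_size=3, stride=1,
                                padding=1, bias=False)
        return model

    # Source model: all three modalities (hypothetical channel counts).
    source = build_model(in_channels=8 + 4 + 6)   # sEMG + FSR + IMU
    # ... assume `source` has already been trained on the full sensor set ...

    # Target model: mechanical sensors only (FSR + IMU).
    target = build_model(in_channels=4 + 6)

    # Heterogeneous transfer: copy every layer except the input convolution,
    # whose channel count differs between the source and target inputs.
    transferred = {k: v for k, v in source.state_dict().items()
                   if not k.startswith("conv1")}
    target.load_state_dict(transferred, strict=False)

    # Layer-wise learning rates: transferred layers are fine-tuned gently, while
    # the new input convolution and the classifier head learn faster. The paper
    # selects per-layer rates with an optimization search; the values here are
    # placeholders.
    param_groups = [
        {"params": target.conv1.parameters(), "lr": 1e-3},
        {"params": target.fc.parameters(), "lr": 1e-3},
        {"params": [p for n, p in target.named_parameters()
                    if not n.startswith(("conv1", "fc"))], "lr": 1e-4},
    ]
    optimizer = torch.optim.Adam(param_groups)

Setting the transferred layers' rate to zero would correspond to freezing them outright; the per-layer rates above mirror the abstract's description of selecting a separate learning rate for each layer during transfer.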

Bibliographic Details
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2024, Vol. 32, pp. 3953-3965
Main Authors: Choi, Ahnryul; Kim, Tae Hyong; Chae, Seungheon; Mun, Joung Hwan
Format: Article
Language: English
Subjects: see list below
Online Access: Full text
DOI: 10.1109/TNSRE.2024.3486444
ISSN: 1534-4320
EISSN: 1558-0210
Source: MEDLINE; DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek - Frei zugängliche E-Journals
Subjects:
Accuracy
Adult
Algorithms
Biomechanical Phenomena
Brain modeling
CIFAR-MobileNetV2
CIFAR-ResNet18
Classification algorithms
Data models
Deep Learning
Electromyography
Electromyography - methods
exoskeletal rehabilitation system
Exoskeleton Device
Exoskeletons
Female
Humans
Intention
Machine Learning
Male
mechanical sensor
Mechanical sensors
Movement - physiology
Neural Networks, Computer
Predictive models
Robot sensing systems
Transfer learning
Transfer, Psychology
Upper Extremity - physiology
Upper-limb motion intention
Young Adult
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-21T11%3A00%3A43IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_ieee_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Improved%20Transfer%20Learning%20for%20Detecting%20Upper-Limb%20Movement%20Intention%20Using%20Mechanical%20Sensors%20in%20an%20Exoskeletal%20Rehabilitation%20System&rft.jtitle=IEEE%20transactions%20on%20neural%20systems%20and%20rehabilitation%20engineering&rft.au=Choi,%20Ahnryul&rft.date=2024&rft.volume=32&rft.spage=3953&rft.epage=3965&rft.pages=3953-3965&rft.issn=1534-4320&rft.eissn=1558-0210&rft.coden=ITNSB3&rft_id=info:doi/10.1109/TNSRE.2024.3486444&rft_dat=%3Cproquest_ieee_%3E3120910784%3C/proquest_ieee_%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3120910784&rft_id=info:pmid/39453796&rft_ieee_id=10735240&rft_doaj_id=oai_doaj_org_article_99952d8eb4f44b608351b627787359f4&rfr_iscdi=true