A Knowledge Transfer-Based Personalized Human-Robot Interaction Control Method for Lower Limb Exoskeletons

Accurate intent recognition by patients while wearing exoskeletons is crucial during their rehabilitation exercises. In this article, a transfer learning framework for human-robot interaction (EMGTnet-KTD) is proposed to predict human movement intentions in human-robot interactions through surface electromyography (sEMG) signals.

Detailed Description

Bibliographic Details
Published in: IEEE sensors journal 2024-12, Vol.24 (23), p.39490-39502
Main authors: Yang, Ming, Tian, Dingkui, Li, Feng, Chen, Ziqiang, Zhu, Yuanpei, Shang, Weiwei, Zhang, Li, Wu, Xinyu
Format: Article
Language: English
Subjects:
Online access: Order full text
description Accurate recognition of patients' movement intentions while wearing exoskeletons is crucial during rehabilitation exercises. In this article, a transfer learning framework for human-robot interaction (EMGTnet-KTD) is proposed to predict human movement intentions in human-robot interactions from surface electromyography (sEMG) signals. EMGTnet-KTD consists of a pretrained EMGTnet model and a knowledge transfer module. First, EMGTnet is designed based on a Transformer network: a temporal and spatial feature fusion module is introduced on top of the Transformer network, and the inputs are reconfigured so that the model can exploit the temporal relationship between successive human actions. In addition, the knowledge transfer module is composed of a feature extraction layer, a noise reduction layer, and a personalized human lower limb dynamics controller. To evaluate the effectiveness of the proposed method, an experimental validation is performed on a self-collected dataset from seven subjects. The results show that the proposed method outperforms other continuous motion prediction methods. Finally, to validate that the generated angles conform to human physiology, walking experiments with an exoskeleton are conducted. The experiments demonstrate the effectiveness of the proposed framework and its implementability on exoskeletons.
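The pipeline the abstract describes starts from windowed sEMG signals before any learned feature fusion. As a hypothetical illustration only (not the authors' EMGTnet implementation), the front end of such a pipeline can be sketched with classic per-channel sEMG window descriptors — mean absolute value, RMS, and waveform length — computed over overlapping sliding windows; all function names and window parameters here are assumptions for the sketch:

```python
import numpy as np

def semg_features(window: np.ndarray) -> np.ndarray:
    """Classic per-channel sEMG descriptors for one analysis window
    (shape: channels x samples): mean absolute value (MAV), root mean
    square (RMS), and waveform length (WL). These are standard hand-crafted
    sEMG features, not the paper's learned Transformer features."""
    mav = np.mean(np.abs(window), axis=1)
    rms = np.sqrt(np.mean(window ** 2, axis=1))
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)
    return np.concatenate([mav, rms, wl])

def sliding_windows(signal: np.ndarray, width: int, stride: int):
    """Yield overlapping windows from a (channels x samples) sEMG recording."""
    for start in range(0, signal.shape[1] - width + 1, stride):
        yield signal[:, start:start + width]

# Hypothetical usage: 4 sEMG channels, 1 s of data at 1 kHz,
# 200 ms windows with 100 ms stride.
rng = np.random.default_rng(0)
emg = rng.standard_normal((4, 1000))
feats = np.stack([semg_features(w) for w in sliding_windows(emg, 200, 100)])
print(feats.shape)  # (9, 12): 9 windows, 3 features x 4 channels
```

A sequence of such per-window feature vectors is the kind of input a Transformer-based intent predictor would consume; the paper's actual temporal-spatial fusion and knowledge transfer layers replace and extend this hand-crafted stage.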
doi 10.1109/JSEN.2024.3479239
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 1530-437X
ispartof IEEE sensors journal, 2024-12, Vol.24 (23), p.39490-39502
issn 1530-437X
1558-1748
language eng
recordid cdi_crossref_primary_10_1109_JSEN_2024_3479239
source IEEE Electronic Library (IEL)
subjects Accuracy
Control methods
Customization
Effectiveness
Exoskeleton
Exoskeletons
Feature extraction
Human engineering
Human motion
Human-robot interaction
Intent recognition
Knowledge management
Knowledge transfer
Legged locomotion
Modules
Muscles
Noise prediction
personalized intent recognition
Robot control
Robots
Sensors
surface electromyography (sEMG)
Transfer learning
Transformers
title A Knowledge Transfer-Based Personalized Human-Robot Interaction Control Method for Lower Limb Exoskeletons
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-12T23%3A47%3A01IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20Knowledge%20Transfer-Based%20Personalized%20Human-Robot%20Interaction%20Control%20Method%20for%20Lower%20Limb%20Exoskeletons&rft.jtitle=IEEE%20sensors%20journal&rft.au=Yang,%20Ming&rft.date=2024-12-01&rft.volume=24&rft.issue=23&rft.spage=39490&rft.epage=39502&rft.pages=39490-39502&rft.issn=1530-437X&rft.eissn=1558-1748&rft.coden=ISJEAZ&rft_id=info:doi/10.1109/JSEN.2024.3479239&rft_dat=%3Cproquest_RIE%3E3133498152%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3133498152&rft_id=info:pmid/&rft_ieee_id=10723275&rfr_iscdi=true