Active man-machine cooperation method based on multi-modal information of human body

The invention discloses an active man-machine cooperation method based on multi-modal information of a human body. The method comprises the following steps: collecting signals from a human and a robot working cooperatively; feeding the joint-angle signals into a constructed prediction neural network to obtain a predicted motion trajectory; feeding the electromyographic (EMG) signals into a constructed estimation neural network to obtain the three-dimensional arm force; feeding the three-dimensional arm force into a refined human-cooperation-state sensing interface to obtain a cooperation comfort index and a stability quantification index; weighting the cooperation comfort index and the stability quantification index to form an optimization objective; obtaining the optimal robot impedance parameters with an exploratory iterative algorithm; setting the robot's impedance model with those parameters to obtain the expected trajectory; and tracking the expected trajectory with a PD position controller.
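The final stages of the pipeline (a weighted objective over the comfort and stability indices, an exploratory search for impedance parameters, and an impedance model producing the expected trajectory) can be sketched as below. The abstract publishes no formulas, so every index definition, weight, dynamics model, and parameter range here is an illustrative assumption, and the patent's "exploration iterative algorithm" is stood in for by plain random search.

```python
import numpy as np

def comfort_index(force):
    # Assumed proxy: comfort decreases as mean interaction-force magnitude grows.
    return 1.0 / (1.0 + np.mean(np.abs(force)))

def stability_index(traj):
    # Assumed proxy: stability decreases with step-to-step trajectory jitter.
    return 1.0 / (1.0 + np.var(np.diff(traj)))

def objective(force, traj, w_c=0.6, w_s=0.4):
    # Weighted optimization target; the weights are placeholders.
    return w_c * comfort_index(force) + w_s * stability_index(traj)

def impedance_response(m, b, k, force, ref, dt=0.01):
    # One-DOF impedance model m*e'' + b*e' + k*e = f; the expected
    # trajectory is the reference plus the compliant offset e.
    e, v, out = 0.0, 0.0, []
    for f, r in zip(force, ref):
        a = (f - b * v - k * e) / m
        v += a * dt
        e += v * dt
        out.append(r + e)
    return np.array(out)

# Stand-in for the patent's exploration iterative algorithm: random
# search over damping b and stiffness k for a fixed virtual mass m.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 200)
ref = 0.5 * t                       # straight-line reference motion
force = 2.0 * np.sin(3.0 * t)       # stand-in for the estimated arm force

best_params, best_score = None, -np.inf
for _ in range(200):
    b, k = rng.uniform(1, 50), rng.uniform(10, 500)
    traj = impedance_response(2.0, b, k, force, ref)
    score = objective(force, traj)
    if score > best_score:
        best_params, best_score = (b, k), score

print(best_params, best_score)
```

The semi-implicit Euler step keeps the second-order impedance dynamics stable at this time step; a real implementation would replace the random search with the patent's exploration scheme and the proxy indices with the sensing interface's outputs.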

Bibliographic Details
Main authors: CHU HUBO, ZHANG TIE, ZOU BIAO
Format: Patent
Language: Chinese; English
Online access: Order full text
Record ID: cdi_epo_espacenet_CN117621051A
Source: esp@cenet
Subjects: CALCULATING
CHAMBERS PROVIDED WITH MANIPULATION DEVICES
COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
COMPUTING
COUNTING
HAND TOOLS
MANIPULATORS
PERFORMING OPERATIONS
PHYSICS
PORTABLE POWER-DRIVEN TOOLS
TRANSPORTING