Object grabbing method based on time sequence tactile data processing
The invention discloses an object grabbing method based on time-sequence tactile data processing. An optimal grasping area is first determined from the object's position information. When the mechanical arm reaches this area, it is commanded to close with a preset force and to hold for several tactile-sensor collection periods, during which the sensor records tactile data, including the magnitude and direction of the force the arm applies to the object. The tactile data from each acquisition period are converted into tactile images; the time-ordered images serve as the initial network input and are propagated cyclically through a pre-trained force tracking motion network, which predicts multiple frames of tactile images for a future time sequence. The frame sequence formed by these predicted frames is then input into a pre-trained LSTM...
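The patent abstract does not specify the taxel layout, image encoding, or network architectures. The following minimal Python sketch illustrates the data flow it describes, under stated assumptions: a hypothetical 4x4 taxel grid, a 3-channel image encoding (force magnitude plus in-plane direction), and a naive linear extrapolation standing in for the pre-trained force tracking motion network.

```python
import numpy as np

def tactile_to_image(readings, grid=(4, 4)):
    """Convert one acquisition period of taxel readings into a 3-channel
    'tactile image': per-taxel force magnitude plus x/y direction components.
    `readings` has shape (rows*cols, 3), holding (fx, fy, fz) per taxel.
    The 4x4 grid and channel layout are illustrative assumptions."""
    rows, cols = grid
    f = np.asarray(readings, dtype=float).reshape(rows, cols, 3)
    magnitude = np.linalg.norm(f, axis=-1)            # channel 0: |F|
    norm = np.maximum(magnitude, 1e-9)                # avoid divide-by-zero
    # channels 1-2: in-plane force direction (approx. unit components)
    return np.stack([magnitude, f[..., 0] / norm, f[..., 1] / norm], axis=0)

def predict_future_frames(frames, horizon=3):
    """Stand-in for the pre-trained force tracking motion network: linearly
    extrapolate the last two observed tactile images `horizon` steps ahead.
    The real network is learned; this placeholder only shows the interface."""
    frames = np.asarray(frames)
    delta = frames[-1] - frames[-2]
    return np.stack([frames[-1] + (k + 1) * delta for k in range(horizon)])

# Simulate four collection periods on the assumed 4x4 taxel array.
rng = np.random.default_rng(0)
observed = [tactile_to_image(rng.normal(size=(16, 3))) for _ in range(4)]
future = predict_future_frames(observed, horizon=3)
print(future.shape)  # (3, 3, 4, 4): 3 future frames, 3 channels, 4x4 grid
```

In the patented method, the predicted frame sequence would then feed a pre-trained LSTM; here the predictor is only a sketch of that stage's input/output shape, not the learned model itself.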
Saved in:
Main authors: | ZHU XIAOJUN ; LIANG BIN ; ZHOU XINGRU ; LIU HOUDE ; WANG XUEQIAN |
---|---|
Format: | Patent |
Language: | chi ; eng |
Subjects: | CHAMBERS PROVIDED WITH MANIPULATION DEVICES ; HAND TOOLS ; MANIPULATORS ; PERFORMING OPERATIONS ; PORTABLE POWER-DRIVEN TOOLS ; TRANSPORTING |
Online access: | Order full text |
creator | ZHU XIAOJUN ; LIANG BIN ; ZHOU XINGRU ; LIU HOUDE ; WANG XUEQIAN
description | The invention discloses an object grabbing method based on time-sequence tactile data processing. An optimal grasping area is first determined from the object's position information. When the mechanical arm reaches this area, it is commanded to close with a preset force and to hold for several tactile-sensor collection periods, during which the sensor records tactile data, including the magnitude and direction of the force the arm applies to the object. The tactile data from each acquisition period are converted into tactile images; the time-ordered images serve as the initial network input and are propagated cyclically through a pre-trained force tracking motion network, which predicts multiple frames of tactile images for a future time sequence. The frame sequence formed by these predicted frames is then input into a pre-trained LSTM...
format | Patent |
fulltext | fulltext_linktorsrc |
language | chi ; eng |
recordid | cdi_epo_espacenet_CN113172629A |
source | esp@cenet |
subjects | CHAMBERS PROVIDED WITH MANIPULATION DEVICES ; HAND TOOLS ; MANIPULATORS ; PERFORMING OPERATIONS ; PORTABLE POWER-DRIVEN TOOLS ; TRANSPORTING
title | Object grabbing method based on time sequence tactile data processing |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-07T04%3A46%3A56IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-epo_EVB&rft_val_fmt=info:ofi/fmt:kev:mtx:patent&rft.genre=patent&rft.au=ZHU%20XIAOJUN&rft.date=2021-07-27&rft_id=info:doi/&rft_dat=%3Cepo_EVB%3ECN113172629A%3C/epo_EVB%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |