PreAfford: Universal Affordance-Based Pre-Grasping for Diverse Objects and Environments

Robotic manipulation with two-finger grippers is challenged by objects lacking distinct graspable features. Traditional pre-grasping methods, which typically involve repositioning objects or utilizing external aids like table edges, are limited in their adaptability across different object categories and environments. To overcome these limitations, we introduce PreAfford, a novel pre-grasping planning framework incorporating a point-level affordance representation and a relay training approach. Our method significantly improves adaptability, allowing effective manipulation across a wide range of environments and object types. When evaluated on the ShapeNet-v2 dataset, PreAfford not only enhances grasping success rates by 69% but also demonstrates its practicality through successful real-world experiments. These improvements highlight PreAfford's potential to redefine standards for robotic handling of complex manipulation tasks in diverse settings.

Full description

Saved in:
Bibliographic Details
Published in: arXiv.org 2024-08
Main authors: Ding, Kairui; Chen, Boyuan; Wu, Ruihai; Li, Yuyang; Zhang, Zongzheng; Gao, Huan-ang; Li, Siqi; Zhou, Guyue; Zhu, Yixin; Dong, Hao; Zhao, Hao
Format: Article
Language: eng
Subjects: Fingers; Grasping (robotics); Grippers; Robustness (mathematics)
Online access: Full text
container_title arXiv.org
creator Ding, Kairui
Chen, Boyuan
Wu, Ruihai
Li, Yuyang
Zhang, Zongzheng
Gao, Huan-ang
Li, Siqi
Zhou, Guyue
Zhu, Yixin
Dong, Hao
Zhao, Hao
description Robotic manipulation with two-finger grippers is challenged by objects lacking distinct graspable features. Traditional pre-grasping methods, which typically involve repositioning objects or utilizing external aids like table edges, are limited in their adaptability across different object categories and environments. To overcome these limitations, we introduce PreAfford, a novel pre-grasping planning framework incorporating a point-level affordance representation and a relay training approach. Our method significantly improves adaptability, allowing effective manipulation across a wide range of environments and object types. When evaluated on the ShapeNet-v2 dataset, PreAfford not only enhances grasping success rates by 69% but also demonstrates its practicality through successful real-world experiments. These improvements highlight PreAfford's potential to redefine standards for robotic handling of complex manipulation tasks in diverse settings.
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-08
issn 2331-8422
language eng
recordid cdi_proquest_journals_3033746226
source Free E-Journals
subjects Fingers
Grasping (robotics)
Grippers
Robustness (mathematics)
title PreAfford: Universal Affordance-Based Pre-Grasping for Diverse Objects and Environments