Articulated Object Manipulation using Online Axis Estimation with SAM2-Based Tracking
Articulated object manipulation requires precise object interaction, where the object's axis must be carefully considered. Previous research has employed interactive perception for manipulating articulated objects, but such open-loop approaches often overlook the interaction dynamics. To address this limitation, we present a closed-loop pipeline that integrates interactive perception with online axis estimation from segmented 3D point clouds. Our method leverages any interactive perception technique as a foundation, inducing slight object movement to generate point cloud frames of the evolving dynamic scene. These point clouds are then segmented using Segment Anything Model 2 (SAM2), after which the moving part of the object is masked for accurate online axis estimation, guiding subsequent robotic actions. Our approach significantly enhances the precision and efficiency of manipulation tasks involving articulated objects. Experiments in simulated environments demonstrate that our method outperforms baseline approaches, especially in tasks that demand precise axis-based control. Project Page: https://hytidel.github.io/video-tracking-for-axis-estimation/.
Published in: | arXiv.org 2024-09 |
---|---|
Main authors: | Wang, Xi; Chen, Tianxing; Yu, Qiaojun; Xu, Tianling; Chen, Zanxin; Fu, Yiting; Lu, Cewu; Yao, Mu; Luo, Ping |
Format: | Article |
Language: | eng |
Subjects: | Axis movements; Closed loops; Interactive control; Perception; Three dimensional models; Three dimensional motion; Tracking |
Online access: | Full text |
container_end_page | |
---|---|
container_issue | |
container_start_page | |
container_title | arXiv.org |
container_volume | |
creator | Wang, Xi; Chen, Tianxing; Yu, Qiaojun; Xu, Tianling; Chen, Zanxin; Fu, Yiting; Lu, Cewu; Yao, Mu; Luo, Ping |
description | Articulated object manipulation requires precise object interaction, where the object's axis must be carefully considered. Previous research has employed interactive perception for manipulating articulated objects, but such open-loop approaches often overlook the interaction dynamics. To address this limitation, we present a closed-loop pipeline that integrates interactive perception with online axis estimation from segmented 3D point clouds. Our method leverages any interactive perception technique as a foundation, inducing slight object movement to generate point cloud frames of the evolving dynamic scene. These point clouds are then segmented using Segment Anything Model 2 (SAM2), after which the moving part of the object is masked for accurate online axis estimation, guiding subsequent robotic actions. Our approach significantly enhances the precision and efficiency of manipulation tasks involving articulated objects. Experiments in simulated environments demonstrate that our method outperforms baseline approaches, especially in tasks that demand precise axis-based control. Project Page: https://hytidel.github.io/video-tracking-for-axis-estimation/. |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-09 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_3109527428 |
source | Free E-Journals |
subjects | Axis movements; Closed loops; Interactive control; Perception; Three dimensional models; Three dimensional motion; Tracking |
title | Articulated Object Manipulation using Online Axis Estimation with SAM2-Based Tracking |
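The description field above outlines a closed-loop pipeline: interactive perception induces slight object motion, SAM2-based tracking masks the moving part in the resulting point cloud frames, and the motion of that masked part drives online axis estimation. As a rough illustration of that last step only, the sketch below estimates a revolute (hinge) axis from two frames of an already-segmented moving part using a standard Kabsch/SVD rigid fit. The function names, the synthetic test data, and the assumption of known point correspondences are illustrative choices of this sketch, not the authors' implementation described in the record.

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate the rigid transform (R, t) mapping src -> dst via the Kabsch/SVD method.

    src, dst: (N, 3) arrays of corresponding 3D points (same ordering in both frames).
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def revolute_axis_from_motion(R, t):
    """Recover the rotation axis direction and a point on the axis from (R, t).

    For a pure revolute motion, the axis direction is the eigenvector of R with
    eigenvalue 1, and a point on the axis satisfies (I - R) p = t; it is solved
    here in the least-squares sense because (I - R) is rank-deficient along the axis.
    """
    w, v = np.linalg.eig(R)
    direction = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    direction /= np.linalg.norm(direction)
    point, *_ = np.linalg.lstsq(np.eye(3) - R, t, rcond=None)
    return direction, point

if __name__ == "__main__":
    # Synthetic check: rotate a toy "moving part" 10 degrees about a known vertical axis.
    rng = np.random.default_rng(0)
    part_t0 = rng.uniform(-0.1, 0.1, size=(500, 3)) + np.array([0.4, 0.0, 0.3])
    theta = np.deg2rad(10.0)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
    pivot = np.array([0.5, 0.2, 0.0])
    part_t1 = (part_t0 - pivot) @ Rz.T + pivot   # next point-cloud frame after slight motion
    R, t = rigid_transform(part_t0, part_t1)
    axis_dir, axis_pt = revolute_axis_from_motion(R, t)
    print("axis direction:", axis_dir)   # ~ [0, 0, 1] up to sign
    print("point on axis:", axis_pt)     # lies on the line through `pivot` along z
```

In the paper's setting, the correspondences would come from tracked and SAM2-segmented points rather than being given by construction, and a prismatic joint would instead be handled by fitting a translation direction; this sketch covers only the revolute case under the stated assumptions.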