A human–robot interface for mobile manipulator

This paper presents a method for remotely operating a mobile manipulator through the operator's hand gestures. In particular, a tracked mobile robot is equipped with a 4-DOF robot arm for grasping objects. The operator uses one hand to control both the motion of the mobile robot and the posture of the robot arm via the gesture polysemy scheme put forward in this paper. A Leap Motion (LM) sensor, which captures the position and posture of the hand, is employed in the system. Two filters estimate the position and posture of the human hand in order to reduce the inherent noise of the sensor: a Kalman filter estimates the position, and a particle filter estimates the orientation. The advantage of the proposed method is that a mobile manipulator can be controlled with just one hand using an LM sensor. The effectiveness of the proposed human–robot interface was verified in the laboratory with a series of experiments. The results indicate that the interface tracks the movements of the operator's hand with high accuracy, and that the system can be employed by a non-professional operator for robot teleoperation.
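
The abstract assigns hand position to a Kalman filter, but the record gives no implementation details. The following is only a minimal sketch of that idea: a constant-velocity Kalman filter run on simulated noisy 3-D position readings. The frame rate, noise covariances, and all other names and values are illustrative assumptions, not parameters from the paper.

    import numpy as np

    DT = 1.0 / 60.0   # assumed sensor frame rate; the paper does not state one

    # State [x, y, z, vx, vy, vz] with a constant-velocity motion model;
    # the sensor measures position only.
    F = np.eye(6)
    F[:3, 3:] = DT * np.eye(3)                  # position integrates velocity
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    Q = 1e-4 * np.eye(6)                        # process noise (hand acceleration), assumed
    R = 4e-4 * np.eye(3)                        # measurement noise (sensor jitter), assumed

    def kalman_step(x, P, z):
        """One predict/update cycle for a new 3-D position measurement z."""
        x_pred = F @ x                          # predict state
        P_pred = F @ P @ F.T + Q                # predict covariance
        S = H @ P_pred @ H.T + R                # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x_pred + K @ (z - H @ x_pred)   # correct with the measurement
        P_new = (np.eye(6) - K @ H) @ P_pred
        return x_new, P_new

    # Run the filter on a simulated hand moving along x with additive jitter.
    rng = np.random.default_rng(0)
    x, P = np.zeros(6), np.eye(6)
    for k in range(120):
        true_pos = np.array([0.2 * k * DT, 0.0, 0.1])    # metres
        z = true_pos + rng.normal(scale=0.02, size=3)    # noisy reading
        x, P = kalman_step(x, P, z)

    print("smoothed position:", x[:3])   # close to the true trajectory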

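The abstract likewise assigns hand orientation to a particle filter, again without a stated formulation, so this second sketch is equally hypothetical: particles are unit quaternions perturbed by a random walk, weighted by a Gaussian likelihood of their angular distance to each noisy measurement, and systematically resampled. Every name and parameter value here is invented for illustration.

    import numpy as np

    N = 500                                   # number of particles, assumed
    rng = np.random.default_rng(1)

    def normalize(q):
        return q / np.linalg.norm(q, axis=-1, keepdims=True)

    def angular_error(particles, z):
        """Rotation angle between unit quaternions particles (N, 4) and z (4,)."""
        dot = np.clip(np.abs(particles @ z), 0.0, 1.0)   # abs() handles the double cover
        return 2.0 * np.arccos(dot)

    def pf_step(particles, z, motion_std=0.05, meas_std=0.2):
        # Predict: random-walk perturbation, renormalized onto the unit sphere.
        particles = normalize(particles + rng.normal(scale=motion_std, size=particles.shape))
        # Update: Gaussian likelihood of the angular error to the measurement.
        w = np.exp(-0.5 * (angular_error(particles, z) / meas_std) ** 2)
        w /= w.sum()
        # Systematic resampling to avoid weight degeneracy.
        idx = np.searchsorted(np.cumsum(w), (np.arange(N) + rng.random()) / N)
        return particles[idx]

    particles = normalize(rng.normal(size=(N, 4)))    # rough uniform initialization
    z = normalize(np.array([0.9, 0.1, 0.3, 0.1]))     # one noisy orientation reading
    for _ in range(50):
        particles = pf_step(particles, z)

    # Average the cluster, aligning signs first because q and -q are the same rotation.
    aligned = particles * np.sign(particles @ particles[0])[:, None]
    print("estimated orientation:", normalize(aligned.mean(axis=0)))

In the system the abstract describes, the filtered position and orientation would then be mapped, through the gesture polysemy scheme, onto base-motion and arm-posture commands; that mapping is not specified in this record.
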
Bibliographic details
Published in: Intelligent service robotics, 2018-07, Vol. 11 (3), p. 269-278
Main authors: Chen, Mingxuan; Liu, Caibing; Du, Guanglong
Format: Article
Language: English
DOI: 10.1007/s11370-018-0251-3
ISSN: 1861-2776
EISSN: 1861-2784
Source: SpringerLink Journals - AutoHoldings; ProQuest Central
Online access: Full text

Subjects:
Artificial Intelligence
Control
Dynamical Systems
Engineering
Grasping (robotics)
Interfaces
Kalman filters
Manipulators
Mechatronics
Original Research Paper
Robot arms
Robot dynamics
Robotics
Robotics and Automation
Robots
Sensors
User Interfaces and Human Computer Interaction
Vibration
Vision systems
Voice recognition
Wearable computers

URL: https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-09T03%3A40%3A43IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=A%20human%E2%80%93robot%20interface%20for%20mobile%20manipulator&rft.jtitle=Intelligent%20service%20robotics&rft.au=Chen,%20Mingxuan&rft.date=2018-07-01&rft.volume=11&rft.issue=3&rft.spage=269&rft.epage=278&rft.pages=269-278&rft.issn=1861-2776&rft.eissn=1861-2784&rft_id=info:doi/10.1007/s11370-018-0251-3&rft_dat=%3Cproquest_cross%3E2918497382%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2918497382&rft_id=info:pmid/&rfr_iscdi=true