Interactive task training of a mobile robot through human gesture recognition
This paper describes a demonstration-based programming system in which a mobile robot observes the actions of a human performing a multi-step task. From these observations, the robot determines which of its pre-learned capabilities are required to replicate the task and in what sequence they must be ordered. The focus of this paper is on the hidden Markov model method used to learn and classify the actions as "gestures". A preliminary system demonstration is also described in which the robot observes the human performing a block distribution task. During the demonstration, the robot actively follows the demonstrator to maintain its vantage point and to infer spatial relationships.
Saved in:
Main Authors: | Rybski, P.E.; Voyles, R.M. |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
container_end_page | 669 vol.1 |
---|---|
container_issue | |
container_start_page | 664 |
container_title | Proceedings - IEEE International Conference on Robotics and Automation |
container_volume | 1 |
creator | Rybski, P.E.; Voyles, R.M. |
description | This paper describes a demonstration-based programming system in which a mobile robot observes the actions of a human performing a multi-step task. From these observations, the robot determines which of its pre-learned capabilities are required to replicate the task and in what sequence they must be ordered. The focus of this paper is on the hidden Markov model method used to learn and classify the actions as "gestures". A preliminary system demonstration is also described in which the robot observes the human performing a block distribution task. During the demonstration, the robot actively follows the demonstrator to maintain its vantage point and to infer spatial relationships. |
doi_str_mv | 10.1109/ROBOT.1999.770051 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1050-4729; EISSN: 2577-087X; ISBN: 9780780351806; ISBN: 0780351800; DOI: 10.1109/ROBOT.1999.770051 |
ispartof | Proceedings - IEEE International Conference on Robotics and Automation, 1999, Vol.1, p.664-669 vol.1 |
issn | 1050-4729 2577-087X |
language | eng |
recordid | cdi_ieee_primary_770051 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Computer science; Hidden Markov models; Humans; Mobile robots; Path planning; Performance analysis; Power system modeling; Robot programming; Robot vision systems; Stereo vision |
title | Interactive task training of a mobile robot through human gesture recognition |
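The abstract's core idea of classifying observed actions as "gestures" with hidden Markov models can be illustrated with a minimal sketch: one discrete-observation HMM per gesture class, with an observed sequence assigned to the class whose model gives it the highest likelihood under the forward algorithm. The gesture names, state counts, and probabilities below are illustrative assumptions, not parameters from the paper.

```python
def forward_likelihood(obs, pi, A, B):
    """P(obs | model) via the forward algorithm.
    pi: initial state probs, A: state transition matrix, B: emission probs."""
    n = len(pi)
    # Initialize with the first observation.
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    # Propagate forward through the remaining observations.
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][o]
                 for t in range(n)]
    return sum(alpha)

# Two toy 2-state models over 3 observation symbols (e.g. quantized hand
# positions): "reach" tends to emit symbol 0 then drift to symbol 2,
# while "wave" emits symbol 1 regardless of state.
MODELS = {
    "reach": ([0.9, 0.1],
              [[0.6, 0.4], [0.1, 0.9]],
              [[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]]),
    "wave":  ([0.5, 0.5],
              [[0.1, 0.9], [0.9, 0.1]],
              [[0.1, 0.8, 0.1], [0.1, 0.8, 0.1]]),
}

def classify(obs):
    # Maximum-likelihood classification across the per-gesture models.
    return max(MODELS, key=lambda m: forward_likelihood(obs, *MODELS[m]))

print(classify([0, 0, 2, 2, 2]))  # prints: reach
print(classify([1, 1, 1, 1, 1]))  # prints: wave
```

In a real system the model parameters would be trained (e.g. with Baum-Welch) from demonstration data rather than hand-set, and the observation symbols would come from quantized vision features.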