Bringing Together Human and Robotic Environment Representations - A Pilot Study
Human interaction with a service robot requires a shared representation of the environment for spoken dialogue and task specification, where the names used for particular locations depend on personal preferences. A question is how such human-oriented models can be tied to the geometric robotic models needed for precise localisation and navigation. We assume that this integration can be based on the information potential users give a service robot about its working environment. We further believe that this information is best given in an interactive setting (a "guided tour") in this particular environment. This paper presents a pilot study that investigates how humans present a familiar environment to a mobile robot. The study is set up within our concept of human augmented mapping, for which we assume an initial "guided tour" scenario to teach a robot its environment. Results from this pilot study are used to validate a proposed generic environment model for a service robot.
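The core idea in this abstract, anchoring the place names a user gives during a "guided tour" to the robot's metric map so that spoken task specifications can later be resolved to coordinates for localisation and navigation, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation; the class, method, and field names (HumanAugmentedMap, Pose2D, teach, resolve) are hypothetical, and a simple 2D pose representation is assumed.

```python
# Minimal sketch (not the paper's implementation) of tying human-given place
# names to a geometric robot map: during a "guided tour" the user names a
# location, the name is anchored to the robot's current pose estimate, and
# later spoken commands ("go to the kitchen") resolve to metric coordinates.
# All names below are hypothetical illustrations.

from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class Pose2D:
    """Robot pose in the metric map frame (x, y in metres, heading in radians)."""
    x: float
    y: float
    theta: float


@dataclass
class HumanAugmentedMap:
    """Ties human-oriented place names to the robot's geometric representation."""
    places: Dict[str, Pose2D] = field(default_factory=dict)

    def teach(self, name: str, pose: Pose2D) -> None:
        """Record a place when the guide says, e.g., 'this is the kitchen'."""
        self.places[name.lower()] = pose

    def resolve(self, name: str) -> Optional[Pose2D]:
        """Map a spoken place name back to metric coordinates for navigation."""
        return self.places.get(name.lower())


if __name__ == "__main__":
    env = HumanAugmentedMap()
    env.teach("kitchen", Pose2D(3.2, 1.5, 0.0))  # taught during the guided tour
    print(env.resolve("Kitchen"))                # Pose2D(x=3.2, y=1.5, theta=0.0)
```

In the scenario the abstract describes, the taught poses would come from the robot's own localisation while the guide walks it through the environment and names places.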
Saved in:
Main authors: | Topp, E.A.; Huettenrauch, H.; Christensen, H.I.; Eklundh, K.S. |
---|---|
Format: | Conference Proceeding |
Language: | eng |
Subjects: | Computer science; Educational robots; Human robot interaction; Intelligent robots; Mobile robots; Navigation; Robot sensing systems; Service robots; Simultaneous localization and mapping; Solid modeling |
Online access: | Order full text |
container_end_page | 4952 |
---|---|
container_issue | |
container_start_page | 4946 |
container_title | 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems |
container_volume | |
creator | Topp, E.A.; Huettenrauch, H.; Christensen, H.I.; Eklundh, K.S. |
description | Human interaction with a service robot requires a shared representation of the environment for spoken dialogue and task specification, where the names used for particular locations depend on personal preferences. A question is how such human-oriented models can be tied to the geometric robotic models needed for precise localisation and navigation. We assume that this integration can be based on the information potential users give a service robot about its working environment. We further believe that this information is best given in an interactive setting (a "guided tour") in this particular environment. This paper presents a pilot study that investigates how humans present a familiar environment to a mobile robot. The study is set up within our concept of human augmented mapping, for which we assume an initial "guided tour" scenario to teach a robot its environment. Results from this pilot study are used to validate a proposed generic environment model for a service robot. |
doi_str_mv | 10.1109/IROS.2006.282456 |
format | Conference Proceeding |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2153-0858; EISSN: 2153-0866; ISBN: 9781424402588; ISBN: 1424402581; EISBN: 9781424402595; EISBN: 142440259X; DOI: 10.1109/IROS.2006.282456 |
ispartof | 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006, p.4946-4952 |
issn | 2153-0858 2153-0866 |
language | eng |
recordid | cdi_ieee_primary_4059204 |
source | IEEE Electronic Library (IEL) Conference Proceedings |
subjects | Computer science; Educational robots; Human robot interaction; Intelligent robots; Mobile robots; Navigation; Robot sensing systems; Service robots; SIMULTANEOUS LOCALIZATION; Simultaneous localization and mapping; Solid modeling |
title | Bringing Together Human and Robotic Environment Representations - A Pilot Study |