Shared environment representation for a human-robot team performing information fusion
This paper addresses the problem of building a shared environment representation by a human-robot team. Rich environment models are required in real applications both for autonomous operation of robots and to support human decision-making. Two probabilistic models are used to describe outdoor environment features such as trees: geometric (position in the world) and visual. The visual representation is used to improve data association and to classify features. Both models are able to incorporate observations from robotic platforms and human operators. Physically, humans and robots form a heterogeneous sensor network. In our experiments, the human-robot team consists of an unmanned air vehicle, a ground vehicle, and two human operators. They are deployed for an information gathering task and perform information fusion cooperatively. All aspects of the system, including the fusion algorithms, are fully decentralized. Experimental results are presented in the form of the acquired multi-attribute feature map, information exchange patterns demonstrating human-robot information fusion, and quantitative model evaluation. Lessons learned from deploying the system in the field are also presented. © 2007 Wiley Periodicals, Inc.
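The record carries only the abstract, not the paper's algorithms, but the decentralized fusion it describes rests on a standard building block: a Bayesian update that combines independent class observations of a single feature from several platforms. The sketch below is illustrative only; the class labels, observation values, and the `fuse_class_beliefs` helper are hypothetical, not taken from the paper.

```python
import numpy as np

def fuse_class_beliefs(prior, likelihoods):
    """Fuse independent class likelihoods into a posterior over feature
    classes (e.g. tree vs. other) via a naive Bayes update.  `prior` and
    each entry of `likelihoods` are arrays over the same class labels."""
    posterior = np.asarray(prior, dtype=float)
    for lik in likelihoods:
        posterior = posterior * np.asarray(lik, dtype=float)  # independent evidence
        posterior /= posterior.sum()                          # renormalize
    return posterior

# Hypothetical observations of one feature; classes = [tree, other].
uav_obs      = [0.7, 0.3]   # visual classifier on the air vehicle
ground_obs   = [0.8, 0.2]   # visual classifier on the ground vehicle
operator_obs = [0.9, 0.1]   # human operator's soft label

belief = fuse_class_beliefs([0.5, 0.5], [uav_obs, ground_obs, operator_obs])
print(belief)  # posterior strongly favors "tree"
```

Because this update is commutative and associative, each node can apply incoming observations in any order, which is one reason such fusion can run fully decentralized.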
Published in: | Journal of Field Robotics, 2007-11, Vol. 24 (11-12), p. 911-942 |
---|---|
Main authors: | Kaupp, Tobias; Douillard, Bertrand; Ramos, Fabio; Makarenko, Alexei; Upcroft, Ben |
Format: | Article |
Language: | English |
Online access: | Full text |
DOI: | 10.1002/rob.20201 |
ISSN: | 1556-4959 |
EISSN: | 1556-4967 |
Publisher: | Hoboken: Wiley Subscription Services, Inc., A Wiley Company |
Source: | Access via Wiley Online Library |