Environment Recognition Based on Analysis of Human Actions for Mobile Robot

In this paper, we propose a novel method for a mobile robot to recognize its environment based on the relationship between human actions and objects. Most previous work on environment recognition for robots has focused on generating obstacle maps for path planning, and model-based object recognition techniques have been used to search for particular objects. In practice, however, it is difficult to prepare enough models in advance to recognize the wide variety of objects found in unknown environments. Humans, on the other hand, can often recognize objects not from their appearance but by watching another person act on them, because the function and usage of an object are closely related to the actions humans take on it. In our previous work, we introduced conceptual models of human actions and objects for classifying objects by observing human activities. In this paper, we apply this key idea to a mobile robot and demonstrate that the arrangement of objects can be recognized by analyzing human actions.
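The abstract's key idea, classifying an object by the action a person takes on it rather than by its appearance, and reading the objects' arrangement off the positions where those actions occur, can be sketched in a few lines of Python. The sketch below is a minimal illustration under assumed inputs (a recognized action label and its map position); the action names, the ActionObservation type, and the lookup table are hypothetical stand-ins, not the authors' conceptual models.

from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical lookup from a recognized human action to the object
# category it implies. The paper's conceptual models of actions and
# objects are richer than this; the table is only an illustration.
ACTION_TO_OBJECT = {
    "sit_down":   "chair",
    "drink_from": "cup",
    "type_on":    "keyboard",
    "open":       "door",
}

@dataclass
class ActionObservation:
    action: str                    # recognized action label (assumed given)
    position: Tuple[float, float]  # where the action occurred, in map coordinates

def infer_object_map(observations: List[ActionObservation]) -> List[Tuple[str, Tuple[float, float]]]:
    """Infer object categories and their arrangement from observed actions.

    The action tells us *what* the object probably is (its function/usage),
    and the action's position tells us *where* it is arranged.
    """
    object_map = []
    for obs in observations:
        category: Optional[str] = ACTION_TO_OBJECT.get(obs.action)
        if category is not None:
            object_map.append((category, obs.position))
    return object_map

# Example: two observed actions imply a chair and a cup, plus their layout.
observations = [
    ActionObservation("sit_down", (1.0, 2.5)),
    ActionObservation("drink_from", (1.2, 2.4)),
]
print(infer_object_map(observations))
# [('chair', (1.0, 2.5)), ('cup', (1.2, 2.4))]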

Bibliographic Details
Main Authors: Mitani, M.; Takaya, M.; Kojima, A.; Fukunaga, K.
Format: Conference Proceeding
Language: English
DOI: 10.1109/ICPR.2006.496
Published in: 18th International Conference on Pattern Recognition (ICPR'06), 2006, Vol. 4, pp. 782-786
ISSN: 1051-4651
EISSN: 2831-7475
ISBN: 0769525210; 9780769525211
Source: IEEE Electronic Library (IEL) Conference Proceedings
Subjects: Cameras; Humans; Kinetic theory; Layout; Mobile robots; Object recognition; Path planning; Pattern recognition; Robot vision systems; Shape
URL: https://ieeexplore.ieee.org/document/1699957