A human-centered approach to robot gesture based communication within collaborative working processes

The increasing ability of industrial robots to perform complex tasks in collaboration with humans requires more capable ways of communication and interaction. Traditional systems use separate interfaces such as touchscreens or control panels to operate the robot or to communicate its state and prospective actions to the user. Transferring human communication, such as gestures, to technical non-humanoid robots creates various opportunities for more intuitive human-robot interaction. Interaction should no longer require a separate interface such as a control panel; instead, it should take place directly between human and robot. To explore intuitive interaction, we identified gestures relevant to co-working tasks from observations of humans. Based on a decomposition approach, we transferred these gestures to robotic systems of increasing abstraction and experimentally evaluated how well humans recognize them. We created a human-robot interaction use case for the task of handling a dangerous liquid. Results indicate that several gestures are well perceived when displayed with context information regarding the task.

Bibliographic details
Authors: Ende, T., Haddadin, S., Parusel, S., Wusthoff, T., Hassenzahl, M., Albu-Schaffer, Alin
Format: Conference Proceeding
Language: English
DOI: 10.1109/IROS.2011.6094592
Publisher: IEEE
ISBN: 1612844545, 9781612844541
e-ISBN: 9781612844558, 1612844553, 9781612844565, 1612844561
Published in: 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, September 2011, pp. 3367-3374
ISSN: 2153-0858 (print), 2153-0866 (electronic)
Source: IEEE Electronic Library (IEL) Conference Proceedings
Subjects: Assembly, Humans, Robot sensing systems, Safety, Service robots, Speech