Situated Live Programming for Human-Robot Collaboration

We present situated live programming for human-robot collaboration, an approach that enables users with limited programming experience to program collaborative applications for human-robot interaction. Allowing end users, such as shop floor workers, to program collaborative robots themselves would make it easy to "retask" robots from one process to another, facilitating their adoption by small and medium enterprises.

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2021-08
Main Authors: Senft, Emmanuel; Hagenow, Michael; Radwin, Robert; Zinn, Michael; Gleicher, Michael; Mutlu, Bilge
Format: Article
Language: eng
Subjects:
Online Access: Full text
container_title arXiv.org
creator Senft, Emmanuel
Hagenow, Michael
Radwin, Robert
Zinn, Michael
Gleicher, Michael
Mutlu, Bilge
description We present situated live programming for human-robot collaboration, an approach that enables users with limited programming experience to program collaborative applications for human-robot interaction. Allowing end users, such as shop floor workers, to program collaborative robots themselves would make it easy to "retask" robots from one process to another, facilitating their adoption by small and medium enterprises. Our approach builds on the paradigm of trigger-action programming (TAP) by allowing end users to create rich interactions through simple trigger-action pairings. It enables end users to iteratively create, edit, and refine a reactive robot program while executing partial programs. This live programming approach enables the user to utilize the task space and objects by incrementally specifying situated trigger-action pairs, substantially lowering the barrier to entry for programming or reprogramming robots for collaboration. We instantiate situated live programming in an authoring system where users can create trigger-action programs by annotating an augmented video feed from the robot's perspective and assign robot actions to trigger conditions. We evaluated this system in a study where participants (n = 10) developed robot programs for solving collaborative light-manufacturing tasks. Results showed that users with little programming experience were able to program HRC tasks in an interactive fashion and our situated live programming approach further supported individualized strategies and workflows. We conclude by discussing opportunities and limitations of the proposed approach, our system implementation, and our study and discuss a roadmap for expanding this approach to a broader range of tasks and applications.
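The trigger-action programming (TAP) paradigm described above can be sketched in a few lines: each rule pairs a trigger condition over the sensed task state with a robot action, and a reactive loop fires the actions whose triggers currently hold. This is a minimal, hypothetical illustration of the general paradigm; the names and the `step` function are assumptions for exposition, not the authors' actual system or API.

```python
# Minimal, hypothetical sketch of trigger-action programming (TAP).
# Each Rule pairs a trigger predicate over the observed task state
# with a robot action; one reactive cycle fires every rule whose
# trigger holds. All names here are illustrative only.

from dataclasses import dataclass
from typing import Callable, Dict, List

State = Dict[str, object]  # e.g. detected objects and annotated regions


@dataclass
class Rule:
    trigger: Callable[[State], bool]  # condition on the sensed scene
    action: Callable[[], str]         # robot action to execute


def step(rules: List[Rule], state: State) -> List[str]:
    """Run one reactive cycle: execute every rule whose trigger holds."""
    return [rule.action() for rule in rules if rule.trigger(state)]


# Example rule: when a part appears in the annotated "input" region,
# pick it and place it in the assembly area.
rules = [
    Rule(trigger=lambda s: bool(s.get("part_in_input", False)),
         action=lambda: "pick(part); place(assembly_area)")
]

print(step(rules, {"part_in_input": True}))
```

In the live-programming workflow the paper describes, such rules would be created incrementally while partial programs execute, so the rule list grows and is refined as the user annotates the scene.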
doi_str_mv 10.48550/arxiv.2108.03592
format Article
identifier EISSN: 2331-8422
ispartof arXiv.org, 2021-08
issn 2331-8422
language eng
recordid cdi_arxiv_primary_2108_03592
source arXiv.org; Free E-Journals
subjects Applications programs
Collaboration
Computer Science - Human-Computer Interaction
Computer Science - Robotics
End users
Human engineering
Programming
Robots
Small & medium sized enterprises-SME
Small business
Task space
title Situated Live Programming for Human-Robot Collaboration