Combining human guidance and structured task execution during physical human–robot collaboration
In this work, we consider a scenario in which a human operator physically interacts with a collaborative robot (CoBot) to perform shared and structured tasks. We assume that collaborative operations are formulated as hierarchical task networks to be interactively executed by exploiting the human's physical guidance. In this scenario, the human interventions are continuously interpreted by the robotic system in order to infer whether the human guidance is aligned with the planned activities. The interpreted human interventions are also exploited by the robotic system to adapt its cooperative behavior online during the execution of the shared plan. Depending on the estimated operator intentions, the robotic system can adjust tasks or motions while regulating the robot's compliance with respect to the co-worker's physical guidance. We describe the overall framework, illustrating the architecture and its components. The proposed approach is demonstrated in a testing scenario in which a human operator interacts with a Kuka LBR iiwa manipulator to perform a collaborative task. The collected results show the effectiveness of the proposed approach.
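The abstract describes a loop in which the robot interprets the operator's physical guidance, checks whether it agrees with the currently planned activity, and regulates its compliance and task adaptation accordingly. The following is a minimal, hypothetical sketch of such a loop, not the authors' implementation: the class name, the cosine-alignment test, and all numeric thresholds are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration of the interaction loop described in the abstract:
# interpret the operator's applied force, check alignment with the planned
# motion, and regulate a scalar compliance gain while adapting the task queue.
# All names and thresholds are assumptions, not the paper's actual framework.

class CollaborativeExecutive:
    def __init__(self, planned_tasks):
        # planned_tasks: list of (task_name, desired_direction) pairs, where
        # desired_direction is a unit vector for the planned end-effector motion.
        self.planned_tasks = list(planned_tasks)
        self.compliance_gain = 0.5  # nominal admittance gain (illustrative)

    def alignment(self, human_force, desired_direction):
        """Cosine similarity between the operator's force and the planned
        motion direction (1 = fully aligned, -1 = opposed)."""
        f = np.asarray(human_force, dtype=float)
        d = np.asarray(desired_direction, dtype=float)
        if np.linalg.norm(f) < 1e-6:
            return 1.0  # no human intervention: treat as aligned
        return float(np.dot(f, d) / (np.linalg.norm(f) * np.linalg.norm(d)))

    def step(self, human_force):
        """One cycle: interpret guidance, regulate compliance, adapt the task."""
        if not self.planned_tasks:
            return "done", self.compliance_gain
        task, direction = self.planned_tasks[0]
        a = self.alignment(human_force, direction)
        if a > 0.7:
            # Guidance supports the plan: stay compliant and keep the task.
            self.compliance_gain = min(1.0, self.compliance_gain + 0.1)
            action = f"continue:{task}"
        elif a < -0.3:
            # Guidance clearly opposes the plan: become fully compliant and
            # let the inferred intention drive a task/motion adaptation.
            self.compliance_gain = 1.0
            self.planned_tasks.append(self.planned_tasks.pop(0))  # defer task
            action = f"adapt:{task}"
        else:
            # Ambiguous guidance: stiffen slightly and keep following the plan.
            self.compliance_gain = max(0.2, self.compliance_gain - 0.1)
            action = f"hold:{task}"
        return action, self.compliance_gain


if __name__ == "__main__":
    executive = CollaborativeExecutive([("reach_part", [1.0, 0.0, 0.0]),
                                        ("place_part", [0.0, 1.0, 0.0])])
    # Simulated operator forces: first aligned with, then opposing, the plan.
    for force in ([5.0, 0.5, 0.0], [-4.0, 0.0, 0.0]):
        print(executive.step(force))
```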
Published in: | Journal of intelligent manufacturing, 2023-10, Vol. 34 (7), p. 3053-3067 |
---|---|
Main authors: | Cacace, Jonathan; Caccavale, Riccardo; Finzi, Alberto; Grieco, Riccardo |
Format: | Article |
Language: | English |
Subjects: | Business and Management; Collaboration; Control; Machines; Manufacturing; Mechatronics; Processes; Production; Robotics; Robots |
Online access: | Full text |
DOI: | 10.1007/s10845-022-01989-y |
Publisher: | Springer US |
ISSN: | 0956-5515 |
EISSN: | 1572-8145 |
Source: | SpringerLink Journals - AutoHoldings |