Artificial Partners to Understand Joint Action: Representing Others to Develop Effective Coordination


Bibliographic details

Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2022-01, Vol. 30, pp. 1473-1482
Authors: De Vicariis, Cecilia; Pusceddu, Giulia; Chackochan, Vinil T.; Sanguineti, Vittorio
Format: Article
Language: English
Online access: Full text

Abstract

In recent years, artificial partners have been proposed as tools for studying joint action, as they make it possible to address joint behaviors under more controlled experimental conditions. Here we present an artificial partner architecture that is capable of integrating all the available information about its human counterpart and of developing efficient and natural forms of coordination. The model uses an extended state observer that combines prior information, motor commands, and sensory observations to infer the partner's ongoing actions (partner model). Over trials, these estimates are gradually incorporated into action selection. Using a joint planar task in which the partners are required to perform reaching movements while mechanically coupled, we demonstrate that the artificial partner develops an internal representation of its human counterpart, whose accuracy depends on the degree of mechanical coupling and on the reliability of the sensory information. We also show that human-artificial dyads develop coordination strategies that closely resemble those observed in human-human dyads and can be interpreted as Nash equilibria. The proposed approach may provide insights into the mechanisms underlying human-human interaction. Further, it may inform the development of novel neuro-rehabilitative solutions and more efficient human-machine interfaces.
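
The abstract describes the architecture only at a high level. The Python sketch below is purely illustrative and is not taken from the paper: it assumes a fixed 2-D partner target, Gaussian observation noise, a linear trial-by-trial blending schedule, and invented names such as observer_update and select_action. It shows the general idea of fusing a prior with noisy observations of the partner to form a partner model, and of gradually folding that estimate into action selection. The paper's extended state observer also uses the agent's own motor commands and the mechanical coupling to separate its own contribution from the sensed interaction, which this sketch omits.

```python
# Illustrative sketch only (not the authors' implementation): a Kalman-style
# observer estimates the partner's ongoing action from noisy observations, and
# an action-selection rule blends that estimate in more strongly over trials.
import numpy as np

rng = np.random.default_rng(0)

# Assumed "ground truth" for the example: the human partner reaches toward a fixed 2-D target.
partner_target = np.array([0.10, 0.05])   # meters (assumption)

est = np.zeros(2)          # prior mean: no information about the partner yet
var = np.ones(2) * 1e-2    # prior variance (assumption)
obs_noise_var = 4e-4       # observation noise variance (assumption)

def observer_update(est, var, observation):
    """Per-axis Kalman-style measurement update: fuse the current partner
    estimate with a noisy observation of where the partner seems to be heading."""
    gain = var / (var + obs_noise_var)       # larger gain when the estimate is uncertain
    est = est + gain * (observation - est)   # correct the estimate toward the observation
    var = (1.0 - gain) * var                 # uncertainty shrinks after each fusion
    return est, var

def select_action(own_target, partner_est, trial, n_trials):
    """Blend the agent's own goal with the partner estimate; the weight on the
    partner model grows linearly over trials (illustrative schedule)."""
    w = trial / (n_trials - 1)               # 0 on the first trial, 1 on the last
    return (1.0 - w) * own_target + w * 0.5 * (own_target + partner_est)

own_target = np.array([-0.10, 0.05])
n_trials = 20
for trial in range(n_trials):
    # Noisy observation of the partner's action, e.g. as it might be inferred
    # from the interaction forces (here generated synthetically).
    observation = partner_target + rng.normal(scale=np.sqrt(obs_noise_var), size=2)
    est, var = observer_update(est, var, observation)
    action = select_action(own_target, est, trial, n_trials)

print("estimated partner target:", np.round(est, 3))
print("final planned action    :", np.round(action, 3))
```

Run as-is, the estimate converges to the assumed partner target and the planned action shifts from the agent's own goal toward a compromise between the two, mirroring the gradual incorporation of the partner model into action selection described above.
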
DOI: 10.1109/TNSRE.2022.3176378
ISSN: 1534-4320
EISSN: 1558-0210
PMID: 35584067
Source: DOAJ Directory of Open Access Journals; Elektronische Zeitschriftenbibliothek (freely accessible e-journals)
Subjects: Brain modeling; Computational modeling; Computer architecture; Coordination; Game theory; Haptic interfaces; Human–robot interaction; Interfaces; Joint action; Man-machine interfaces; Mechanical properties; Motor task performance; Observers; Partner model; Robot sensing systems; State observers; Task analysis