Artificial Partners to Understand Joint Action: Representing Others to Develop Effective Coordination
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2022-01, Vol. 30, pp. 1473-1482
Main authors: , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: In recent years, artificial partners have been proposed as tools to study joint action, as they allow joint behaviors to be studied under more controlled experimental conditions. Here we present an artificial partner architecture that is capable of integrating all the available information about its human counterpart and of developing efficient and natural forms of coordination. The model uses an extended state observer which combines prior information, motor commands, and sensory observations to infer the partner's ongoing actions (partner model). Over trials, these estimates are gradually incorporated into action selection. Using a joint planar task in which the partners are required to perform reaching movements while mechanically coupled, we demonstrate that the artificial partner develops an internal representation of its human counterpart, whose accuracy depends on the degree of mechanical coupling and on the reliability of the sensory information. We also show that human-artificial dyads develop coordination strategies that closely resemble those observed in human-human dyads and can be interpreted as Nash equilibria. The proposed approach may provide insight into the mechanisms underlying human-human interaction. Further, it may inform the development of novel neuro-rehabilitative solutions and more efficient human-machine interfaces.
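The extended state observer at the core of the architecture can be pictured as a Kalman-style filter whose state vector is augmented with an estimate of the partner's ongoing action. The sketch below is a minimal illustration of that idea only, not the paper's implementation: the linear point-mass dynamics, the random-walk model of the hidden partner force, the noise covariances, and all variable names are assumptions introduced for this example.

```python
import numpy as np

# State: [position, velocity, partner_force]. The observer augments the
# plant's own dynamics with an extra state estimating the partner's action.
dt = 0.01          # control time step in seconds (assumed)
m = 1.0            # effective mass in kg (assumed)

A = np.array([[1.0, dt,  0.0],
              [0.0, 1.0, dt / m],   # partner force accelerates the shared plant
              [0.0, 0.0, 1.0]])     # partner force modeled as a random walk
B = np.array([[0.0], [dt / m], [0.0]])   # own motor command enters as a force
H = np.array([[1.0, 0.0, 0.0]])          # only position is sensed

Q = np.diag([1e-6, 1e-6, 1e-2])   # process noise; large on the partner-force state
R = np.array([[1e-4]])            # sensor noise (sets sensory reliability)

x = np.zeros((3, 1))              # state estimate
P = np.eye(3)                     # estimate covariance (prior information)

def observer_step(x, P, u, y):
    """One predict/correct cycle: fuse prior information, the own motor
    command u (efference copy), and the sensory observation y."""
    # Predict using the known motor command.
    x = A @ x + B * u
    P = A @ P @ A.T + Q
    # Correct with the sensory observation.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (y - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Example: a constant hidden partner force of 0.5 N is recovered over trials.
rng = np.random.default_rng(0)
true_x = np.zeros((3, 1))
true_x[2, 0] = 0.5
for k in range(500):
    u = 0.2 * np.sin(0.02 * k)           # own motor command
    true_x = A @ true_x + B * u
    y = H @ true_x + rng.normal(0.0, 1e-2)
    x, P = observer_step(x, P, u, y)
print(f"estimated partner force: {x[2, 0]:.3f} N")
```

Running the loop drives the estimate toward the hidden 0.5 N force, illustrating how efference copies and noisy sensing can be fused into a partner model; shrinking R (more reliable sensing) speeds convergence, consistent with the abstract's point that partner-model accuracy depends on the reliability of the sensory information.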
ISSN: 1534-4320, 1558-0210
DOI: 10.1109/TNSRE.2022.3176378