Conveying Emotions Through Device-Initiated Touch

Humans have the ability to convey an array of emotions through complex and rich touch gestures. However, it is not clear how these touch gestures can be reproduced through interactive systems and devices in a remote mediated communication context. In this article, we explore the design space of device-initiated touch for conveying emotions with an interactive system reproducing a collection of human touch characteristics. For this purpose, we control a robotic arm to touch the forearm of participants with different force, velocity and amplitude characteristics to simulate human touch. With a view to adding touch as an emotional modality in human-machine interaction, we conducted two studies. After designing the touch device, we explored touch in a context-free setup and then in a controlled context defined by textual scenarios and the emotional facial expressions of a virtual agent. Our results suggest that certain combinations of touch characteristics are associated with the perception of different degrees of valence and arousal. Moreover, when non-congruent mixed signals (touch, facial expression, textual scenario) do not a priori convey the same emotion, the message conveyed by touch seems to prevail over those displayed by the visual and textual signals.
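The abstract describes varying the force, velocity and amplitude of a robotic touch to produce a set of touch stimuli. As a purely illustrative sketch (the paper's actual factor levels and values are not given in this record, so the level names below are placeholders), crossing the levels of these three characteristics to enumerate a stimulus set could look like this:

```python
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class TouchStimulus:
    """One device-initiated touch, parameterized by the three
    characteristics the study varied (placeholder level names)."""
    force: str      # e.g. "light" vs. "strong"
    velocity: str   # e.g. "slow" vs. "fast"
    amplitude: str  # e.g. "short" vs. "long" stroke

# Hypothetical two levels per characteristic; the real study's
# levels are not listed in this record.
LEVELS = {
    "force": ["light", "strong"],
    "velocity": ["slow", "fast"],
    "amplitude": ["short", "long"],
}

# Full factorial crossing: every combination of the three factors.
stimuli = [TouchStimulus(f, v, a) for f, v, a in product(*LEVELS.values())]
```

With two levels per factor, this yields 2 × 2 × 2 = 8 stimuli; how each combination maps to perceived valence and arousal is exactly what the two studies measure.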

Bibliographic details
Published in: IEEE Transactions on Affective Computing, July 2022, Vol. 13, No. 3, pp. 1477-1488
Authors: Teyssier, Marc; Bailly, Gilles; Pelachaud, Catherine; Lecolinet, Eric
Format: Article
Language: English
DOI: 10.1109/TAFFC.2020.3008693
Publisher: IEEE (Piscataway)
Rights: © The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022. Distributed under a Creative Commons Attribution 4.0 International License.
ISSN: 1949-3045
EISSN: 1949-3045
Source: IEEE Electronic Library (IEL)
Subjects:
Actuators
Aerospace electronics
Arousal
Computer Science
Context
Conveying
embodied conversational agent
Emotions
facial expression
Force
Human-Computer Interaction
Interactive systems
mediated communication
multimodal communication
Performance evaluation
Robot arms
Robot control
robot touch
Robots
Skin
social touch
social touch technologies
Touch
Visual signals
Visualization