Faces of Emotion: Investigating Emotional Facial Expressions Towards a Robot

Emotions have always been an intriguing topic in everyday life as well as in science. As robots begin to move from industry halls into our private homes, emotions have become a vital theme for the field of human–robot interaction. Since Darwin, research has suggested that facial expressions are associated with emotions, and facial expressions could provide an ideal tool for natural, social human–robot interaction. Despite a growing body of research on the implementation of emotions in robots (mostly based on facial expressions), systematic research on users’ emotions and facial expressions towards robots remains largely neglected (cf. Arkin and Moshkina in Calvo R, D’Mello S, Gratch J, Kappas A (eds) The Oxford handbook of affective computing. Oxford University Press, New York, pp 483–493, 2015, on challenges in effective testing in affective human–robot interaction). We experimentally investigated the multilevel phenomenon of emotions using a multi-method approach. Since self-reports of emotions are prone to biases such as social desirability, we supplemented them with an objective behavioral measurement: using the Facial Action Coding System, we analyzed the facial expressions of 62 participants who watched the entertainment robot dinosaur Pleo either in a friendly interaction or being tortured. Participants differed in the type and frequency of Action Units displayed, as well as in their self-reported feelings, depending on the type of treatment they had watched (friendly or torture). In line with a previous study by Rosenthal-von der Pütten et al. (Int J Soc Robot 5(1):17–34, 2013. https://doi.org/10.1007/s12369-012-0173-8), participants reported feeling more positive after the friendly video and more negative after the torture video. In the torture condition, participants furthermore showed a wide range of Action Units primarily associated with negative emotions. For example, Action Unit 4 (“Brow Lowerer”), which is common in negative emotions such as anger and sadness, was displayed more frequently in the torture condition than in the friendly condition. Action Unit 12 (“Lip Corner Puller”), however, an Action Unit commonly associated with joy, was present in both conditions and is thus not necessarily predictive of positive emotions. The findings underline the importance of a thorough investigation of the variables of emotional facial expressions. By investigating the Action Units participants display in an emotional situation, we aim to provide information on spontaneous facial expressions towards a robot that could also serve as guidance for automatic approaches.

Bibliographic details

Published in: International journal of social robotics, 2018-04, Vol.10 (2), p.199-209
Main authors: Menne, Isabelle M.; Schwab, Frank
Format: Article
Language: English
Subjects: Affective computing; Control; Emotions; Engineering; Investigations; Mechatronics; Robotics; Robots; Torture
Online access: Full text
doi 10.1007/s12369-017-0447-2
format Article
publisher Dordrecht: Springer Netherlands
rights Springer Science+Business Media B.V., part of Springer Nature 2017
orcid https://orcid.org/0000-0002-4346-1726
identifier ISSN: 1875-4791
ispartof International journal of social robotics, 2018-04, Vol.10 (2), p.199-209
issn 1875-4791
eissn 1875-4805
language eng
recordid cdi_proquest_journals_2421250823
source SpringerLink Journals
subjects Affective computing
Control
Emotions
Engineering
Investigations
Mechatronics
Robotics
Robots
Torture
title Faces of Emotion: Investigating Emotional Facial Expressions Towards a Robot
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-29T03%3A40%3A48IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Faces%20of%20Emotion:%20Investigating%20Emotional%20Facial%20Expressions%20Towards%20a%20Robot&rft.jtitle=International%20journal%20of%20social%20robotics&rft.au=Menne,%20Isabelle%20M.&rft.date=2018-04-01&rft.volume=10&rft.issue=2&rft.spage=199&rft.epage=209&rft.pages=199-209&rft.issn=1875-4791&rft.eissn=1875-4805&rft_id=info:doi/10.1007/s12369-017-0447-2&rft_dat=%3Cproquest_cross%3E2421250823%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2421250823&rft_id=info:pmid/&rfr_iscdi=true