Robotic Emotional Expression Generation Based on Mood Transition and Personality Model

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, 2013-08, Vol. 43 (4), pp. 1290-1303
Main authors: Han, Meng-Ju; Lin, Chia-How; Song, Kai-Tai
Format: Article
Language: English

Description:
This paper presents a method for designing the mood transitions of a robot for autonomous emotional interaction with humans. A two-dimensional emotional model is proposed that combines robot emotion, mood, and personality in order to generate emotional expressions. In this design, the robot's personality is programmed by adjusting the factors of the five-factor model proposed by psychologists; from the Big Five personality traits, the influence factors of robot mood transition are determined. Furthermore, a method for fusing basic robotic emotional behaviors is proposed so that robotic emotional states can be manifested through continuous facial expressions. An artificial face on a screen gives the robot a humanlike appearance, which may be useful for human-robot interaction. An artificial face simulator has been implemented to show the effectiveness of the proposed methods, and questionnaire surveys were carried out to evaluate them by observing robotic responses to a user's emotional expressions. Preliminary experimental results on a robotic head show that the proposed mood-state transition scheme responds appropriately to a user's emotional changes in a continuous manner.
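
The abstract describes the processing loop only at a high level: a recognized user expression is mapped into a two-dimensional emotional space, the robot's mood point moves through that space at a rate shaped by its Big Five personality settings, and the current mood is rendered by blending basic facial expressions. The sketch below shows one way such a loop could be structured; the anchor coordinates, the trait-to-parameter mapping, and the distance-based fusion rule are illustrative assumptions, not the authors' published model.

```python
# Minimal sketch (not the authors' implementation): a 2-D mood state that drifts
# toward incoming emotional stimuli, with Big Five traits scaling how strongly
# the mood moves, and a fusion step that blends basic expressions by their
# distance to the current mood point.
import math

BASIC_EMOTIONS = {            # assumed anchor points in a 2-D (valence, arousal) plane
    "happy":     ( 0.8,  0.5),
    "surprised": ( 0.3,  0.9),
    "angry":     (-0.7,  0.7),
    "sad":       (-0.8, -0.4),
    "neutral":   ( 0.0,  0.0),
}

class MoodModel:
    def __init__(self, big_five):
        # big_five: openness, conscientiousness, extraversion, agreeableness,
        # neuroticism, each in [0, 1]; the mapping below is illustrative only.
        self.gain = 0.2 + 0.6 * big_five["extraversion"]          # responsiveness to stimuli
        self.negativity = 0.5 + 0.5 * big_five["neuroticism"]     # extra weight on negative valence
        self.inertia = 0.5 + 0.4 * big_five["conscientiousness"]  # how slowly mood changes
        self.mood = [0.0, 0.0]                                     # current (valence, arousal)

    def update(self, stimulus):
        """Move the mood point toward a perceived (valence, arousal) stimulus."""
        v, a = stimulus
        if v < 0:
            v *= self.negativity
        step = self.gain * (1.0 - self.inertia)
        self.mood[0] += step * (v - self.mood[0])
        self.mood[1] += step * (a - self.mood[1])
        return tuple(self.mood)

    def expression_weights(self):
        """Fuse basic expressions: anchors closer to the mood get larger blend weights."""
        weights = {}
        for name, (v, a) in BASIC_EMOTIONS.items():
            d = math.hypot(self.mood[0] - v, self.mood[1] - a)
            weights[name] = 1.0 / (d + 1e-3)
        total = sum(weights.values())
        return {name: w / total for name, w in weights.items()}

# Example: an extraverted, low-neuroticism personality reacting to a smiling user.
robot = MoodModel({"openness": 0.6, "conscientiousness": 0.5,
                   "extraversion": 0.9, "agreeableness": 0.7, "neuroticism": 0.2})
robot.update((0.8, 0.5))              # user expression mapped to (valence, arousal)
print(robot.expression_weights())     # blended facial-expression weights
```
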
DOI: 10.1109/TSMCB.2012.2228851
ISSN: 2168-2267
EISSN: 2168-2275
Source: IEEE Electronic Library (IEL)
Subjects:
Affect - physiology
Cybernetics
Cybernetics - methods
Emotion recognition
Emotional model
Emotions
Face
Facial Expression
facial expression generation
facial expression recognition
Fuses
Humans
Mathematical models
Models, Biological
Mood
Moods
Personality
Personality - physiology
Prototypes
Robot kinematics
robotic behavior fusion
robotic emotional interactions
robotic mood state transition
Robotics
Robotics - methods
Robots
Studies