Understanding nonverbal communication cues of human personality traits in human-robot interaction

Bibliographic details
Published in: IEEE/CAA journal of automatica sinica, 2020-11, Vol. 7 (6), p. 1465-1477
Main authors: Shen, Zhihao; Elibol, Armagan; Chong, Nak Young
Format: Article
Language: eng
Subjects:
Online access: Order full text
container_end_page 1477
container_issue 6
container_start_page 1465
container_title IEEE/CAA journal of automatica sinica
container_volume 7
creator Shen, Zhihao
Elibol, Armagan
Chong, Nak Young
description With the increasing presence of robots in our daily life, there is a strong need for strategies that achieve high-quality interaction between robots and users by enabling robots to understand users' mood, intention, and other states. During human-human interaction, personality traits have an important influence on human behavior, decisions, mood, and many other factors. Therefore, we propose an efficient computational framework that endows the robot with the capability of understanding the user's personality traits based on the user's nonverbal communication cues, represented by three visual features (head motion, gaze, and body motion energy) and three vocal features (voice pitch, voice energy, and mel-frequency cepstral coefficients (MFCC)). In this study, we used the Pepper robot as a communication robot that interacts with each participant by asking questions while extracting the nonverbal features from the participant's habitual behavior using its on-board sensors. Each participant's personality traits are evaluated with a questionnaire. We then train ridge regression and linear support vector machine (SVM) classifiers using the nonverbal features and the personality trait labels from the questionnaire and evaluate the classifiers' performance. We verified the validity of the proposed models, which showed promising binary classification performance in recognizing each of the Big Five personality traits of the participants based on individual differences in nonverbal communication cues.
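The abstract gives no implementation details, so the following is only a minimal sketch of the kind of pipeline it describes: per-participant nonverbal feature vectors fed to ridge and linear SVM binary classifiers for one Big Five trait. It assumes scikit-learn and uses random placeholder data; the feature layout, sample size, and hyperparameters are illustrative assumptions, not the authors' code or dataset.

```python
# Hedged sketch (not the authors' implementation): binary classification of one
# Big Five trait from aggregated nonverbal cues, using scikit-learn.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import RidgeClassifier
from sklearn.svm import LinearSVC

# Hypothetical data layout: one row per participant, one column per aggregated
# cue (head motion, gaze, body motion energy, voice pitch, voice energy, MFCC
# statistic). Random placeholders stand in for real sensor-derived features.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))        # 40 participants x 6 nonverbal features
y = rng.integers(0, 2, size=40)     # high/low questionnaire label for one trait

for name, clf in [("ridge", RidgeClassifier(alpha=1.0)),
                  ("linear SVM", LinearSVC(C=1.0, max_iter=10000))]:
    # Scaling inside the pipeline keeps normalization statistics within each
    # cross-validation training fold, avoiding leakage into the test fold.
    model = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(model, X, y, cv=5)  # per-fold accuracy
    print(f"{name}: mean CV accuracy = {scores.mean():.2f}")
```

In practice one classifier of this kind would be trained per trait (openness, conscientiousness, extraversion, agreeableness, neuroticism), with the questionnaire scores binarized into high/low labels.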
doi_str_mv 10.1109/JAS.2020.1003201
format Article
publisher Piscataway: Chinese Association of Automation (CAA)
coden IJASJC
rights Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020; Copyright © Wanfang Data Co. Ltd. All Rights Reserved.
linktohtml https://ieeexplore.ieee.org/document/9106874
fulltext fulltext_linktorsrc
identifier ISSN: 2329-9266
ispartof IEEE/CAA journal of automatica sinica, 2020-11, Vol.7 (6), p.1465-1477
issn 2329-9266
2329-9274
language eng
recordid cdi_wanfang_journals_zdhxb_ywb202006001
source IEEE Electronic Library (IEL)
subjects Cameras
Classifiers
Communication
Feature extraction
Head movement
Human behavior
Human engineering
Human-robot interaction
Performance evaluation
Personality
Personality traits
Questionnaires
Robot kinematics
Robot sensing systems
Robots
Support vector machines
Synchronization
Voice communication
title Understanding nonverbal communication cues of human personality traits in human-robot interaction
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T22%3A37%3A02IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-wanfang_jour_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Understanding%20nonverbal%20communication%20cues%20of%20human%20personality%20traits%20in%20human-robot%20interaction&rft.jtitle=IEEE/CAA%20journal%20of%20automatica%20sinica&rft.au=Shen,%20Zhihao&rft.date=2020-11-01&rft.volume=7&rft.issue=6&rft.spage=1465&rft.epage=1477&rft.pages=1465-1477&rft.issn=2329-9266&rft.eissn=2329-9274&rft.coden=IJASJC&rft_id=info:doi/10.1109/JAS.2020.1003201&rft_dat=%3Cwanfang_jour_RIE%3Ezdhxb_ywb202006001%3C/wanfang_jour_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2453816854&rft_id=info:pmid/&rft_ieee_id=9106874&rft_wanfj_id=zdhxb_ywb202006001&rfr_iscdi=true