Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study

Research in psychology has shown that the way a person walks reflects that person's current mood (or emotional state). Recent studies have used mobile phones to detect emotional states from movement data. The objective of our study was to investigate the use of movement sensor data from a smart watch to infer an individual's emotional state. We present findings from a user study with 50 participants. The experiment followed a mixed design: within-subjects (emotions: happy, sad, and neutral) and between-subjects (stimulus type: audiovisual "movie clips" and audio "music clips"). Each participant experienced both emotions in a single stimulus type.

All participants walked 250 m while wearing a smart watch on one wrist and a heart rate monitor strap on the chest. They also answered a short questionnaire (20 items; Positive Affect and Negative Affect Schedule, PANAS) before and after experiencing each emotion. The data obtained from the heart rate monitor served as supplementary information. We performed time series analysis on the smart watch data and a t test on the questionnaire items to measure the change in emotional state; heart rate data were analyzed using one-way analysis of variance. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that determined an individual's emotion.

Overall, 50 young adults participated in the study; of them, 49 were included in the affective PANAS analysis and 44 in the feature extraction and building of personal models. Participants reported feeling less negative affect after watching sad videos or listening to sad music (P<.006). For the task of emotion recognition using classifiers, personal models outperformed personal baselines and achieved median accuracies above 78% for all conditions of the study design in the binary classification of happiness versus sadness. Our findings show that changes in emotional state, as well as in behavioral responses, can be detected from data obtained with the smart watch. Together with the high accuracies achieved across all users for classifying happy versus sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
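
The classification pipeline summarized above can be sketched in a few lines of Python. The example below is a minimal illustration, not the authors' code: it segments wrist accelerometer readings into sliding windows, computes simple per-axis summary statistics, and cross-validates a per-participant ("personal") classifier for happy versus sad walking. The window length, step size, feature set, and random-forest classifier are assumptions made for illustration, since the abstract does not specify them.

```python
# Illustrative sketch of the pipeline described in the abstract: sliding-window
# feature extraction from wrist accelerometer time series, followed by a
# per-participant ("personal") classifier for happy vs. sad walking segments.
# Window length, step size, the feature set, and the random-forest classifier
# are assumptions; the abstract does not give the paper's exact choices.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def sliding_windows(signal, window=128, step=64):
    """Yield fixed-length windows (rows = samples, cols = x/y/z axes)."""
    for start in range(0, len(signal) - window + 1, step):
        yield signal[start:start + window]


def window_features(win):
    """Simple per-axis summary statistics for one window."""
    feats = []
    for axis in range(win.shape[1]):
        x = win[:, axis]
        feats += [x.mean(), x.std(), x.min(), x.max(),
                  np.percentile(x, 25), np.percentile(x, 75)]
    return feats


def personal_model_accuracy(happy_walk, sad_walk):
    """Train and cross-validate one participant's happy-vs-sad classifier.

    happy_walk, sad_walk: arrays of shape (n_samples, 3) with accelerometer
    x/y/z readings recorded while the participant walked after each stimulus.
    """
    X, y = [], []
    for label, signal in ((1, happy_walk), (0, sad_walk)):
        for win in sliding_windows(signal):
            X.append(window_features(win))
            y.append(label)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, np.array(X), np.array(y), cv=5).mean()


# Example with synthetic data standing in for one participant's recordings.
rng = np.random.default_rng(0)
happy = rng.normal(0.0, 1.0, size=(2000, 3))
sad = rng.normal(0.2, 0.8, size=(2000, 3))
print(f"cross-validated accuracy: {personal_model_accuracy(happy, sad):.2f}")
```

Training one model per participant sidesteps differences in gait between people, which matches the paper's observation that personal models outperform personal baselines.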

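The questionnaire and heart-rate analyses can likewise be reproduced with standard tests. The sketch below uses synthetic numbers in place of the study's data: it runs a paired t test on PANAS negative-affect scores before and after a stimulus, plus a one-way ANOVA on mean heart rate across the three emotion conditions. The paired form of the t test and the use of per-participant mean heart rate are assumptions; the abstract names only the tests.

```python
# Sketch of the questionnaire and heart-rate analyses mentioned in the
# abstract: a t test on PANAS scores before vs. after a stimulus and a
# one-way ANOVA on heart rate across the happy/sad/neutral conditions.
# All numbers below are synthetic placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic PANAS negative-affect scores (10 items, summed range 10-50) for
# 49 participants, before and after watching the sad stimulus.
neg_before = rng.normal(18, 5, size=49)
neg_after = neg_before - rng.normal(2, 3, size=49)

t_stat, p_value = stats.ttest_rel(neg_before, neg_after)
print(f"paired t test: t = {t_stat:.2f}, P = {p_value:.3f}")

# Synthetic mean heart rate (bpm) per participant in each emotion condition.
hr_happy = rng.normal(95, 8, size=44)
hr_sad = rng.normal(92, 8, size=44)
hr_neutral = rng.normal(93, 8, size=44)

f_stat, p_value = stats.f_oneway(hr_happy, hr_sad, hr_neutral)
print(f"one-way ANOVA: F = {f_stat:.2f}, P = {p_value:.3f}")
```
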
Bibliographic Details

Published in: JMIR mental health, 2018-08, Vol. 5 (3), p. e10153-e10153
Main authors: Quiroz, Juan Carlos; Geangu, Elena; Yong, Min Hooi
Format: Article
Language: English
Subjects: Accelerometers; Cellular telephones; Emotions; Happiness; Movement; Original Paper; Sensors; Walking
Online access: Full text
DOI: 10.2196/10153
ISSN/EISSN: 2368-7959
PMID: 30089610
Publisher: JMIR Publications, Canada