On the Classification of Emotional Biosignals Evoked While Viewing Affective Pictures: An Integrated Data-Mining-Based Approach for Healthcare Applications
Recent neuroscience findings demonstrate the fundamental role of emotion in the maintenance of physical and mental health. In the present study, a novel architecture is proposed for the robust discrimination of emotional physiological signals evoked upon viewing pictures selected from the International Affective Picture System (IAPS). Biosignals are multichannel recordings from both the central and the autonomic nervous systems. Following the bidimensional emotion theory model, IAPS pictures are rated along two dimensions, namely, their valence and arousal. Following this model, biosignals in this paper are initially differentiated according to their valence dimension by means of a data-mining approach, the C4.5 decision tree algorithm. Then, the valence and gender information serve as input to a Mahalanobis distance classifier, which dissects the data into high and low arousal. Results are described in Extensible Markup Language (XML) format, thereby accounting for platform independence, easy interconnectivity, and information exchange. The average recognition (success) rate was 77.68% for the discrimination of four emotional states differing in both their arousal and valence dimensions. It is, therefore, envisaged that the proposed approach holds promise for the efficient discrimination of negative and positive emotions, and it is discussed how future developments may be steered to serve affective healthcare applications, such as the monitoring of elderly or chronically ill people.
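As a rough illustration of the two-stage scheme described above, the sketch below predicts valence with an entropy-based decision tree (a scikit-learn stand-in for C4.5, which scikit-learn does not implement directly) and then assigns arousal by minimum Mahalanobis distance to class means, using the predicted valence and gender as extra inputs. All data, feature names, and model settings here are illustrative assumptions, not the authors' pipeline.

```python
# Hypothetical sketch only: synthetic data and simplified models.
import numpy as np
import xml.etree.ElementTree as ET
from sklearn.tree import DecisionTreeClassifier  # entropy criterion as a C4.5 stand-in

rng = np.random.default_rng(0)

# Toy biosignal features: rows = picture-viewing trials, columns = features
# (e.g., ERP amplitudes, EEG band power, skin conductance response).
X = rng.normal(size=(200, 6))
valence = rng.integers(0, 2, size=200)   # 0 = negative, 1 = positive
arousal = rng.integers(0, 2, size=200)   # 0 = low, 1 = high
gender = rng.integers(0, 2, size=200)    # 0 = female, 1 = male

# Stage 1: discriminate valence with an entropy-based decision tree.
valence_tree = DecisionTreeClassifier(criterion="entropy", max_depth=4)
valence_tree.fit(X, valence)
valence_hat = valence_tree.predict(X)

# Stage 2: append predicted valence and gender, then classify arousal by
# minimum Mahalanobis distance to each class mean (pooled covariance).
Z = np.column_stack([X, valence_hat, gender]).astype(float)
means = {c: Z[arousal == c].mean(axis=0) for c in (0, 1)}
cov_inv = np.linalg.pinv(np.cov(Z, rowvar=False))

def predict_arousal(z):
    """Return the arousal class whose mean is nearest in Mahalanobis distance."""
    def dist(c):
        d = z - means[c]
        return float(d @ cov_inv @ d)
    return min(means, key=dist)

arousal_hat = np.array([predict_arousal(z) for z in Z])
print("training-set agreement:", (arousal_hat == arousal).mean())

# The abstract notes that results are exported as XML; this element layout
# is a made-up example of what such a record could look like.
trial = ET.Element("trial", id="0")
ET.SubElement(trial, "valence").text = "positive" if valence_hat[0] else "negative"
ET.SubElement(trial, "arousal").text = "high" if arousal_hat[0] else "low"
print(ET.tostring(trial, encoding="unicode"))
```

A pooled covariance keeps the distance rule minimal; per-class covariances, gender-specific models, and cross-validated accuracy estimation (the reported 77.68% refers to a four-class valence-arousal discrimination) would be natural refinements.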
Saved in:
Published in: | IEEE journal of biomedical and health informatics 2010-03, Vol.14 (2), p.309-318 |
---|---|
Main authors: | Frantzidis, C.A.; Bratsas, C.; Klados, M.A.; Konstantinidis, E.; Lithari, C.D.; Vivas, A.B.; Papadelis, C.L.; Kaldoudi, E.; Pappas, C.; Bamidis, P.D. |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
container_end_page | 318 |
---|---|
container_issue | 2 |
container_start_page | 309 |
container_title | IEEE journal of biomedical and health informatics |
container_volume | 14 |
creator | Frantzidis, C.A.; Bratsas, C.; Klados, M.A.; Konstantinidis, E.; Lithari, C.D.; Vivas, A.B.; Papadelis, C.L.; Kaldoudi, E.; Pappas, C.; Bamidis, P.D. |
description | Recent neuroscience findings demonstrate the fundamental role of emotion in the maintenance of physical and mental health. In the present study, a novel architecture is proposed for the robust discrimination of emotional physiological signals evoked upon viewing pictures selected from the International Affective Picture System (IAPS). Biosignals are multichannel recordings from both the central and the autonomic nervous systems. Following the bidimensional emotion theory model, IAPS pictures are rated along two dimensions, namely, their valence and arousal. Following this model, biosignals in this paper are initially differentiated according to their valence dimension by means of a data-mining approach, the C4.5 decision tree algorithm. Then, the valence and gender information serve as input to a Mahalanobis distance classifier, which dissects the data into high and low arousal. Results are described in Extensible Markup Language (XML) format, thereby accounting for platform independence, easy interconnectivity, and information exchange. The average recognition (success) rate was 77.68% for the discrimination of four emotional states differing in both their arousal and valence dimensions. It is, therefore, envisaged that the proposed approach holds promise for the efficient discrimination of negative and positive emotions, and it is discussed how future developments may be steered to serve affective healthcare applications, such as the monitoring of elderly or chronically ill people. |
doi_str_mv | 10.1109/TITB.2009.2038481 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 1089-7771 |
ispartof | IEEE journal of biomedical and health informatics, 2010-03, Vol.14 (2), p.309-318 |
issn | 1089-7771 2168-2194 1558-0032 2168-2208 |
language | eng |
recordid | cdi_pubmed_primary_20064762 |
source | IEEE Electronic Library (IEL) |
subjects | Adult; Affective computing; Algorithms; Autonomic nervous system; Autonomic Nervous System - physiology; Biomedical monitoring; Central Nervous System - physiology; Data Mining; decision tree; Decision trees; EEG; Electroencephalography; Emotion recognition; emotion theory; Emotions; Emotions - physiology; evoked potential response; Evoked Potentials - physiology; Female; Galvanic Skin Response; healthcare remote monitoring; Humans; International Affective Picture System (IAPS); LAN interconnection; Mahalanobis distance; Male; Medical services; Monitoring, Physiologic - methods; Neuroscience; Pattern Recognition, Automated; Recognition (Psychology) - physiology; Reproducibility of Results; Robustness; Signal Processing, Computer-Assisted; Studies; XML |
title | On the Classification of Emotional Biosignals Evoked While Viewing Affective Pictures: An Integrated Data-Mining-Based Approach for Healthcare Applications |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-01T11%3A46%3A32IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=On%20the%20Classification%20of%20Emotional%20Biosignals%20Evoked%20While%20Viewing%20Affective%20Pictures:%20An%20Integrated%20Data-Mining-Based%20Approach%20for%20Healthcare%20Applications&rft.jtitle=IEEE%20journal%20of%20biomedical%20and%20health%20informatics&rft.au=Frantzidis,%20C.A.&rft.date=2010-03&rft.volume=14&rft.issue=2&rft.spage=309&rft.epage=318&rft.pages=309-318&rft.issn=1089-7771&rft.eissn=1558-0032&rft.coden=ITIBFX&rft_id=info:doi/10.1109/TITB.2009.2038481&rft_dat=%3Cproquest_RIE%3E734030054%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=1027144057&rft_id=info:pmid/20064762&rft_ieee_id=5373931&rfr_iscdi=true |