Emotion classification based on forehead biosignals using support vector machines in music listening

The purpose of this study was to investigate the feasibility of using forehead biosignals as informative channels for classification of music-induced emotions. Classification of four emotional states in Arousal-Valence space was performed by employing two parallel support vector machines as arousal and valence classifiers. Relative powers of EEG sub-bands, spectral entropy, mean power frequency, and higher order crossings were extracted from each of the three forehead data channels: left Temporalis, Frontalis, and right Temporalis. The inputs of the classifiers were obtained by a feature selection algorithm based on a fuzzy-rough model. Averaged subject-independent classification accuracies of 93.80%, 92.43%, and 86.67% were achieved for arousal classification, valence classification, and classification of the four emotional states in Arousal-Valence space, respectively.
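
The feature families named in the abstract are all standard signal measures, so they are easy to sketch. The Python example below shows one plausible implementation for a single forehead channel; the band edges, Welch settings, and HOC order are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import welch

# Illustrative EEG sub-bands in Hz; the abstract does not give exact band edges.
BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0),
         "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def forehead_features(x, fs, n_hoc=4):
    """Compute the abstract's four feature families for one forehead channel."""
    f, psd = welch(x, fs=fs, nperseg=min(len(x), int(2 * fs)))
    total_power = np.trapz(psd, f)

    # Relative power of each EEG sub-band.
    rel_power = {}
    for name, (lo, hi) in BANDS.items():
        band = (f >= lo) & (f < hi)
        rel_power[name] = np.trapz(psd[band], f[band]) / total_power

    # Spectral entropy of the normalized power spectrum.
    p = psd / psd.sum()
    spectral_entropy = -np.sum(p * np.log2(p + 1e-12))

    # Mean power frequency: power-weighted average frequency.
    mpf = np.trapz(f * psd, f) / total_power

    # Higher order crossings: zero-crossing counts of the repeatedly
    # differenced (and mean-removed) signal.
    hoc = []
    d = x - np.mean(x)
    for _ in range(n_hoc):
        hoc.append(int(np.sum(np.diff(np.sign(d)) != 0)))
        d = np.diff(d)

    return rel_power, spectral_entropy, mpf, hoc
```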

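The two-classifier scheme then maps each sample to one of the four Arousal-Valence quadrants by combining two independent binary decisions. A minimal scikit-learn sketch follows, assuming features have already been extracted and selected (the paper's fuzzy-rough feature selection step is omitted here); the kernel and preprocessing choices are placeholders, not the paper's settings.

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# One SVM per axis of the Arousal-Valence plane, trained independently.
arousal_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
valence_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

def fit(X, y_arousal, y_valence):
    """y_arousal: 1 = high / 0 = low; y_valence: 1 = positive / 0 = negative."""
    arousal_clf.fit(X, y_arousal)
    valence_clf.fit(X, y_valence)

# The four quadrants of Arousal-Valence space, keyed by the two binary outputs.
QUADRANTS = {(1, 1): "high arousal, positive valence",
             (1, 0): "high arousal, negative valence",
             (0, 1): "low arousal, positive valence",
             (0, 0): "low arousal, negative valence"}

def predict_emotion(X):
    """Combine the two parallel binary decisions into one of four states."""
    a = arousal_clf.predict(X)
    v = valence_clf.predict(X)
    return [QUADRANTS[(ai, vi)] for ai, vi in zip(a, v)]
```

Splitting the four-class problem into two parallel binary classifiers lets each SVM specialize along one affective axis, which is consistent with the separate arousal and valence accuracies reported in the abstract.
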
Bibliographic Details
Main authors: Naji, M., Firoozabadi, M., Azadfallah, P.
Format: Conference proceeding
Language: English
Subjects: Accuracy; arousal; Electrodes; Electroencephalography; emotion classification; Emotion recognition; Feature extraction; Forehead; forehead biosignals; Support vector machines; valence
DOI: 10.1109/BIBE.2012.6399657
ISBN: 9781467343572
Published in: 2012 IEEE 12th International Conference on Bioinformatics & Bioengineering (BIBE), 2012, pp. 396-400
Source: IEEE Electronic Library (IEL) Conference Proceedings
URL: https://ieeexplore.ieee.org/document/6399657