Convolution spatial-temporal attention network for EEG emotion recognition

In recent years, emotion recognition using electroencephalogram (EEG) signals has garnered significant interest due to its non-invasive nature and high temporal resolution. We introduced a groundbreaking method that bypasses traditional manual feature engineering, emphasizing data preprocessing and leveraging the topological relationships between channels to transform EEG signals from two-dimensional time sequences into three-dimensional spatio-temporal representations. Maximizing the potential of deep learning, our approach provides a data-driven and robust method for identifying emotional states. Leveraging the synergy between convolutional neural network and attention mechanisms facilitated automatic feature extraction and dynamic learning of inter-channel dependencies. Our method showcased remarkable performance in emotion recognition tasks, confirming the effectiveness of our approach, achieving average accuracy of 98.62% for arousal and 98.47% for valence, surpassing previous state-of-the-art results of 95.76% and 95.15%. Furthermore, we conducted a series of pivotal experiments that broadened the scope of emotion recognition research, exploring further possibilities in the field of emotion recognition.

Detailed Description

Saved in:
Bibliographic details
Published in: Physiological measurement 2024-12, Vol.45 (12), p.125003
Main authors: Cao, Lei; Yu, Binlong; Dong, Yilin; Liu, Tianyu; Li, Jie
Format: Article
Language: eng
Subjects:
Online access: Full text
Description: In recent years, emotion recognition using electroencephalogram (EEG) signals has garnered significant interest due to its non-invasive nature and high temporal resolution. We introduced a groundbreaking method that bypasses traditional manual feature engineering, emphasizing data preprocessing and leveraging the topological relationships between channels to transform EEG signals from two-dimensional time sequences into three-dimensional spatio-temporal representations. Maximizing the potential of deep learning, our approach provides a data-driven and robust method for identifying emotional states. Leveraging the synergy between convolutional neural network and attention mechanisms facilitated automatic feature extraction and dynamic learning of inter-channel dependencies. Our method showcased remarkable performance in emotion recognition tasks, confirming the effectiveness of our approach, achieving average accuracy of 98.62% for arousal and 98.47% for valence, surpassing previous state-of-the-art results of 95.76% and 95.15%. Furthermore, we conducted a series of pivotal experiments that broadened the scope of emotion recognition research, exploring further possibilities in the field of emotion recognition.
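The core preprocessing idea in the abstract — using the topological relationships between electrodes to turn a 2D (channels × time) EEG signal into a 3D spatio-temporal stack of topographic frames — can be sketched as follows. This is a generic illustration, not the authors' implementation: the grid size, the electrode-to-grid coordinates, and the function name are all assumptions.

```python
import numpy as np

# Hypothetical electrode layout on a 9x9 scalp grid. Each EEG channel is
# assigned a (row, col) position reflecting its approximate scalp topology;
# the coordinates below are illustrative, not the paper's actual montage.
CHANNEL_GRID = {
    0: (0, 3),  # e.g. a frontal-left electrode (assumed position)
    1: (0, 5),  # e.g. a frontal-right electrode
    2: (2, 2),
    3: (2, 6),
    # ... a full montage would map every channel of the recording cap
}

def to_spatiotemporal(eeg, grid=CHANNEL_GRID, height=9, width=9):
    """Turn a (channels, time) EEG array into a (time, height, width)
    stack of topographic frames, one 2D frame per time sample.
    Grid cells with no electrode stay zero."""
    n_channels, n_samples = eeg.shape
    frames = np.zeros((n_samples, height, width), dtype=eeg.dtype)
    for ch, (r, c) in grid.items():
        if ch < n_channels:
            frames[:, r, c] = eeg[ch, :]  # place each channel at its scalp cell
    return frames

rng = np.random.default_rng(0)
eeg = rng.standard_normal((4, 128))   # 4 channels, 128 time samples
frames = to_spatiotemporal(eeg)
print(frames.shape)                   # (128, 9, 9)
```

A stack like this can then be fed to convolutional layers, which see spatially adjacent electrodes as adjacent pixels, while an attention mechanism over the channel/grid axis can learn inter-channel dependencies; both components here are sketched only conceptually.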
DOI: 10.1088/1361-6579/ad9661
Publisher: IOP Publishing (England)
PMID: 39577097
CODEN: PMEAE3
ORCID: 0000-0002-8427-4548; 0009-0004-6964-9083
Rights: 2024 Institute of Physics and Engineering in Medicine. All rights, including for text and data mining, AI training, and similar technologies, are reserved.
ISSN: 0967-3334, 1361-6579
EISSN: 1361-6579
Source: MEDLINE; IOP Publishing Journals; Institute of Physics (IOP) Journals - HEAL-Link
Subjects: Adult
Attention - physiology
attention mechanisms
CNN
data preprocessing
EEG
Electroencephalography
emotion recognition
Emotions - physiology
Female
Humans
Male
Neural Networks, Computer
Signal Processing, Computer-Assisted
Time Factors
Young Adult