Spatial-temporal network for fine-grained-level emotion EEG recognition


Detailed description

Bibliographic details
Published in: Journal of neural engineering 2022-06, Vol. 19 (3), p. 36017
Main authors: Ji, Youshuo; Li, Fu; Fu, Boxun; Li, Yang; Zhou, Yijin; Niu, Yi; Zhang, Lijian; Chen, Yuanfang; Shi, Guangming
Format: Article
Language: English
Online access: Full text
Description: Electroencephalogram (EEG)-based affective computing brain-computer interfaces provide the capability for machines to understand human intentions. In practice, people are more concerned with the strength of a certain emotional state over a short period of time, which is referred to as fine-grained-level emotion in this paper. In this study, we built a fine-grained-level emotion EEG dataset that contains two coarse-grained emotions and four corresponding fine-grained-level emotions. To fully extract the features of the EEG signals, we propose a corresponding fine-grained emotion EEG network (FG-emotionNet) for spatial-temporal feature extraction. Each feature-extraction layer is linked to the raw EEG signals to alleviate overfitting and to ensure that the spatial features of each scale can be extracted from the raw signals. Moreover, all previous scale features are fused before the current spatial-feature layer to enhance the scale features in the spatial block. Additionally, a long short-term memory network is adopted as the temporal block to extract temporal features from the spatial features and to classify the fine-grained emotion category. Subject-dependent and cross-session experiments demonstrated that the proposed method outperforms both representative emotion-recognition methods and methods with structures similar to the proposed one.
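The dataflow the abstract describes — every spatial block reading the raw EEG directly, all previous scale features fused before each new spatial layer, and a temporal block on top — can be sketched roughly as follows. This is not the paper's implementation: all dimensions (62 channels, 200 samples, three 32-feature scales), the tanh nonlinearity, and the time-average stand-in for the LSTM temporal block are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def spatial_block(raw, fused_prev, w):
    # Each spatial block sees the raw EEG directly (per the abstract, to
    # alleviate overfitting), concatenated with the fused features of all
    # previous scales.
    x = raw if fused_prev is None else np.concatenate([raw, fused_prev], axis=0)
    return np.tanh(w @ x)  # -> (features, time)

# Hypothetical dimensions: 62 electrodes, 200 time samples.
n_ch, n_t = 62, 200
raw = rng.standard_normal((n_ch, n_t))

feats, fused = [], None
for d in [32, 32, 32]:  # three spatial scales (illustrative)
    in_dim = n_ch + (0 if fused is None else fused.shape[0])
    w = rng.standard_normal((d, in_dim)) * 0.1
    feats.append(spatial_block(raw, fused, w))
    fused = np.concatenate(feats, axis=0)  # fuse all scales so far

# Temporal-block stand-in: the paper uses an LSTM over the spatial
# features; here we simply average over time and apply a linear head
# for the four fine-grained classes.
pooled = fused.mean(axis=1)                      # (96,)
w_out = rng.standard_normal((4, pooled.size)) * 0.1
logits = w_out @ pooled
pred = int(np.argmax(logits))
print(fused.shape, pred)
```

The key structural point the sketch preserves is that later spatial blocks grow their input width (62, then 94, then 126) because each one receives the raw signal plus every earlier scale's features.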
DOI: 10.1088/1741-2552/ac6d7d
PMID: 35523129
ISSN: 1741-2560
EISSN: 1741-2552
Source: IOP Publishing Journals; Institute of Physics (IOP) Journals - HEAL-Link
Subjects: EEG-based emotion recognition; emotion strength; spatial-temporal network