Convolutional multi-head self-attention on memory for aspect sentiment classification
This paper presents a method for aspect-based sentiment classification tasks, named convolutional multi-head self-attention memory network (CMA-MemNet). This is an improved model based on memory networks, and makes it possible to extract richer and more complex semantic information from sequences and aspects...
Saved in:
Published in: | IEEE/CAA journal of automatica sinica 2020-07, Vol.7 (4), p.1038-1044 |
---|---|
Main authors: | Zhang, Yaojie; Xu, Bing; Zhao, Tiejun |
Format: | Article |
Language: | eng |
Keywords: | Classification; Convolution; Data mining; Feature extraction; Recurrent neural networks; Semantics; Sentiment analysis; Sequences; Short term; Task analysis |
Online access: | Order full text |
container_end_page | 1044 |
---|---|
container_issue | 4 |
container_start_page | 1038 |
container_title | IEEE/CAA journal of automatica sinica |
container_volume | 7 |
creator | Zhang, Yaojie; Xu, Bing; Zhao, Tiejun |
description | This paper presents a method for aspect-based sentiment classification tasks, named convolutional multi-head self-attention memory network (CMA-MemNet). This is an improved model based on memory networks, and makes it possible to extract richer and more complex semantic information from sequences and aspects. In order to fix the memory network's inability to capture context-related information at the word level, we propose utilizing convolution to capture n-gram grammatical information. We use multi-head self-attention to make up for the problem where the memory network ignores the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with some popular baseline methods, our model achieves excellent performance. |
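The two building blocks the abstract names can be sketched as follows: a same-length 1-D convolution that turns word embeddings into n-gram features (the "memory"), followed by multi-head self-attention over that memory. This is a minimal NumPy illustration only; the shapes, the random weights, the identity query/key/value projections, and the single-layer composition are all assumptions for brevity, not the authors' actual CMA-MemNet implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_ngrams(x, w, b):
    """Same-length 1-D convolution over a sequence.

    x: (seq_len, d) word embeddings; w: (k, d, d_out) filter for k-gram
    windows; b: (d_out,) bias. Returns ReLU features of shape (seq_len, d_out).
    """
    k, d, d_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))        # zero-pad so output length == seq_len
    out = np.empty((x.shape[0], d_out))
    for t in range(x.shape[0]):
        window = xp[t:t + k]                    # (k, d) n-gram window around position t
        out[t] = np.einsum('kd,kdo->o', window, w) + b
    return np.maximum(out, 0.0)                 # ReLU

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)     # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, n_heads):
    """Scaled dot-product self-attention; d is split equally across heads.

    Identity Q/K/V projections are used here purely to keep the sketch short.
    """
    seq_len, d = x.shape
    d_h = d // n_heads
    heads = []
    for h in range(n_heads):
        s = slice(h * d_h, (h + 1) * d_h)
        q, k, v = x[:, s], x[:, s], x[:, s]
        scores = softmax(q @ k.T / np.sqrt(d_h))  # (seq_len, seq_len) attention weights
        heads.append(scores @ v)
    return np.concatenate(heads, axis=1)        # back to (seq_len, d)

seq_len, d, kernel = 6, 8, 3
x = rng.standard_normal((seq_len, d))           # toy word embeddings
w = rng.standard_normal((kernel, d, d)) * 0.1
b = np.zeros(d)

memory = conv1d_ngrams(x, w, b)                 # n-gram-enriched memory
attended = multi_head_self_attention(memory, n_heads=2)
print(attended.shape)                           # (6, 8)
```

Note that, unlike an RNN/LSTM/GRU, every position here is computed from the same padded input with no sequential dependency, which is the parallelism the abstract refers to.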
doi_str_mv | 10.1109/JAS.2020.1003243 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 2329-9266 |
ispartof | IEEE/CAA journal of automatica sinica, 2020-07, Vol.7 (4), p.1038-1044 |
issn | 2329-9266; 2329-9274 |
language | eng |
recordid | cdi_ieee_primary_9128078 |
source | IEEE Electronic Library (IEL) |
subjects | Classification; Convolution; Data mining; Feature extraction; Recurrent neural networks; Semantics; Sentiment analysis; Sequences; Short term; Task analysis |
title | Convolutional multi-head self-attention on memory for aspect sentiment classification |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-05T00%3A24%3A10IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Convolutional%20multi-head%20self-attention%20on%20memory%20for%20aspect%20sentiment%20classification&rft.jtitle=IEEE/CAA%20journal%20of%20automatica%20sinica&rft.au=Zhang,%20Yaojie&rft.date=2020-07-01&rft.volume=7&rft.issue=4&rft.spage=1038&rft.epage=1044&rft.pages=1038-1044&rft.issn=2329-9266&rft.eissn=2329-9274&rft.coden=IJASJC&rft_id=info:doi/10.1109/JAS.2020.1003243&rft_dat=%3Cproquest_RIE%3E2419495941%3C/proquest_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2419495941&rft_id=info:pmid/&rft_ieee_id=9128078&rfr_iscdi=true |