Convolutional multi-head self-attention on memory for aspect sentiment classification


Detailed description

Bibliographic details
Published in: IEEE/CAA journal of automatica sinica 2020-07, Vol.7 (4), p.1038-1044
Main authors: Zhang, Yaojie, Xu, Bing, Zhao, Tiejun
Format: Article
Language: English
Subjects:
Description
Summary: This paper presents a method for aspect-based sentiment classification tasks, named convolutional multi-head self-attention memory network (CMA-MemNet). This is an improved model based on memory networks, making it possible to extract richer and more complex semantic information from sequences and aspects. In order to fix the memory network's inability to capture context-related information at the word level, we propose utilizing convolution to capture n-gram grammatical information. We use multi-head self-attention to compensate for the memory network's neglect of the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with several popular baseline methods, our model performs excellently.
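The two building blocks the abstract names, a convolution that captures word-level n-gram features and multi-head self-attention over those features, can be illustrated with a minimal numpy sketch. This is not the authors' CMA-MemNet implementation; the window size, head count, and untrained random weights are illustrative assumptions only.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the attention scores.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv1d(x, w):
    # x: (seq_len, d) token embeddings; w: (k, d, d_out) kernel.
    # 'same' zero-padding keeps one feature vector per word,
    # so each output position summarizes a k-gram window.
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([np.tensordot(xp[i:i + k], w, axes=([0, 1], [0, 1]))
                     for i in range(x.shape[0])])

def multi_head_self_attention(x, heads):
    # Split features across heads; each head attends over the whole
    # sequence in parallel (no recurrence, unlike RNN/LSTM/GRU).
    seq_len, d = x.shape
    dh = d // heads
    out = []
    for h in range(heads):
        q = k = v = x[:, h * dh:(h + 1) * dh]  # untrained projections in this sketch
        scores = softmax(q @ k.T / np.sqrt(dh))
        out.append(scores @ v)
    return np.concatenate(out, axis=-1)

rng = np.random.default_rng(0)
seq_len, d, kernel = 6, 8, 3                 # toy sizes, chosen for illustration
x = rng.normal(size=(seq_len, d))            # token embeddings
w = rng.normal(size=(kernel, d, d)) * 0.1    # trigram convolution kernel
ngram = conv1d(x, w)                         # word-level n-gram features
attended = multi_head_self_attention(ngram, heads=2)
print(attended.shape)                        # one enriched vector per word: (6, 8)
```

In the paper's design these attended representations would then be written into the memory slots of the memory network; the point of the sketch is that every position is computed independently, which is what preserves parallelism.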
ISSN:2329-9266
2329-9274
DOI:10.1109/JAS.2020.1003243