Convolutional multi-head self-attention on memory for aspect sentiment classification
Published in: IEEE/CAA Journal of Automatica Sinica, 2020-07, Vol. 7 (4), pp. 1038-1044
Main authors:
Format: Article
Language: English
Abstract: This paper presents a method for aspect-based sentiment classification tasks, named convolutional multi-head self-attention memory network (CMA-MemNet). This is an improved model based on memory networks, and it makes it possible to extract richer and more complex semantic information from sequences and aspects. To fix the memory network's inability to capture context-related information at the word level, we propose utilizing convolution to capture n-gram grammatical information. We use multi-head self-attention to make up for the problem where the memory network ignores the semantic information of the sequence itself. Meanwhile, unlike most recurrent neural network (RNN), long short-term memory (LSTM), and gated recurrent unit (GRU) models, we retain the parallelism of the network. We experiment on the open datasets SemEval-2014 Task 4 and SemEval-2016 Task 6. Compared with some popular baseline methods, our model performs excellently.
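The abstract describes three building blocks: convolution over the word embeddings to capture n-gram (local context) information, multi-head self-attention over the resulting memory to model the sequence's own semantics, and aspect-conditioned attention for the final sentiment decision, all without recurrence. The sketch below is a minimal, hypothetical PyTorch rendering of that pipeline, not the authors' implementation; the class name, layer sizes, single-block structure, and mean-pooled aspect query are assumptions made purely for illustration.

```python
# Minimal sketch of the CMA-MemNet ideas from the abstract (illustrative only).
import torch
import torch.nn as nn


class CMAMemNetSketch(nn.Module):
    def __init__(self, embed_dim=300, num_heads=6, num_classes=3, kernel_size=3):
        super().__init__()
        # 1-D convolution over the token dimension injects n-gram context into the memory.
        self.conv = nn.Conv1d(embed_dim, embed_dim, kernel_size, padding=kernel_size // 2)
        # Multi-head self-attention over the convolved memory (fully parallel, no RNN/LSTM/GRU).
        self.self_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Aspect-to-memory attention: the aspect representation queries the memory.
        self.aspect_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, sentence_emb, aspect_emb):
        # sentence_emb: (batch, seq_len, embed_dim); aspect_emb: (batch, aspect_len, embed_dim)
        memory = self.conv(sentence_emb.transpose(1, 2)).transpose(1, 2)  # n-gram features
        memory, _ = self.self_attn(memory, memory, memory)                # sequence-level semantics
        aspect_query = aspect_emb.mean(dim=1, keepdim=True)               # pooled aspect vector
        attended, _ = self.aspect_attn(aspect_query, memory, memory)      # aspect-aware summary
        return self.classifier(attended.squeeze(1))                       # sentiment logits


# Example usage with random embeddings: batch of 2, 20 sentence tokens, 3 aspect tokens.
if __name__ == "__main__":
    model = CMAMemNetSketch()
    logits = model(torch.randn(2, 20, 300), torch.randn(2, 3, 300))
    print(logits.shape)  # torch.Size([2, 3])
```

This keeps the property the abstract emphasizes: every step is attention- or convolution-based, so the whole forward pass runs in parallel over the sequence rather than token by token.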
ISSN: 2329-9266, 2329-9274
DOI: 10.1109/JAS.2020.1003243