SenticGAT: Sentiment Knowledge Enhanced Graph Attention Network for Multi-view Feature Representation in Aspect-based Sentiment Analysis

Detailed description

Bibliographic details
Published in: International Journal of Computers, Communications & Control, 2023-10, Vol. 18 (5)
Main authors: Yang, Bin; Li, Haoling; Xing, Ying
Format: Article
Language: English
Online access: Full text
Description
Summary: Computational intelligence methods, especially artificial neural networks, are increasingly applied in many scenarios. We explore fine-grained sentiment classification of review data with such methods; this task is known as aspect-based sentiment analysis (ABSA). We propose SenticGAT, a multi-view feature fusion model enhanced by an external sentiment knowledge base. We encode the external sentiment information into the syntactic dependency tree to obtain an enhanced graph with a rich sentiment representation. From this knowledge-enhanced graph, a graph attention network (GAT) extracts multi-view features covering semantic, syntactic, and sentiment information. We also design a new strategy for fusing the multi-view features using a feature-parallel frame and a convolution method. Finally, the sentiment polarity of a specific aspect is determined from the fully fused multi-view features. Experimental results on four public benchmark datasets demonstrate that our method is effective and sound, and that it outperforms existing approaches in fusing multi-view features.
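The abstract describes propagating token features with graph attention over a dependency tree whose edges are enriched with external sentiment knowledge. Below is a minimal sketch of one such graph-attention step in PyTorch; the function name, the shapes, and the idea of letting nonzero adjacency entries carry sentiment weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of one graph-attention step over a sentiment-enhanced
# dependency graph. Names, shapes, and the sentiment-weighting idea are
# assumptions for illustration only.
import torch
import torch.nn.functional as F

def gat_layer(h, adj, W, a):
    """One single-head GAT update.
    h:   (n, d_in)     node (token) features
    adj: (n, n)        dependency-tree adjacency; zero means "no edge",
                       nonzero entries could carry external sentiment weights
    W:   (d_in, d_out) shared linear projection
    a:   (2*d_out,)    attention parameter vector
    """
    z = h @ W                                     # project: (n, d_out)
    d = z.size(1)
    # attention logits e_ij = LeakyReLU(a^T [z_i || z_j]), via broadcasting
    e = F.leaky_relu((z @ a[:d].unsqueeze(1)) + (z @ a[d:].unsqueeze(1)).T)
    e = e.masked_fill(adj == 0, float("-inf"))    # attend only along edges
    alpha = torch.softmax(e, dim=-1)              # row-normalized weights
    return F.elu(alpha @ z)                       # aggregated node features

# Toy usage: 4 tokens, self-loops plus a few dependency edges.
torch.manual_seed(0)
h = torch.randn(4, 8)
adj = torch.tensor([[1., 1., 0., 0.],
                    [1., 1., 1., 1.],
                    [0., 1., 1., 0.],
                    [0., 1., 0., 1.]])
W, a = torch.randn(8, 8), torch.randn(16)
print(gat_layer(h, adj, W, a).shape)  # torch.Size([4, 8])
```

The subsequent fusion of the semantic, syntactic, and sentiment views via the feature-parallel frame and convolution is not reproduced here; consult the paper itself for those details.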
ISSN: 1841-9836
eISSN: 1841-9844
DOI: 10.15837/ijccc.2023.5.5089