Learning Word Embeddings for Aspect-Based Sentiment Analysis

Bibliographic Details
Main authors: Pham, Duc-Hong; Le, Anh-Cuong; Le, Thi-Kim-Chung
Format: Book chapter
Language: English
Subjects:
Online access: Full text
Description
Abstract: Word embeddings, also known as word vectors, nowadays play an important role in many NLP tasks. In general, these word representations are learned from an unannotated corpus and are independent of their applications. In this paper we aim to enrich word vectors by adding information derived from one of their applications, namely aspect-based sentiment analysis. We propose a new model that combines unsupervised and supervised techniques to capture three kinds of information: the general semantic distributed representation (i.e., conventional word embeddings), the aspect category, and the aspect sentiment, learned from labeled and unlabeled data. We conduct experiments on the restaurant review data (http://spidr-ursa.rutgers.edu/datasets/). Experimental results show that our proposed model outperforms other methods such as Word2Vec and GloVe.
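The abstract only outlines the approach, so the following is a minimal, hypothetical sketch of the general idea it describes, not the authors' actual model: a single embedding table shared between an unsupervised context-prediction objective (on unlabeled text) and supervised aspect-category and aspect-sentiment classifiers (on labeled reviews), so the learned vectors carry all three kinds of information. All names, dimensions, and the loss weighting are illustrative assumptions.

# Hypothetical sketch (PyTorch), not the paper's actual architecture.
import torch
import torch.nn as nn

class AspectAwareEmbeddings(nn.Module):
    def __init__(self, vocab_size, dim, n_aspects, n_polarities):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)      # shared word vectors
        self.ctx = nn.Linear(dim, vocab_size)         # skip-gram-style context predictor (unsupervised)
        self.aspect_clf = nn.Linear(dim, n_aspects)   # aspect-category head (supervised)
        self.senti_clf = nn.Linear(dim, n_polarities) # aspect-sentiment head (supervised)

    def forward(self, word_ids):
        v = self.emb(word_ids)                        # (batch, seq, dim)
        sent = v.mean(dim=1)                          # crude sentence vector (simplifying assumption)
        return self.ctx(v), self.aspect_clf(sent), self.senti_clf(sent)

def joint_loss(model, words, context_targets, aspect_labels, senti_labels, alpha=0.5):
    # Weighted sum of the unsupervised and supervised terms; alpha is an assumed hyperparameter.
    ce = nn.CrossEntropyLoss()
    ctx_logits, aspect_logits, senti_logits = model(words)
    l_unsup = ce(ctx_logits.flatten(0, 1), context_targets.flatten())
    l_sup = ce(aspect_logits, aspect_labels) + ce(senti_logits, senti_labels)
    return alpha * l_unsup + (1 - alpha) * l_sup

Training would alternate or mix batches of unlabeled text (unsupervised term only) and labeled restaurant reviews (all terms), so the shared embedding table is shaped by both signals.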
ISSN:1865-0929
1865-0937
1530-9312
DOI:10.1007/978-981-10-8438-6_3