Enhancing Arabic Aspect-Based Sentiment Analysis Using End-to-End Model
Published in: | IEEE Access, 2023, Vol. 11, pp. 142062-142076 |
---|---|
Main authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Summary: | The majority of research on Aspect-Based Sentiment Analysis (ABSA) splits the task into two subtasks: one for extracting aspects, Aspect Term Extraction (ATE), and another for identifying sentiments toward particular aspects, Aspect Sentiment Classification (ASC). Although these subtasks are closely related, they are performed independently; Aspect Sentiment Classification assumes that the aspect terms are pre-identified, which ignores the interaction required to perform ABSA properly. This study addresses these limitations using a unified End-to-End (E2E) approach, which combines the two subtasks into a single sequence labeling task using a unified tagging schema. The proposed model was evaluated by fine-tuning the Arabic version of the Bidirectional Encoder Representations from Transformers (AraBERT) model with a Conditional Random Fields (CRF) classifier for enhanced target-polarity identification. The experimental results demonstrated the efficiency of the proposed fine-tuned AraBERT-CRF model, which achieved an overall F1 score of 95.11% on the SemEval-2016 Arabic Hotel Reviews dataset. The model's predictions are then subjected to additional processing, and the results indicate the superiority of the proposed model, achieving an F1 score of 97.78% for the ATE task and an accuracy of 98.34% for the ASC task, outperforming previous studies. |
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2023.3342755 |
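
The unified tagging schema described in the abstract can be illustrated with a minimal Python sketch. This is not the authors' code, and the tokens, aspect spans, and tag names below are illustrative assumptions: aspect-term spans and their polarities are collapsed into a single token-level label sequence, so that ATE and ASC reduce to one sequence labeling task.

```python
# Minimal sketch of a unified E2E tagging schema for ABSA (illustrative only):
# aspect-term spans and their polarities are merged into one label per token,
# using O / B-{POS,NEG,NEU} / I-{POS,NEG,NEU}.

from typing import List, Tuple

UNIFIED_TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

def to_unified_tags(tokens: List[str],
                    aspects: List[Tuple[int, int, str]]) -> List[str]:
    """Convert (start, end, polarity) aspect spans into one tag per token.

    `aspects` holds token-index spans (end exclusive) with a polarity in
    {"POS", "NEG", "NEU"}; every token outside a span is tagged "O".
    """
    tags = ["O"] * len(tokens)
    for start, end, polarity in aspects:
        tags[start] = f"B-{polarity}"
        for i in range(start + 1, end):
            tags[i] = f"I-{polarity}"
    return tags

# Hypothetical example (an English gloss of a hotel review, not dataset content):
tokens = ["the", "room", "was", "clean", "but", "the", "staff", "were", "rude"]
aspects = [(1, 2, "POS"), (6, 7, "NEG")]  # "room" -> positive, "staff" -> negative
print(list(zip(tokens, to_unified_tags(tokens, aspects))))
```

In the setup the abstract describes, a fine-tuned AraBERT encoder would produce per-token scores over such unified tags, and a CRF layer would decode the most likely tag sequence, enforcing transition constraints such as an I-POS tag only following B-POS or I-POS.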