An abstractive text summarization technique using transformer model with self-attention mechanism
Published in: Neural Computing & Applications, 2023-09, Vol. 35 (25), pp. 18603–18622
Main authors: | , |
Format: Article
Language: English
Subjects: | |
Online access: Full text
Abstract: Creating a summarized version of a text document that still conveys its precise meaning is a complex task in natural language processing (NLP). Abstractive text summarization (ATS) is the process of extracting facts from source sentences and merging them into concise representations while preserving the content and intent of the text. Manually summarizing large amounts of text is challenging and time-consuming for humans, so text summarization has become an active research focus in NLP. This paper proposes an ATS model using a Transformer Technique with Self-Attention Mechanism (T2SAM). A self-attention mechanism is added to the transformer to resolve coreference in text, which helps the system understand the text better. The proposed T2SAM model improves the performance of text summarization. It is trained on the Inshorts News dataset combined with the DUC-2004 shared tasks dataset. The performance of the proposed model has been evaluated using ROUGE metrics and shown to outperform existing state-of-the-art baseline models. Training loss decreases from 10.3058 at the start to a minimum of 1.8220 over 30 epochs, and the model achieves a 48.50% ROUGE F1-score on both the Inshorts and DUC-2004 news datasets.
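
The self-attention mechanism the abstract refers to is, in its standard form, scaled dot-product attention over a sequence of token embeddings. The following is a minimal generic sketch in NumPy, not the paper's actual T2SAM implementation; the matrix names (`Wq`, `Wk`, `Wv`) and dimensions are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project token embeddings into queries, keys, and values.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Attention weights: each token attends to every token in the
    # sequence, scaled by sqrt(d_k) to keep gradients well-behaved.
    weights = softmax(Q @ K.T / np.sqrt(d_k))
    # Output is a weight-mixed combination of the value vectors.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 16))    # 5 tokens, 16-dim embeddings (illustrative)
Wq = rng.standard_normal((16, 8))
Wk = rng.standard_normal((16, 8))
Wv = rng.standard_normal((16, 8))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because every token's output mixes information from all other tokens, this kind of layer can, in principle, link a pronoun's representation to its antecedent, which is the coreference problem the abstract says the added self-attention mechanism targets.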
ISSN: 0941-0643, 1433-3058
DOI: 10.1007/s00521-023-08687-7