An innovative Telugu text summarization framework using the pointer network and optimized attention layer

Bibliographic details
Published in: Multimedia Tools and Applications, 2024, Vol. 83 (37), pp. 84539-84564
Authors: M, Varaprasad Rao; Chakma, Kunal; Jamatia, Anupam; Rudrapal, Dwijen
Format: Article
Language: English
Abstract: Summarizing lengthy text involves distilling the crucial information into a concise form that covers the key events of the source text. Previous researchers have mostly explored supervised approaches for this task, but because these rely strongly on the quality of text features, the resulting summaries often lack precision and coherence. State-of-the-art summarizers also perform poorly on many Indian languages owing to several language-specific challenges. This paper proposes a summarization approach for Telugu text based on a Pointer Network with an Optimized Attention Layer (PN-OAL). In this approach, the attention layer's weights are adjusted using a hybrid optimization method, the Fusion of Coyote Optimization and Squirrel Search Algorithm (FCO-SSA). The weights are optimized against an objective function that maximizes the cosine similarity between the source text and the summary text. A novel loss function is incorporated into the constructed network architecture to generate robust summaries. The proposed approach is compared with baseline approaches to validate its efficacy for summarizing Telugu documents.
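
The objective described in the abstract can be illustrated with a rough sketch, which is not the authors' implementation: it assumes a hypothetical summarize() decoder that maps candidate attention weights to a summary embedding, and it substitutes a plain random search for the FCO-SSA hybrid, whose coyote-pack and squirrel-glide update rules are not reproduced here. Only the objective itself, maximizing cosine similarity between source and summary representations, follows the abstract.

import numpy as np

def cosine_similarity(a, b):
    # cos(a, b) = (a . b) / (|a| * |b|); the epsilon guards against zero vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def fitness(weights, source_emb, summarize):
    # Objective to maximize: similarity between the source embedding and
    # the summary embedding produced under the candidate attention weights.
    return cosine_similarity(source_emb, summarize(weights))

rng = np.random.default_rng(0)
dim = 8
source_emb = rng.normal(size=dim)
summarize = lambda w: w  # placeholder decoder, for illustration only

# Random-search stand-in for the FCO-SSA weight optimizer.
best_w, best_f = None, float("-inf")
for _ in range(200):
    w = rng.normal(size=dim)
    f = fitness(w, source_emb, summarize)
    if f > best_f:
        best_w, best_f = w, f
print(f"best cosine similarity: {best_f:.3f}")

In the paper's setting, the candidate weights would parameterize the pointer network's attention layer and the decoder would be the PN-OAL summarizer itself; the fitness function above would then score each candidate weight vector proposed by the FCO-SSA population.
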
ISSN: 1380-7501 (print), 1573-7721 (electronic)
DOI: 10.1007/s11042-024-19187-8