A Short Text Similarity Calculation Method Combining Semantic and Headword Attention Mechanism


Detailed Description

Bibliographic Details
Published in: Scientific Programming 2022-05, Vol. 2022, p. 1-9
Main authors: Ji, Mingyu; Zhang, Xinhai
Format: Article
Language: English
Online access: Full text
Description
Abstract: Short text similarity computation plays an important role in various natural language processing tasks. Siamese neural networks are widely used in short text similarity calculation. However, due to the complexity of syntax and the correlation between words, Siamese networks alone cannot achieve satisfactory results. Many studies show that the use of an attention mechanism improves the impact of the key features used to measure sentence similarity. In this paper, a similarity calculation method is proposed that combines semantics with a headword attention mechanism. First, a BiGRU model is used to extract contextual information. After obtaining the headword set, semantically enhanced representations of the two sentences are obtained through an attention mechanism and character splicing. Finally, a one-dimensional convolutional neural network fuses the word embedding information with the contextual information. Experimental results on the ATEC and MSRP datasets show that the recall and F1 values of the proposed model are significantly improved by the introduction of the headword attention mechanism.
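The core idea in the abstract — giving headwords extra attention weight before comparing sentence representations — can be illustrated with a minimal sketch. The following toy example (not the paper's BiGRU + 1-D CNN architecture; the function name, embeddings, and `boost` parameter are assumptions for illustration) weights each token's embedding, boosts tokens in a headword set, and compares the resulting sentence vectors by cosine similarity:

```python
import numpy as np

def headword_attention_similarity(sent_a, sent_b, emb, headwords, boost=2.0):
    """Toy headword-weighted sentence similarity (illustrative only).

    sent_a, sent_b: lists of tokens; emb: dict mapping token -> vector;
    headwords: set of tokens that receive a larger attention score (boost).
    """
    def encode(tokens):
        vecs = np.stack([emb[t] for t in tokens])  # (n_tokens, dim)
        # Raw attention scores: headwords get `boost`, other tokens get 1.
        scores = np.array([boost if t in headwords else 1.0 for t in tokens])
        weights = scores / scores.sum()            # normalize to sum to 1
        return weights @ vecs                      # weighted average, shape (dim,)

    va, vb = encode(sent_a), encode(sent_b)
    # Cosine similarity between the two attention-weighted sentence vectors.
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))
```

With orthogonal toy embeddings and a shared headword, raising `boost` above 1 increases the similarity of sentence pairs that share that headword, which is the qualitative effect the attention mechanism is meant to produce.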
ISSN: 1058-9244, 1875-919X
DOI: 10.1155/2022/8252492