Swings and Roundabouts: Attention-Structure Interaction Effect in Deep Semantic Matching

Detailed Description

Bibliographic details
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2020-01, Vol. 28, p. 1-1
Main authors: Gupta, Amulya; Zhang, Zhu
Format: Article
Language: English
Description
Summary: In the context of deep learning models for semantic matching problems, we propose a novel Multi-View Progressive Attention (MV-PA) mechanism general enough to operate on various linguistic structures of text. More importantly, we study the interaction effect between explicit linguistic structures (e.g., linear, constituency, and dependency) and implicit structures elicited by attention mechanisms. Empirical results on multiple datasets demonstrate salient patterns of substitutability between the two families of structures (explicit and implicit). Our findings not only provide intellectual foundations for the popular use of "linear LSTM + attention" architectures in NLP/QA research, but also have implications for other modalities and domains.
ISSN: 2329-9290, 2329-9304
DOI: 10.1109/TASLP.2020.3013703