Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension
Format: Article
Language: English
Online access: Order full text
Abstract: In this work, we propose to use linguistic annotations as a basis for a *Discourse-Aware Semantic Self-Attention* encoder that we employ for reading comprehension on long narrative texts. We extract relations between discourse units, events and their arguments, as well as coreferring mentions, using available annotation tools. Our empirical evaluation shows that the investigated structures improve overall performance, especially intra-sentential and cross-sentential discourse relations, sentence-internal semantic role relations, and long-distance coreference relations. We show that dedicating self-attention heads to intra-sentential relations and to relations connecting neighboring sentences is beneficial for finding answers to questions in longer contexts. Our findings encourage the use of discourse-semantic annotations to enhance the generalization capacity of self-attention models for reading comprehension.
DOI: 10.48550/arxiv.1908.10721
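The abstract does not spell out how a self-attention head is "dedicated" to a relation type. One plausible reading, sketched below as a minimal PyTorch illustration (not the authors' released code; all names, shapes, and the mask construction are hypothetical), is to mask a head's attention logits so it only attends across token pairs linked by its assigned relation, e.g. a coreference chain, a semantic role link, or a discourse relation between neighboring sentences.

```python
# Hypothetical sketch of one relation-dedicated self-attention head,
# assuming relations have been extracted as a boolean token-pair mask.
import torch
import torch.nn.functional as F

def relation_masked_head(x, w_q, w_k, w_v, relation_mask):
    """Self-attention over x, restricted to token pairs in relation_mask.

    x:             (n, d_model) token representations
    w_q/w_k/w_v:   (d_model, d_head) projection matrices for this head
    relation_mask: (n, n) bool; True where tokens i and j are linked by the
                   relation this head is dedicated to (e.g. coreferring
                   mentions, an SRL predicate-argument link, or a discourse
                   relation between the sentences containing the tokens)
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = (q @ k.T) / k.size(-1) ** 0.5           # (n, n) attention logits
    # Always allow self-attention so tokens with no relation partner
    # still yield a well-defined softmax row.
    mask = relation_mask | torch.eye(x.size(0), dtype=torch.bool)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v             # (n, d_head)

# Example: dedicate this head to a coreference link between tokens 0 and 7.
n, d_model, d_head = 12, 16, 8
x = torch.randn(n, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_head) for _ in range(3))
mask = torch.zeros(n, n, dtype=torch.bool)
mask[0, 7] = mask[7, 0] = True
out = relation_masked_head(x, w_q, w_k, w_v, mask)   # shape (12, 8)
```

Under this reading, heads masked to intra-sentential relations or to relations spanning adjacent sentences inject the extracted discourse-semantic structure directly into the encoder, which matches the abstract's finding that such heads help answer questions over longer contexts.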