Enhancing cross-evidence reasoning graph for document-level relation extraction
Saved in:
Published in: PeerJ. Computer science 2024-06, Vol. 10, p. e2123
Main authors: , , , , ,
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: The objective of document-level relation extraction (RE) is to identify the semantic connections that exist between named entities present within a document. However, because most entities are distributed across different sentences, relations must often be predicted across sentence boundaries. Existing research has focused on modeling sentences throughout the document to predict relationships between entities. However, not all sentences play a substantial role in relation extraction, so this approach inevitably introduces noisy information. Based on this observation, we believe that evidence sentences can be extracted in advance and used to construct graphs that mine semantic information between entities. Thus, we present a document-level RE model that leverages an Enhancing Cross-evidence Reasoning Graph (ECRG) for improved performance. Specifically, we design a center-sentence-based evidence extraction rule to pre-extract higher-quality evidence. This evidence is then assembled into evidence graphs to mine the connections between mentions within the same evidence. In addition, we construct entity-level graphs by aggregating mentions of the same entity within the evidence graphs, aiming to capture distant interactions between entities. Experimental results on both the DocRED and Re-DocRED datasets demonstrate that our model improves entity RE performance compared to existing work.
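The entity-level aggregation step described in the abstract can be illustrated with a minimal sketch. All names and the input format below are hypothetical, not taken from the paper's actual implementation: mentions found in evidence sentences are grouped by the entity they refer to, and an edge is added between two entities whenever their mentions co-occur in the same evidence sentence, approximating the "distant interactions" the entity-level graph is meant to capture.

```python
from collections import defaultdict

# Hypothetical sketch of mention-to-entity aggregation over evidence
# sentences. Each evidence sentence is a list of (mention, entity_id)
# pairs; the real ECRG model operates on learned graph representations,
# not on this symbolic structure.

def build_entity_graph(evidence_sentences):
    """Return (entity -> mention set, set of co-occurrence edges)."""
    entity_mentions = defaultdict(set)
    edges = set()
    for sentence in evidence_sentences:
        entities_here = set()
        for mention, entity_id in sentence:
            entity_mentions[entity_id].add(mention)  # aggregate mentions per entity
            entities_here.add(entity_id)
        # connect every pair of entities co-occurring in this evidence sentence
        for a in entities_here:
            for b in entities_here:
                if a < b:
                    edges.add((a, b))
    return dict(entity_mentions), edges

# Example: two evidence sentences mentioning three (invented) entities
evidence = [
    [("Barack Obama", "E1"), ("Hawaii", "E2")],
    [("Obama", "E1"), ("United States", "E3")],
]
mentions, edges = build_entity_graph(evidence)
# E1 aggregates both of its surface mentions; edges link co-occurring entities
```

Note that in the actual model these graphs carry contextual embeddings and are processed by graph neural layers; the sketch only shows the structural aggregation idea.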
ISSN: 2376-5992
DOI: 10.7717/peerj-cs.2123