Representation iterative fusion based on heterogeneous graph neural network for joint entity and relation extraction


Bibliographic details
Published in: Knowledge-Based Systems, 2021-05, Vol. 219, p. 106888, Article 106888
Authors: Zhao, Kang; Xu, Hua; Cheng, Yue; Li, Xiaoteng; Gao, Kai
Format: Article
Language: English
Online access: Full text
Description
Abstract: Joint entity and relation extraction is an essential task in information extraction, which aims to extract all relational triples from unstructured text. However, few existing works consider the possible relations between entities before extracting them, so most of the extracted entities may not form valid triples. In this paper, we propose RIFRE, a representation iterative fusion model based on heterogeneous graph neural networks for relation extraction. We model relations and words as nodes on a graph and iteratively fuse the two types of semantic nodes through a message-passing mechanism to obtain node representations better suited to relation extraction. The model performs relation extraction once the node representations have been updated. We evaluate RIFRE on two public relation extraction datasets, NYT and WebNLG. The results show that RIFRE can effectively extract triples and achieves state-of-the-art performance. The code will be available at https://github.com/zhao9797/RIFRE. Moreover, RIFRE is also suitable for the relation classification task and significantly outperforms previous methods on the SemEval-2010 Task 8 dataset.
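The abstract describes the core mechanism only at a high level: words of the sentence and the relation types of the target schema are treated as two kinds of nodes on a heterogeneous graph, and their representations are fused over several rounds of message passing before triples are extracted. The sketch below illustrates that general idea in PyTorch; it is not the authors' exact RIFRE architecture — the attention-based message functions, the class names (HeteroFusionLayer, IterativeFusion), and all hyperparameters are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class HeteroFusionLayer(nn.Module):
    """One round of word <-> relation node fusion via attention-style message passing.

    A minimal sketch of the idea in the abstract; the attention form is an assumption.
    """

    def __init__(self, hidden_size: int, num_heads: int = 4):
        super().__init__()
        self.word_to_rel = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.rel_to_word = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.norm_rel = nn.LayerNorm(hidden_size)
        self.norm_word = nn.LayerNorm(hidden_size)

    def forward(self, word_nodes: torch.Tensor, rel_nodes: torch.Tensor):
        # Relation nodes attend over word nodes (word -> relation messages).
        rel_msg, _ = self.word_to_rel(rel_nodes, word_nodes, word_nodes)
        rel_nodes = self.norm_rel(rel_nodes + rel_msg)
        # Word nodes attend over the updated relation nodes (relation -> word messages).
        word_msg, _ = self.rel_to_word(word_nodes, rel_nodes, rel_nodes)
        word_nodes = self.norm_word(word_nodes + word_msg)
        return word_nodes, rel_nodes


class IterativeFusion(nn.Module):
    """Stack several fusion rounds; downstream heads would then tag subject/object
    spans per relation from the fused word representations to form triples."""

    def __init__(self, hidden_size: int, num_relations: int, num_layers: int = 2):
        super().__init__()
        # Relation-type nodes start from learned embeddings, one per relation in the schema.
        self.rel_embed = nn.Embedding(num_relations, hidden_size)
        self.layers = nn.ModuleList(
            HeteroFusionLayer(hidden_size) for _ in range(num_layers)
        )

    def forward(self, word_nodes: torch.Tensor):
        # word_nodes: (batch, seq_len, hidden) token representations from a sentence encoder.
        batch = word_nodes.size(0)
        rel_nodes = self.rel_embed.weight.unsqueeze(0).expand(batch, -1, -1)
        for layer in self.layers:
            word_nodes, rel_nodes = layer(word_nodes, rel_nodes)
        return word_nodes, rel_nodes
```

In this sketch the word-node inputs would come from a pretrained sentence encoder (e.g. BERT), and the fused outputs would feed whatever extraction heads the model uses to decide, per relation, which token spans form the subject and object of a triple.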
ISSN: 0950-7051, 1872-7409
DOI: 10.1016/j.knosys.2021.106888