Prompt-Learning for Cross-Lingual Relation Extraction
Saved in:
Main authors: , , , , , , ,
Format: Article
Language: English
Keywords:
Online access: Order full text
Abstract: Relation Extraction (RE) is a crucial task in Information Extraction, which
entails predicting relationships between entities within a given sentence.
However, extending pre-trained RE models to other languages is challenging,
particularly in real-world scenarios where Cross-Lingual Relation Extraction
(XRE) is required. Despite recent advancements in Prompt-Learning, which
involves transferring knowledge from Multilingual Pre-trained Language Models
(PLMs) to diverse downstream tasks, there is limited research on the effective
use of multilingual PLMs with prompts to improve XRE. In this paper, we present
a novel XRE algorithm based on Prompt-Tuning, referred to as Prompt-XRE. To
evaluate its effectiveness, we design and implement several prompt templates,
including hard, soft, and hybrid prompts, and empirically test their
performance on competitive multilingual PLMs, specifically mBART. Our extensive
experiments, conducted on the low-resource ACE05 benchmark across multiple
languages, demonstrate that our Prompt-XRE algorithm significantly outperforms
both vanilla multilingual PLMs and other existing models, achieving
state-of-the-art performance in XRE. To further demonstrate the generalization
of our Prompt-XRE at larger data scales, we construct and release a new XRE
dataset, WMT17-EnZh XRE, containing 0.9M English-Chinese pairs extracted from
the WMT 2017 parallel corpus. Experiments on WMT17-EnZh XRE also show the
effectiveness of our Prompt-XRE against other competitive baselines. The code
and newly constructed dataset are freely available at
https://github.com/HSU-CHIA-MING/Prompt-XRE.
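As a rough illustration of the three template styles the abstract names, the sketch below builds a hard prompt (a fixed textual template), a soft prompt (trainable virtual-token embeddings), and their hybrid combination on top of an mBART checkpoint via Hugging Face Transformers. The template wording, the virtual-token count, the checkpoint choice (facebook/mbart-large-50), the verbalizer label, and all helper names are illustrative assumptions rather than the paper's actual configuration; the authors' implementation is in the linked repository.

```python
# Minimal sketch (NOT the paper's implementation) of hard, soft, and
# hybrid prompt templates for relation extraction with mBART.
import torch
import torch.nn as nn
from transformers import MBart50TokenizerFast, MBartForConditionalGeneration

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50")
tokenizer = MBart50TokenizerFast.from_pretrained(
    "facebook/mbart-large-50", src_lang="en_XX", tgt_lang="en_XX"
)

def hard_prompt(sentence: str, head: str, tail: str) -> str:
    # Hard prompt: a fixed natural-language template around the two
    # entity mentions (template wording is an assumption).
    return f"{sentence} The relation between {head} and {tail} is <mask>."

class SoftPrompt(nn.Module):
    # Soft prompt: trainable virtual-token embeddings prepended to the
    # encoder's word embeddings; only these parameters are tuned.
    def __init__(self, n_tokens: int, d_model: int):
        super().__init__()
        self.prefix = nn.Parameter(torch.randn(n_tokens, d_model) * 0.02)

    def forward(self, word_embeds: torch.Tensor) -> torch.Tensor:
        batch = word_embeds.size(0)
        prefix = self.prefix.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prefix, word_embeds], dim=1)

# Hybrid prompt = trainable soft prefix + hard textual template.
sentence = "Steve Jobs founded Apple in Cupertino."
enc = tokenizer(hard_prompt(sentence, "Steve Jobs", "Apple"), return_tensors="pt")

soft = SoftPrompt(n_tokens=10, d_model=model.config.d_model)
# Note: this glosses over mBART's internal embedding scaling.
word_embeds = model.get_input_embeddings()(enc.input_ids)
inputs_embeds = soft(word_embeds)
attention_mask = torch.cat(
    [torch.ones(1, 10, dtype=enc.attention_mask.dtype), enc.attention_mask], dim=1
)

# Score a relation label as a short decoder target (verbalizer assumed).
labels = tokenizer(text_target="founder of", return_tensors="pt").input_ids

for p in model.parameters():      # prompt-tuning: keep the PLM frozen,
    p.requires_grad_(False)       # optimize only soft.prefix
loss = model(inputs_embeds=inputs_embeds,
             attention_mask=attention_mask, labels=labels).loss
loss.backward()
```

In this setup a hard prompt uses the template alone, a soft prompt uses only the trainable prefix, and the hybrid variant above combines both; at inference, candidate relation labels can be scored by their decoder loss and the lowest-loss label predicted.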
DOI: 10.48550/arxiv.2304.10354