Empowering Few-Shot Relation Extraction with The Integration of Traditional RE Methods and Large Language Models
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Summary: Few-Shot Relation Extraction (FSRE), a subtask of Relation Extraction (RE) that utilizes limited training instances, has attracted growing attention from researchers in Natural Language Processing (NLP) due to its capability to extract textual information in extremely low-resource scenarios. The primary methodologies employed for FSRE have been fine-tuning or prompt-tuning techniques based on Pre-trained Language Models (PLMs). Recently, the emergence of Large Language Models (LLMs) has prompted numerous researchers to explore FSRE through In-Context Learning (ICL). However, there are substantial limitations to methods based on either traditional RE models or LLMs: traditional RE models are hampered by a lack of necessary prior knowledge, while LLMs fall short in their task-specific capabilities for RE. To address these shortcomings, we propose a Dual-System Augmented Relation Extractor (DSARE), which synergistically combines traditional RE models with LLMs. Specifically, DSARE innovatively injects the prior knowledge of LLMs into traditional RE models, and conversely enhances LLMs' task-specific aptitude for RE through relation extraction augmentation. Moreover, an Integrated Prediction module is employed to jointly consider these two respective predictions and derive the final results. Extensive experiments demonstrate the efficacy of our proposed method.
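To make the dual-system idea concrete, below is a minimal Python sketch of how two such predictors might be reconciled: a traditional RE classifier produces per-relation scores, an LLM is queried with in-context demonstrations, and an integrated step combines the two outputs. All names here (the relation inventory, llm_icl_predict, the confidence threshold, and the tie-breaking rule) are illustrative assumptions for exposition, not the paper's actual DSARE implementation.

```python
# Sketch of the dual-system idea from the abstract, under assumed names:
# a traditional RE classifier and an LLM-based in-context predictor each
# propose a relation label, and an integrated-prediction step reconciles them.

RELATIONS = ["founded_by", "employee_of", "no_relation"]  # toy relation set

def traditional_re_predict(sentence: str) -> dict[str, float]:
    """Stand-in for a fine-tuned PLM classifier returning a score per relation.
    A real system would run a trained model; fixed scores keep this runnable."""
    return {"founded_by": 0.7, "employee_of": 0.2, "no_relation": 0.1}

def llm_icl_predict(sentence: str, demonstrations: list[tuple[str, str]]) -> str:
    """Stand-in for an LLM queried via In-Context Learning.
    The prompt template below is a plausible format, not the paper's."""
    prompt = "\n".join(f"Sentence: {s}\nRelation: {r}" for s, r in demonstrations)
    prompt += f"\nSentence: {sentence}\nRelation:"
    # A real call to an LLM API would go here; we return a fixed label.
    return "founded_by"

def integrated_prediction(sentence: str, demonstrations: list[tuple[str, str]]) -> str:
    """Hypothetical reconciliation rule: accept agreement outright; on
    disagreement, keep the classifier's label only if it is confident."""
    scores = traditional_re_predict(sentence)
    re_label = max(scores, key=scores.get)
    llm_label = llm_icl_predict(sentence, demonstrations)
    if re_label == llm_label:
        return re_label
    return re_label if scores[re_label] > 0.6 else llm_label

demos = [("Jobs started Apple in 1976.", "founded_by")]
print(integrated_prediction("Gates founded Microsoft.", demos))  # -> founded_by
```

The confidence-threshold fallback is only one possible integration strategy; the paper's Integrated Prediction module jointly considers both predictions, and the exact rule should be taken from the paper itself.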
DOI: 10.48550/arxiv.2407.08967