CRE-LLM: A Domain-Specific Chinese Relation Extraction Framework with Fine-tuned Large Language Model
Saved in:
Main Authors: | , |
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Abstract: | Domain-Specific Chinese Relation Extraction (DSCRE) aims to extract relations between entities from domain-specific Chinese text. Despite the rapid development of pre-trained language models (PLMs) in recent years, especially large language models (LLMs), DSCRE still faces three core challenges: complex network structure design, poor logic-awareness, and the high cost of fine-tuning. Given the impressive performance of LLMs in natural language processing, we propose a new framework called CRE-LLM, based on fine-tuning open-source LLMs such as Llama-2, ChatGLM2, and Baichuan2. CRE-LLM enhances the logic-awareness and generative capabilities of the model by constructing an appropriate prompt and applying instruction-supervised fine-tuning to the open-source LLMs. It then directly extracts the relations of the given entities from the input text, which improves the CRE approach. To demonstrate the effectiveness of the proposed framework, we conducted extensive experiments on two domain-specific CRE datasets, FinRE and SanWen. The experimental results show that CRE-LLM is significantly superior and robust, achieving state-of-the-art (SOTA) performance on the FinRE dataset. This paper thus introduces a novel approach to semantically complex DSCRE tasks that combines LLMs with triples. Our code is publicly available. |
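
The abstract describes the mechanism only at a high level: a prompt presents the text and an entity pair, and the fine-tuned model generates the relation directly. The following is a minimal sketch of what one instruction-supervised fine-tuning sample for this setup might look like; the prompt template, example sentence, and relation label set are illustrative assumptions, not the paper's actual format.

```python
# Sketch of building one instruction-tuning sample for Chinese relation
# extraction between two given entities. The template and the FinRE-style
# relation labels below are hypothetical, not the CRE-LLM specification.

RELATIONS = ["增持", "减持", "合作", "竞争", "unknown"]  # assumed label inventory

def build_sample(sentence: str, head: str, tail: str, relation: str) -> dict:
    """Pack one supervised example in the instruction/input/output layout
    commonly used for instruction fine-tuning of open-source LLMs."""
    instruction = (
        "给定一段中文金融文本和其中的两个实体，"
        f"从候选关系 {RELATIONS} 中选择实体对之间的关系，直接输出关系名。"
    )
    return {
        "instruction": instruction,
        "input": f"文本：{sentence}\n头实体：{head}\n尾实体：{tail}",
        # The model is trained to generate the relation string directly,
        # so no task-specific classification head is needed.
        "output": relation,
    }

if __name__ == "__main__":
    sample = build_sample(
        sentence="公司A宣布增持公司B百分之五的股份。",
        head="公司A",
        tail="公司B",
        relation="增持",
    )
    print(sample)
```

At inference time, the same prompt is filled with an unlabeled sentence and entity pair, and the generated string is matched against the candidate relations to recover the (head, relation, tail) triple.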
DOI: | 10.48550/arxiv.2404.18085 |