A Novel Joint Training Model for Knowledge Base Question Answering
Saved in:
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2024, Vol. 32, pp. 666-679
Main Authors: ,
Format: Article
Language: English
Subjects:
Online Access: Order full text
Summary: In knowledge base question answering (KBQA) systems, relation detection and entity recognition are two core components. However, relation detection is the harder of the two: KBQA involves thousands of relations, and the task often becomes a zero-shot learning problem because relations in some test samples never appear in the training data. In addition, previous studies treated the two tasks separately and did not take full advantage of their correlation. This article proposes a novel relation and entity joint extraction framework, named the Gated-Attention-based Joint Training Model (Ga-JTM), to integrate relation detection and entity recognition. To train the two models simultaneously, a knowledge-driven gated unit based on a multi-head attention mechanism is designed; it combines knowledge graph embeddings with the current contextual semantic information to process the relation detection and entity recognition tasks, respectively. Experiments are conducted on a single-relation dataset (SimpleQuestions) and a multiple-relation dataset (WebQSP), and the results demonstrate that Ga-JTM surpasses the state-of-the-art (SOTA) performance on relation detection and improves entity recognition and entity linking. These improvements contribute to the SOTA performance of our KBQA system.
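The summary describes a knowledge-driven gated unit that uses multi-head attention to combine knowledge graph embeddings with contextual semantics. The sketch below is only an illustration of that general idea, not the paper's actual architecture: all dimensions, parameter names (`W_q`, `W_g`, etc.), and the sigmoid-gate formulation are assumptions chosen for clarity.

```python
import numpy as np

# Hypothetical dimensions for illustration only; the paper does not
# specify these shapes in the abstract.
D, N_HEADS, SEQ_LEN, N_KG = 16, 4, 5, 7

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, keys, values, n_heads, W_q, W_k, W_v, W_o):
    """Scaled dot-product attention, split across n_heads heads."""
    d = W_q.shape[1]
    d_h = d // n_heads
    Q, K, V = query @ W_q, keys @ W_k, values @ W_v
    heads = []
    for h in range(n_heads):
        s = slice(h * d_h, (h + 1) * d_h)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_h)  # (Lq, Lk)
        heads.append(softmax(scores) @ V[:, s])
    return np.concatenate(heads, axis=-1) @ W_o

def gated_fusion(context, kg_emb, p):
    """Attend from question tokens over KG embeddings, then mix the
    attended knowledge with the context through a sigmoid gate."""
    attended = multi_head_attention(context, kg_emb, kg_emb, N_HEADS,
                                    p["W_q"], p["W_k"], p["W_v"], p["W_o"])
    gate = 1.0 / (1.0 + np.exp(-np.concatenate([context, attended], axis=-1) @ p["W_g"]))
    return gate * context + (1.0 - gate) * attended  # (Lq, D)

# Random parameters and inputs stand in for learned weights and encoder outputs.
params = {k: rng.normal(scale=0.1, size=(D, D)) for k in ("W_q", "W_k", "W_v", "W_o")}
params["W_g"] = rng.normal(scale=0.1, size=(2 * D, D))

context = rng.normal(size=(SEQ_LEN, D))  # token-level question representations
kg_emb = rng.normal(size=(N_KG, D))      # knowledge graph embeddings
fused = gated_fusion(context, kg_emb, params)
print(fused.shape)
```

The fused representation could then feed two task-specific heads (relation detection and entity recognition), which is one plausible way to realize the joint training the summary describes.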
ISSN: 2329-9290, 2329-9304
DOI: 10.1109/TASLP.2023.3336526