Question-guided Knowledge Graph Re-scoring and Injection for Knowledge Graph Question Answering
Format: Article
Language: English
Abstract: Knowledge graph question answering (KGQA) involves answering natural language questions by leveraging structured information stored in a knowledge graph. Typically, a KGQA system first retrieves a targeted subgraph from a large-scale knowledge graph, which serves as the basis for reasoning models to address queries. However, the retrieved subgraph inevitably introduces distracting information, impeding the model's ability to perform accurate reasoning. To address this issue, we propose a Question-guided Knowledge Graph Re-scoring method (Q-KGR) that eliminates noisy pathways irrelevant to the input question, thereby focusing on pertinent factual knowledge. Moreover, we introduce Knowformer, a parameter-efficient method for injecting the re-scored knowledge graph into large language models to enhance their ability to perform factual reasoning. Extensive experiments on multiple KGQA benchmarks demonstrate the superiority of our method over existing systems.
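The abstract does not specify how the re-scoring is computed. Purely as an illustration of the general idea of question-guided re-scoring, the minimal sketch below assigns each retrieved subgraph edge a relevance weight from the cosine similarity between a question embedding and an edge embedding, and prunes low-weight edges. The embeddings, the threshold, and the function names are hypothetical stand-ins; the paper's actual Q-KGR scorer and the Knowformer injection mechanism are not reproduced here.

```python
# Illustrative sketch only: similarity-based edge re-scoring as an assumed
# stand-in for Q-KGR; all names and embeddings below are hypothetical.
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def rescore_subgraph(question_emb: np.ndarray,
                     edges: list,
                     edge_embs: dict,
                     threshold: float = 0.2):
    """Weight every edge of the retrieved subgraph by its relevance to the
    question and drop edges below the threshold (pruning noisy pathways)."""
    kept = []
    for edge in edges:
        weight = cosine(question_emb, edge_embs[edge])
        if weight >= threshold:
            kept.append((edge, weight))
    return kept

# Toy usage with random vectors standing in for a real question/edge encoder.
rng = np.random.default_rng(0)
edges = [("q_entity", "born_in", "city_A"), ("city_A", "twinned_with", "city_B")]
edge_embs = {e: rng.normal(size=16) for e in edges}
question_emb = rng.normal(size=16)
print(rescore_subgraph(question_emb, edges, edge_embs))
```

In the paper's pipeline, the surviving (re-scored) subgraph would then be injected into the language model via Knowformer rather than simply printed, but that step is beyond this sketch.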
DOI: 10.48550/arxiv.2410.01401