Subgraph-Aware Few-Shot Inductive Link Prediction Via Meta-Learning



Bibliographic Details
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-06, Vol. 35 (6), p. 6512-6517
Main authors: Zheng, Shuangjia; Mai, Sijie; Sun, Ya; Hu, Haifeng; Yang, Yuedong
Format: Article
Language: English
Description
Summary: Link prediction for knowledge graphs aims to predict missing connections between entities. Prevailing methods are limited to a transductive setting and can hardly process unseen entities. The recently proposed subgraph-based models provide alternatives to predict links from the subgraph structure surrounding a candidate triplet. However, these methods require abundant known facts of training triplets and perform poorly on relationships that only have a few triplets. In this paper, we propose Meta-iKG, a novel subgraph-based meta-learner for few-shot inductive relation reasoning. Meta-iKG utilizes local subgraphs to transfer subgraph-specific information and to rapidly learn transferable patterns via meta-gradients. In this way, we find the model can quickly adapt to few-shot relationships using only a handful of known facts in inductive settings. Moreover, we introduce a large-shot relation updating procedure to ensure that our model can generalize well to both few-shot and large-shot relations. We evaluate Meta-iKG on inductive benchmarks sampled from NELL and Freebase, and the results show that Meta-iKG outperforms the current state-of-the-art methods in both few-shot scenarios and standard inductive settings.
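The "rapidly learn transferable patterns via meta-gradients" idea in the abstract follows the general MAML-style pattern: adapt a shared initialization to each relation on a few support triplets (inner loop), then update the initialization using the post-adaptation loss (outer loop). Below is a minimal first-order sketch of that pattern on a toy linear model; the function names, the toy task, and all hyperparameters are illustrative assumptions, not taken from the paper, which applies this scheme to subgraph GNN scoring.

```python
import numpy as np

def loss_and_grad(w, X, y):
    # Squared-error loss of a linear model and its gradient w.r.t. w
    # (stand-in for the subgraph scoring model's loss in the paper).
    err = X @ w - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def maml_step(w, tasks, inner_lr=0.1, outer_lr=0.05, inner_steps=3):
    """One meta-update over a batch of tasks (relations).

    Each task is a (X_support, y_support, X_query, y_query) tuple:
    the inner loop adapts a copy of w on the few support examples
    (the 'few-shot' adaptation), and the outer loop updates the
    shared initialization with a first-order meta-gradient taken
    from the adapted model's loss on the query examples.
    """
    meta_grad = np.zeros_like(w)
    for X_s, y_s, X_q, y_q in tasks:
        w_task = w.copy()
        for _ in range(inner_steps):       # rapid per-relation adaptation
            _, g = loss_and_grad(w_task, X_s, y_s)
            w_task -= inner_lr * g
        _, g_q = loss_and_grad(w_task, X_q, y_q)   # post-adaptation loss
        meta_grad += g_q                   # first-order approximation
    return w - outer_lr * meta_grad / len(tasks)
```

The first-order approximation shown here drops the second-derivative term of full MAML; the key point is only the two-level structure, where the outer update is driven by performance *after* adaptation rather than by the raw training loss.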
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2022.3177212