InGram: Inductive Knowledge Graph Embedding via Relation Graphs

Inductive knowledge graph completion has been considered as the task of predicting missing triplets between new entities that are not observed during training. While most inductive knowledge graph completion methods assume that all entities can be new, they do not allow new relations to appear at inference time. This restriction prohibits the existing methods from appropriately handling real-world knowledge graphs where new entities accompany new relations. In this paper, we propose an INductive knowledge GRAph eMbedding method, InGram, that can generate embeddings of new relations as well as new entities at inference time. Given a knowledge graph, we define a relation graph as a weighted graph consisting of relations and the affinity weights between them. Based on the relation graph and the original knowledge graph, InGram learns how to aggregate neighboring embeddings to generate relation and entity embeddings using an attention mechanism. Experimental results show that InGram outperforms 14 different state-of-the-art methods on varied inductive learning scenarios.
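As a rough illustration of the relation graph described in the abstract, the sketch below builds a weighted graph over relations from a toy set of triplets, using the number of entities two relations share as the affinity weight. The toy triplets, the shared-entity weighting, and all names are illustrative assumptions; InGram's actual affinity definition and its attention-based aggregation are specified in the paper itself.

    from collections import defaultdict
    from itertools import combinations

    # Toy knowledge graph as (head, relation, tail) triplets -- illustrative, not from the paper.
    triplets = [
        ("alice", "works_at", "kaist"),
        ("bob",   "works_at", "kaist"),
        ("alice", "lives_in", "daejeon"),
        ("kaist", "located_in", "daejeon"),
    ]

    # Collect, for every relation, the set of entities it touches.
    entities_per_relation = defaultdict(set)
    for head, relation, tail in triplets:
        entities_per_relation[relation].update((head, tail))

    # Relation graph: nodes are relations; the edge weight between two relations
    # is the number of entities they share (a simple stand-in for the paper's
    # affinity weights).
    relation_graph = {}
    for r1, r2 in combinations(entities_per_relation, 2):
        shared = entities_per_relation[r1] & entities_per_relation[r2]
        if shared:
            relation_graph[(r1, r2)] = len(shared)

    print(relation_graph)
    # {('works_at', 'lives_in'): 1, ('works_at', 'located_in'): 1, ('lives_in', 'located_in'): 1}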

Bibliographic details
Main authors: Lee, Jaejun; Chung, Chanyoung; Whang, Joyce Jiyoung
Format: Article
Language: English
Subjects: Computer Science - Artificial Intelligence; Computer Science - Learning
Online access: https://arxiv.org/abs/2305.19987
DOI: 10.48550/arXiv.2305.19987
Date: 2023-05-31
Source: arXiv.org