Logical Entity Representation in Knowledge-Graphs for Differentiable Rule Learning

Probabilistic logical rule learning has shown great strength in logical rule mining and knowledge graph completion. It learns logical rules to predict missing edges by reasoning on existing edges in the knowledge graph. However, previous efforts have largely been limited to only modeling chain-like Horn clauses such as \(R_1(x,z)\land R_2(z,y)\Rightarrow H(x,y)\). This formulation overlooks additional contextual information from neighboring sub-graphs of entity variables \(x\), \(y\) and \(z\). Intuitively, there is a large gap here, as local sub-graphs have been found to provide important information for knowledge graph completion. Inspired by these observations, we propose Logical Entity RePresentation (LERP) to encode contextual information of entities in the knowledge graph. A LERP is designed as a vector of probabilistic logical functions on the entity's neighboring sub-graph. It is an interpretable representation while allowing for differentiable optimization. We can then incorporate LERP into probabilistic logical rule learning to learn more expressive rules. Empirical results demonstrate that with LERP, our model outperforms other rule learning methods in knowledge graph completion and is comparable or even superior to state-of-the-art black-box methods. Moreover, we find that our model can discover a more expressive family of logical rules. LERP can also be further combined with embedding learning methods like TransE to make it more interpretable.
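The chain-like Horn clause \(R_1(x,z)\land R_2(z,y)\Rightarrow H(x,y)\) mentioned in the abstract has a well-known differentiable reading: if each relation is encoded as an adjacency matrix, composing two relations along the chain corresponds to a matrix product, and a learnable confidence weight makes the rule score differentiable. The following toy sketch illustrates that general idea (the graph, entity count, and confidence value are invented for illustration; this is not the paper's implementation):

```python
import numpy as np

# Toy knowledge graph: 4 entities, two relations, each encoded as an
# adjacency matrix with M_R[i, j] = 1.0 iff R(entity_i, entity_j) holds.
n = 4
M_R1 = np.zeros((n, n))
M_R1[0, 1] = 1.0
M_R1[2, 3] = 1.0
M_R2 = np.zeros((n, n))
M_R2[1, 2] = 1.0
M_R2[3, 0] = 1.0

# The chain rule R1(x,z) ∧ R2(z,y) ⇒ H(x,y) corresponds to the product
# M_R1 @ M_R2: entry (x, y) counts the intermediate entities z that
# connect x to y. Scaling by a rule confidence w in [0, 1] (a trainable
# parameter in a real rule learner) keeps the score differentiable.
w = 0.9  # placeholder confidence, not a learned value
scores = w * (M_R1 @ M_R2)

# H(0, 2) is supported because R1(0,1) and R2(1,2) both hold.
print(scores[0, 2])  # 0.9
```

This composition-as-matrix-product view is the standard backbone of differentiable rule learning; the paper's contribution, per the abstract, is to enrich exactly this formulation with contextual entity representations.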

Detailed Description

Saved in:
Bibliographic Details
Published in: arXiv.org 2023-05
Main Authors: Han, Chi, He, Qizheng, Yu, Charles, Du, Xinya, Tong, Hanghang, Ji, Heng
Format: Article
Language: eng
Subjects:
Online Access: Full text
container_title arXiv.org
creator Han, Chi ; He, Qizheng ; Yu, Charles ; Du, Xinya ; Tong, Hanghang ; Ji, Heng
description Probabilistic logical rule learning has shown great strength in logical rule mining and knowledge graph completion. It learns logical rules to predict missing edges by reasoning on existing edges in the knowledge graph. However, previous efforts have largely been limited to only modeling chain-like Horn clauses such as \(R_1(x,z)\land R_2(z,y)\Rightarrow H(x,y)\). This formulation overlooks additional contextual information from neighboring sub-graphs of entity variables \(x\), \(y\) and \(z\). Intuitively, there is a large gap here, as local sub-graphs have been found to provide important information for knowledge graph completion. Inspired by these observations, we propose Logical Entity RePresentation (LERP) to encode contextual information of entities in the knowledge graph. A LERP is designed as a vector of probabilistic logical functions on the entity's neighboring sub-graph. It is an interpretable representation while allowing for differentiable optimization. We can then incorporate LERP into probabilistic logical rule learning to learn more expressive rules. Empirical results demonstrate that with LERP, our model outperforms other rule learning methods in knowledge graph completion and is comparable or even superior to state-of-the-art black-box methods. Moreover, we find that our model can discover a more expressive family of logical rules. LERP can also be further combined with embedding learning methods like TransE to make it more interpretable.
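The description characterizes a LERP as "a vector of probabilistic logical functions on the entity's neighboring sub-graph." As a rough illustration of that idea only (the entity count, edge probabilities, feature choice, and function names below are all hypothetical, not the paper's construction), one can compute soft truth values of existential patterns such as \(\exists z: R(x,z)\) over a probabilistic adjacency structure:

```python
import numpy as np

# Hypothetical soft adjacency: P[r][i, j] is the probability that
# relation r links entity i to entity j (4 entities, 2 relations).
P = {
    "R1": np.array([[0.0, 0.8, 0.5, 0.0],
                    [0.0, 0.0, 0.0, 0.0],
                    [0.0, 0.0, 0.0, 0.9],
                    [0.0, 0.0, 0.0, 0.0]]),
    "R2": np.array([[0.0, 0.0, 0.0, 0.5],
                    [0.0, 0.0, 0.7, 0.0],
                    [0.0, 0.0, 0.0, 0.0],
                    [0.6, 0.0, 0.0, 0.0]]),
}

def soft_exists(row):
    """Soft truth of ∃z: R(x, z): a probabilistic OR, 1 - ∏(1 - p)."""
    return 1.0 - np.prod(1.0 - row)

def lerp_vector(x):
    """One soft logical feature per relation, evaluated on entity x's
    outgoing neighborhood. A real model would learn which logical
    patterns to evaluate and compose them differentiably."""
    return np.array([soft_exists(P[r][x]) for r in sorted(P)])

print(lerp_vector(0))  # ≈ [0.9, 0.5]
```

Each coordinate stays interpretable (it is the soft truth of a named logical pattern) while remaining a smooth function of the edge probabilities, which matches the abstract's claim of an interpretable yet differentiable representation.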
format Article
publisher Ithaca: Cornell University Library, arXiv.org
date 2023-05-22
rights 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2023-05
issn 2331-8422
language eng
recordid cdi_proquest_journals_2817858707
source Free E-Journals
subjects Graph theory
Graphical representations
Graphs
Knowledge
Knowledge representation
Learning
Optimization
Teaching methods
title Logical Entity Representation in Knowledge-Graphs for Differentiable Rule Learning
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-03T10%3A53%3A40IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Logical%20Entity%20Representation%20in%20Knowledge-Graphs%20for%20Differentiable%20Rule%20Learning&rft.jtitle=arXiv.org&rft.au=Han,%20Chi&rft.date=2023-05-22&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2817858707%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2817858707&rft_id=info:pmid/&rfr_iscdi=true