Logical Entity Representation in Knowledge-Graphs for Differentiable Rule Learning
Probabilistic logical rule learning has shown great strength in logical rule mining and knowledge graph completion. It learns logical rules to predict missing edges by reasoning on existing edges in the knowledge graph. However, previous efforts have largely been limited to only modeling chain-like...
Saved in:
Main Authors: | Han, Chi; He, Qizheng; Yu, Charles; Du, Xinya; Tong, Hanghang; Ji, Heng |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Computer Science - Artificial Intelligence; Computer Science - Learning; Computer Science - Logic in Computer Science |
Online Access: | Order full text |
container_end_page | |
---|---|
container_issue | |
container_start_page | |
container_title | |
container_volume | |
creator | Han, Chi ; He, Qizheng ; Yu, Charles ; Du, Xinya ; Tong, Hanghang ; Ji, Heng |
description | Probabilistic logical rule learning has shown great strength in logical rule
mining and knowledge graph completion. It learns logical rules to predict
missing edges by reasoning on existing edges in the knowledge graph. However,
previous efforts have largely been limited to only modeling chain-like Horn
clauses such as $R_1(x,z)\land R_2(z,y)\Rightarrow H(x,y)$. This formulation
overlooks additional contextual information from neighboring sub-graphs of
entity variables $x$, $y$ and $z$. Intuitively, there is a large gap here, as
local sub-graphs have been found to provide important information for knowledge
graph completion. Inspired by these observations, we propose Logical Entity
RePresentation (LERP) to encode contextual information of entities in the
knowledge graph. A LERP is designed as a vector of probabilistic logical
functions on the entity's neighboring sub-graph. It is an interpretable
representation while allowing for differentiable optimization. We can then
incorporate LERP into probabilistic logical rule learning to learn more
expressive rules. Empirical results demonstrate that with LERP, our model
outperforms other rule learning methods in knowledge graph completion and is
comparable or even superior to state-of-the-art black-box methods. Moreover, we
find that our model can discover a more expressive family of logical rules.
LERP can also be further combined with embedding learning methods like TransE
to make it more interpretable. |
doi_str_mv | 10.48550/arxiv.2305.12738 |
format | Article |
creationdate | 2023-05-22 |
rights | http://creativecommons.org/licenses/by/4.0 |
oa | free_for_read |
backlink | https://arxiv.org/abs/2305.12738 |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2305.12738 |
ispartof | |
issn | |
language | eng |
recordid | cdi_arxiv_primary_2305_12738 |
source | arXiv.org |
subjects | Computer Science - Artificial Intelligence ; Computer Science - Learning ; Computer Science - Logic in Computer Science |
title | Logical Entity Representation in Knowledge-Graphs for Differentiable Rule Learning |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-24T09%3A29%3A42IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Logical%20Entity%20Representation%20in%20Knowledge-Graphs%20for%20Differentiable%20Rule%20Learning&rft.au=Han,%20Chi&rft.date=2023-05-22&rft_id=info:doi/10.48550/arxiv.2305.12738&rft_dat=%3Carxiv_GOX%3E2305_12738%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |
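Note on the abstract above: the rule format it describes, $R_1(x,z)\land R_2(z,y)\Rightarrow H(x,y)$, is usually made differentiable by representing each relation as an adjacency matrix over entities, so that the rule body reduces to a matrix product. The sketch below is a minimal illustration of that standard TensorLog / Neural LP-style trick, not the paper's LERP implementation; the toy graph, relation names, and the rule weight are invented for the example.

```python
# Minimal sketch of differentiable chain-rule scoring over a knowledge graph,
# in the spirit of TensorLog / Neural LP. The toy graph, relation names, and
# the rule weight are invented for illustration; this is NOT the LERP code.
import numpy as np

num_entities = 4  # 0: Ada, 1: London, 2: UK, 3: unused

# One adjacency matrix per relation: M_R[i, j] = 1 iff R(entity_i, entity_j).
born_in = np.zeros((num_entities, num_entities))
city_of = np.zeros((num_entities, num_entities))
born_in[0, 1] = 1.0  # born_in(Ada, London)
city_of[1, 2] = 1.0  # city_of(London, UK)

# Chain rule body: exists z such that born_in(x, z) and city_of(z, y).
# The matrix product sums over the intermediate variable z, so entry (x, y)
# counts the paths matching the body -- an operation that stays differentiable
# with respect to a learnable rule confidence weight.
rule_weight = 0.8                   # would be a trainable parameter in practice
body_paths = born_in @ city_of      # soft truth values of the rule body
scores = rule_weight * body_paths   # scores for H(x, y) = nationality(x, y)

print(scores[0, 2])  # 0.8 -> the rule predicts nationality(Ada, UK)
```

LERP, per the abstract, augments this chain-only formulation with learned probabilistic logical functions over each entity's neighboring sub-graph, and can further be combined with embedding methods such as TransE; the sketch only covers the differentiable chain-rule scoring step that such approaches share.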