Logical Message Passing Networks with One-hop Inference on Atomic Formulas

Complex Query Answering (CQA) over Knowledge Graphs (KGs) has attracted significant attention for its potential to support many applications. Because KGs are usually incomplete, neural models have been proposed to answer logical queries by parameterizing set operators with complex neural networks. However, such methods usually train the neural set operators together with a large number of entity and relation embeddings from scratch, so it remains unclear whether, and how much, the embeddings or the neural set operators contribute to the performance. In this paper, we propose a simple framework for complex query answering that decouples the KG embeddings from the neural set operators. We represent complex queries as query graphs. On top of the query graph, we propose the Logical Message Passing Neural Network (LMPNN), which connects local one-hop inferences on atomic formulas to global logical reasoning for complex query answering. We leverage existing effective KG embeddings to conduct one-hop inferences on atomic formulas, and the results are regarded as the messages passed in the LMPNN. The reasoning process over the overall logical formula becomes the forward pass of the LMPNN, which incrementally aggregates local information to finally predict the answers' embeddings. Complex logical inference across different types of queries is then learned from training examples on top of the LMPNN architecture. Theoretically, our query-graph representation is more general than the prevailing operator-tree formulation, so our approach applies to a broader range of complex KG queries. Empirically, our approach yields a new state-of-the-art neural CQA model. Our research bridges the gap between complex KG query-answering tasks and the long-standing achievements of knowledge graph representation learning.
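
Since the abstract compresses the whole architecture into a few sentences, a small code sketch may help. The following is a minimal, hypothetical PyTorch rendering of the idea, not the authors' released implementation: the `QueryGraph` container, the TransE-style translation used for the one-hop inference step, and the mean-then-MLP aggregation are all simplifying assumptions made here for illustration (the paper plugs in existing, more expressive KG embeddings).

```python
# Minimal LMPNN-style sketch (illustrative only; assumes a TransE-style
# pretrained embedding, while the paper supports other KG embeddings).
import torch
import torch.nn as nn


class QueryGraph:
    """A complex query as a graph: nodes are constants (anchor entities)
    or variables; each edge is an atomic formula r(head, tail)."""

    def __init__(self, num_nodes, edges, constants):
        self.num_nodes = num_nodes    # nodes are ids 0..num_nodes-1
        self.edges = edges            # list of (head, relation_id, tail)
        self.constants = constants    # {node_id: entity_id} for anchors


class LMPNNSketch(nn.Module):
    def __init__(self, entity_emb, relation_emb, dim, num_layers=2):
        super().__init__()
        self.entity_emb = entity_emb      # pretrained, ideally frozen
        self.relation_emb = relation_emb
        # Learned aggregation: combine a node's state with its mean message.
        self.update = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        self.dim = dim
        self.num_layers = num_layers

    def one_hop_message(self, state, rel_id, forward):
        # One-hop inference on the atomic formula r(h, t). With a
        # TransE-style embedding t ≈ h + r, so the message toward the tail
        # is h + r and the message toward the head is t - r.
        r = self.relation_emb(torch.tensor(rel_id))
        return state + r if forward else state - r

    def forward(self, g):
        # Initialize: constants get their pretrained embedding, variables zero.
        states = torch.stack([
            self.entity_emb(torch.tensor(g.constants[v]))
            if v in g.constants else torch.zeros(self.dim)
            for v in range(g.num_nodes)
        ])
        for _ in range(self.num_layers):
            inbox = [[] for _ in range(g.num_nodes)]
            for h, r, t in g.edges:
                inbox[t].append(self.one_hop_message(states[h], r, forward=True))
                inbox[h].append(self.one_hop_message(states[t], r, forward=False))
            states = torch.stack([
                self.update(torch.cat([states[v], torch.stack(inbox[v]).mean(0)]))
                if inbox[v] else states[v]
                for v in range(g.num_nodes)
            ])
        return states  # the free variable's row is the predicted answer embedding


# Usage on the 2-hop query  ?y : exists x . r1(e7, x) and r2(x, y)
# node 0 = anchor e7, node 1 = existential variable x, node 2 = answer y.
g = QueryGraph(num_nodes=3, edges=[(0, 1, 1), (1, 2, 2)], constants={0: 7})
model = LMPNNSketch(nn.Embedding(100, 32), nn.Embedding(10, 32), dim=32)
answer = model(g)[2]  # rank all entities by similarity to this embedding
```

Note how the decoupling the abstract describes shows up even in this toy version: the pretrained embeddings perform all one-hop reasoning on atomic formulas, and the only learned component is the small update network that aggregates those local messages, so complex-query behavior is learned without training entity embeddings from scratch.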

Bibliographic Details
Published in: arXiv.org, 2023-03
Main Authors: Wang, Zihao; Song, Yangqiu; Wong, Ginny Y; See, Simon
Format: Article
Language: English
EISSN: 2331-8422
Subjects: Cognition & reasoning; Graph representations; Graphical representations; Inference; Knowledge representation; Message passing; Neural networks; Operators; Queries; Reasoning; Task complexity
Online Access: Full text