Logical Message Passing Networks with One-hop Inference on Atomic Formulas
Saved in:
Main Authors: | , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Summary: | Complex Query Answering (CQA) over Knowledge Graphs (KGs) has
attracted considerable attention for its potential to support many
applications. Because KGs are usually incomplete, neural models have been
proposed to answer logical queries by parameterizing set operators with
complex neural networks. However, such methods typically train the neural set
operators together with a large number of entity and relation embeddings from
scratch, so whether and how much the embeddings or the neural set operators
contribute to the performance remains unclear. In this paper, we propose a
simple framework for complex query answering that decouples the KG embeddings
from the neural set operators. We represent complex queries as query graphs.
On top of the query graph, we propose the Logical Message Passing Neural
Network (LMPNN), which connects local one-hop inferences on atomic formulas
to global logical reasoning for complex query answering. We leverage
existing, effective KG embeddings to conduct one-hop inferences on atomic
formulas, and the results are treated as the messages passed in the LMPNN.
Reasoning over the overall logical formula is thus turned into a forward pass
of the LMPNN, which incrementally aggregates local information to finally
predict the answers' embeddings. Complex logical inference across different
types of queries is then learned from training examples on top of the LMPNN
architecture. Theoretically, our query-graph representation is more general
than the prevailing operator-tree formulation, so our approach applies to a
broader range of complex KG queries. Empirically, our approach yields a new
state-of-the-art neural CQA model. Our research bridges the gap between
complex KG query answering and the long-standing achievements of knowledge
graph representation learning. |
DOI: | 10.48550/arxiv.2301.08859 |
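To make the abstract's architecture concrete, the following is a minimal, hypothetical PyTorch sketch of an LMPNN-style forward pass, not the authors' implementation. It substitutes a TransE-style translation (tail ≈ head + relation) for the pretrained KG embedding's one-hop inference, uses a toy two-hop conjunctive query, and omits the paper's actual encoders, negation handling, and training loss; all names here (`one_hop_message`, `lmpnn_forward`, the toy query graph) are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
DIM, N_ENT, N_REL = 32, 100, 10

# Stand-ins for pretrained KG embeddings (frozen in the real setting).
ent_emb = nn.Embedding(N_ENT, DIM)
rel_emb = nn.Embedding(N_REL, DIM)

# Toy query graph for "find y such that r0(e5, x) AND r1(x, y)":
# node 0 is the constant entity 5, node 1 the existential variable x,
# node 2 the free (answer) variable y; edges are (head, relation, tail).
nodes = [("const", 5), ("var", None), ("var", None)]
edges = [(0, 0, 1), (1, 1, 2)]

# Learnable aggregation: combine a node's state with its summed messages.
agg = nn.Sequential(nn.Linear(2 * DIM, DIM), nn.ReLU(), nn.Linear(DIM, DIM))

def one_hop_message(state, rel, head_to_tail):
    """One-hop inference on an atomic formula r(h, t), here under a
    TransE-style assumption: predict t from h as h + r, or h from t as t - r."""
    r = rel_emb(torch.tensor(rel))
    return state + r if head_to_tail else state - r

def lmpnn_forward(n_layers=2):
    # Constants start from their entity embeddings, variables from zeros.
    state = [ent_emb(torch.tensor(e)) if kind == "const" else torch.zeros(DIM)
             for kind, e in nodes]
    for _ in range(n_layers):
        inbox = [[] for _ in nodes]
        for h, r, t in edges:  # each atomic formula sends messages both ways
            inbox[t].append(one_hop_message(state[h], r, head_to_tail=True))
            inbox[h].append(one_hop_message(state[t], r, head_to_tail=False))
        state = [agg(torch.cat([s, torch.stack(m).sum(0)]))
                 for s, m in zip(state, inbox)]
    return state[2]  # predicted embedding of the free variable y

# Rank candidate answers by similarity to the predicted embedding.
scores = lmpnn_forward() @ ent_emb.weight.T
print(scores.topk(3).indices)  # top-3 candidate entities for this toy query
```

In the setting the abstract describes, a pretrained KG embedding model would supply the one-hop messages while only the aggregation layers are trained, which is what decouples the KG embeddings from the learned reasoning component.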