Learning Adaptive Node Embeddings Across Graphs
Published in: IEEE Transactions on Knowledge and Data Engineering, 2023-06, Vol. 35 (6), pp. 6028-6042
Main authors:
Format: Article
Language: English
Subjects:
Online access: Order full text
Abstract: Recently, learning embeddings of nodes in graphs has attracted increasing research attention. There are two main kinds of graph embedding methods: transductive and inductive. The former directly optimizes the embedding vectors, while the latter learns a mapping function from the given nodes and features to embeddings. However, little work has focused on applying a model learned on one graph to another, a pervasive idea in Computer Vision and Natural Language Processing. Although some graph neural networks (GNNs) share a similar motivation, none of them considers both the structure bias and the feature bias between graphs. In this paper, we present a novel graph embedding problem called the Adaptive Task (AT) and propose a unified framework for it, which introduces two types of alignment to learn adaptive node embeddings across graphs. Based on this framework, a novel Graph Adaptive Embedding network (GraphAE) is designed to address the adaptive task. Furthermore, we extend GraphAE to a multi-graph version to handle a more complex adaptive setting. Extensive experimental results demonstrate that our model significantly outperforms state-of-the-art methods, and also show that our framework yields substantial improvements over a number of existing GNNs.
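To make the transductive/inductive distinction and the idea of aligning embeddings across graphs concrete, here is a minimal illustrative sketch in PyTorch. It is not the paper's GraphAE: the class and function names (TransductiveEmbedding, InductiveEncoder, embedding_alignment_loss) are assumptions chosen for illustration, and the alignment term is a generic mean-discrepancy penalty rather than the two alignment types proposed in the paper.

```python
import torch
import torch.nn as nn


class TransductiveEmbedding(nn.Module):
    """Transductive: the embedding table itself is the learned parameter set,
    so it is tied to one fixed graph and cannot be applied to unseen nodes."""

    def __init__(self, num_nodes: int, dim: int):
        super().__init__()
        self.emb = nn.Embedding(num_nodes, dim)

    def forward(self, node_ids: torch.Tensor) -> torch.Tensor:
        return self.emb(node_ids)


class InductiveEncoder(nn.Module):
    """Inductive: a mapping from node features (and neighborhoods) to embeddings,
    so the same weights can be reused on a different graph."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin_self = nn.Linear(in_dim, out_dim)
        self.lin_neigh = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Mean-aggregate neighbor features, then combine with the node's own features.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        neigh = (adj @ x) / deg
        return torch.relu(self.lin_self(x) + self.lin_neigh(neigh))


def embedding_alignment_loss(z_src: torch.Tensor, z_tgt: torch.Tensor) -> torch.Tensor:
    """Toy alignment penalty: pull the mean embeddings of a source graph and a
    target graph together (a stand-in for the paper's alignment terms)."""
    return (z_src.mean(dim=0) - z_tgt.mean(dim=0)).pow(2).sum()


if __name__ == "__main__":
    # Two small random graphs that share the same feature dimensionality.
    x_src, adj_src = torch.randn(50, 16), torch.randint(0, 2, (50, 50)).float()
    x_tgt, adj_tgt = torch.randn(30, 16), torch.randint(0, 2, (30, 30)).float()

    encoder = InductiveEncoder(in_dim=16, out_dim=32)
    z_src, z_tgt = encoder(x_src, adj_src), encoder(x_tgt, adj_tgt)
    print(embedding_alignment_loss(z_src, z_tgt))
```

The contrast is the point: a transductive embedding table would have to be re-optimized from scratch for the target graph, whereas an inductive encoder plus some alignment between the two embedding distributions is the kind of setup the adaptive task described in the abstract builds on.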
ISSN: 1041-4347, 1558-2191
DOI: 10.1109/TKDE.2022.3160211