M-Graphormer: Multi-Channel Graph Transformer for Node Representation Learning
Published in: IEEE Transactions on Big Data, 2024-10, pp. 1-13
Main authors: , , , ,
Format: Article
Language: English
Abstract: In recent years, Graph Transformers have demonstrated superiority on various graph-level tasks by enabling global interactions among nodes. On node-level tasks, however, existing Graph Transformers do not perform as well as expected. In practice, a node in a real-world graph does not necessarily relate to every other node, so indiscriminate global interaction dilutes node features. This raises a fundamental question: should we partition out an appropriate interaction channel based on graph structure, so that noisy and irrelevant information is filtered and every node can aggregate information in its optimal channel? We first perform a series of experiments on synthetic graphs with varying homophily ratios and observe, surprisingly, that different graph structures indeed require distinct optimal interaction channels. This leads us to ask whether we can design a partitioning rule that ensures each node interacts only with relevant and valuable targets. To address this challenge, we propose a novel Graph Transformer named the Multi-channel Graphormer. We evaluate the model on six network datasets with different homophily ratios for node classification, and conduct comprehensive experiments on two real-world datasets for recommendation. Experimental results show that the Multi-channel Graphormer surpasses state-of-the-art baselines.
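The abstract's two key ingredients, the homophily ratio used to characterize graph structure and the restriction of attention to a structure-dependent interaction channel, can be made concrete with a short sketch. The code below is a minimal, hypothetical illustration under our own assumptions, not the authors' released implementation: `edge_homophily` computes the standard edge homophily ratio (the fraction of edges joining same-class nodes), and `channel_masked_attention` shows one attention channel whose targets are filtered by a boolean mask, e.g., a 1-hop "local" channel versus an unrestricted "global" channel. All function names and the two-channel setup are illustrative.

```python
# Hypothetical sketch of the two ideas named in the abstract; not the
# paper's actual model. Assumes PyTorch.
import torch
import torch.nn.functional as F


def edge_homophily(edge_index: torch.Tensor, labels: torch.Tensor) -> float:
    """Fraction of edges whose endpoints share a label (standard definition)."""
    src, dst = edge_index  # edge_index has shape [2, num_edges]
    return (labels[src] == labels[dst]).float().mean().item()


def channel_masked_attention(x: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """One self-attention channel: node i may only attend to nodes j with
    mask[i, j] == True, so irrelevant targets are filtered out."""
    d = x.size(-1)
    scores = (x @ x.transpose(-2, -1)) / d ** 0.5      # [N, N] pairwise scores
    scores = scores.masked_fill(~mask, float("-inf"))  # block out-of-channel targets
    return F.softmax(scores, dim=-1) @ x               # aggregate within the channel


# Toy usage: a 4-node graph with two channels (1-hop local vs. fully global).
edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])
labels = torch.tensor([0, 0, 1, 0])
print(f"edge homophily ratio: {edge_homophily(edge_index, labels):.2f}")  # 0.50

x = torch.randn(4, 8)
adj = torch.zeros(4, 4, dtype=torch.bool)
adj[edge_index[0], edge_index[1]] = True
local = adj | torch.eye(4, dtype=torch.bool)    # 1-hop channel (self-loops keep softmax valid)
global_ = torch.ones(4, 4, dtype=torch.bool)    # unrestricted channel
out = torch.cat([channel_masked_attention(x, m) for m in (local, global_)], dim=-1)
```

Concatenating the per-channel outputs is only one plausible way to combine channels; the point of the sketch is that a structure-derived mask per channel lets each node aggregate within a filtered interaction scope rather than over the whole graph.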
ISSN: 2332-7790, 2372-2096
DOI: 10.1109/TBDATA.2024.3489418