UMGCN: Updating multi-graph for graph convolutional networks


Saved in:
Bibliographic Details
Published in: Computers & Electrical Engineering, 2025-04, Vol. 123, p. 109957, Article 109957
Main authors: Zhu, Guoquan; Liu, Keyu; Yang, Xibei; Guo, Qihang
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Graph fusion has delivered impressive performance in recent research on graph convolutional networks. It essentially leverages multiple graphs sharing a common node set to learn representations. A widely used scheme is to induce node representations from topology and feature graphs simultaneously. Nevertheless, this scheme may face two challenges: (1) the multi-order information of the input graphs used in existing graph fusion methods is insufficient; (2) existing methods fail to adaptively extract node features via multi-order graphs. Therefore, we propose UMGCN, i.e., Updating Multi-graph for Graph Convolutional Networks, which can adaptively learn representations by renewing multiple graphs. Technically, UMGCN introduces multi-order graphs related to the topology and feature graphs to capture multi-order information, extracting rich knowledge from distant but informative nodes. In addition, UMGCN implements an updating module comprising multi-order adaptive graphs that update and self-optimize graph structures progressively. Finally, UMGCN fuses all the representations learned from the above graphs for downstream tasks. Extensive experiments on seven benchmark datasets validate the effectiveness of UMGCN on semi-supervised node classification compared with several state-of-the-art methods.
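The abstract's core ideas, propagating features over multiple orders of both a topology graph and a feature graph and then fusing the resulting representations, can be illustrated with a minimal sketch. This is not the authors' UMGCN implementation: the kNN feature-graph construction, the choice of propagation orders, and the simple mean fusion are all illustrative assumptions standing in for the paper's adaptive updating and fusion modules.

```python
import numpy as np

def normalize_adj(A):
    # Symmetrically normalize adjacency with self-loops: D^{-1/2}(A+I)D^{-1/2},
    # the standard GCN propagation matrix.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return d_inv_sqrt @ A_hat @ d_inv_sqrt

def knn_feature_graph(X, k=1):
    # A common way to build the "feature graph": connect each node to its
    # k nearest neighbors under cosine similarity (an assumption here).
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T
    np.fill_diagonal(S, -np.inf)          # exclude self-matches
    idx = np.argsort(-S, axis=1)[:, :k]   # top-k most similar neighbors
    A = np.zeros_like(S)
    for i, js in enumerate(idx):
        A[i, js] = 1.0
    return np.maximum(A, A.T)             # symmetrize

def multi_order_propagate(A_norm, X, orders=(1, 2)):
    # Multi-order information: A^k X aggregates features from nodes up to
    # k hops away, reaching "distant but informative" nodes.
    outs, H = [], X
    for k in range(1, max(orders) + 1):
        H = A_norm @ H
        if k in orders:
            outs.append(H)
    return outs

# Toy example: 4 nodes on a path (topology graph) plus a feature graph.
A_topo = np.array([[0, 1, 0, 0],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [0, 0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.1, 0.9], [0.0, 1.0]])
A_feat = knn_feature_graph(X, k=1)

# Collect representations from both views at orders 1 and 2,
# then fuse with a simple average (UMGCN fuses adaptively instead).
reps = (multi_order_propagate(normalize_adj(A_topo), X) +
        multi_order_propagate(normalize_adj(A_feat), X))
Z = np.mean(reps, axis=0)
```

Each entry of `reps` is one view-and-order representation of the same node set; a learned, adaptive fusion over such views is what the paper's updating module provides in place of the plain mean used here.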
ISSN:0045-7906
DOI:10.1016/j.compeleceng.2024.109957