NAGphormer+: A Tokenized Graph Transformer With Neighborhood Augmentation for Node Classification in Large Graphs
Published in: IEEE Transactions on Big Data, 2024-12, pp. 1-14
Format: Article
Language: English
Online access: Order full text
Abstract: Graph Transformers, an emerging architecture for graph representation learning, suffer from quadratic complexity and can handle only graphs with at most thousands of nodes. To address this, we propose the Neighborhood Aggregation Graph Transformer (NAGphormer), which treats each node as a sequence of tokens constructed by our proposed Hop2Token module. For each node, Hop2Token aggregates the neighborhood features from different hops into distinct representations, producing a sequence of token vectors as one input. In this way, NAGphormer can be trained in a mini-batch manner and thus scales to large graphs with millions of nodes. To further enhance the model's generalization, we propose NAGphormer+, an extension of NAGphormer with a novel data augmentation method called Neighborhood Augmentation (NrAug). Based on the output of Hop2Token, NrAug simultaneously augments the neighborhood features from both global and local views. In this way, NAGphormer+ can fully exploit the neighborhood information of multiple nodes, undergoing more comprehensive training and improving the model's generalization capability. Extensive experiments on benchmark datasets from small to large demonstrate the superiority of NAGphormer+ over existing graph Transformers and mainstream GNNs, as well as the original NAGphormer.
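The abstract's description of Hop2Token maps naturally onto a short sketch: each node's token sequence is its own feature vector followed by one aggregated feature vector per hop. Below is a minimal, illustrative Python version, assuming propagation by a GCN-style symmetrically normalized adjacency; the function name `hop2token` and its parameters are hypothetical stand-ins, not the paper's actual API.

```python
import numpy as np

def hop2token(adj: np.ndarray, feats: np.ndarray, num_hops: int) -> np.ndarray:
    """Return token sequences of shape (num_nodes, num_hops + 1, feat_dim)."""
    n = adj.shape[0]
    # Symmetrically normalized adjacency with self-loops (a GCN-style
    # assumption; the paper's exact propagation operator may differ).
    a_hat = adj + np.eye(n)
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

    tokens = [feats]            # hop 0: the node's own features
    h = feats
    for _ in range(num_hops):
        h = a_norm @ h          # aggregate one hop further out
        tokens.append(h)
    # Each node now owns a sequence of (num_hops + 1) token vectors,
    # which is what lets the Transformer train in node-wise mini-batches.
    return np.stack(tokens, axis=1)

# Tiny usage example: a 4-node path graph with 2-D features.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
X = np.random.rand(4, 2)
print(hop2token(A, X, num_hops=3).shape)  # (4, 4, 2)
```

For NrAug, the abstract only says that the Hop2Token outputs are augmented from a global and a local view. One plausible reading, shown purely as an illustrative assumption (the recipes below are not taken from the paper), is a mixup-style blend of token sequences across nodes for the global view and random masking of a node's own hop tokens for the local view:

```python
import numpy as np

def nraug(tokens: np.ndarray, rng: np.random.Generator,
          mix_lam: float = 0.8, mask_prob: float = 0.2) -> np.ndarray:
    """tokens: (num_nodes, num_tokens, feat_dim), e.g. Hop2Token output."""
    n = tokens.shape[0]
    # Global view (assumed): blend each sequence with a randomly paired node's.
    perm = rng.permutation(n)
    mixed = mix_lam * tokens + (1.0 - mix_lam) * tokens[perm]
    # Local view (assumed): zero out random hop tokens within each sequence.
    keep = rng.random(tokens.shape[:2]) >= mask_prob
    return mixed * keep[:, :, None]

tokens = np.random.rand(4, 4, 2)  # 4 nodes, 4 hop tokens, 2 features
rng = np.random.default_rng(0)
print(nraug(tokens, rng).shape)   # (4, 4, 2): same shape as the input
```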
ISSN: 2332-7790, 2372-2096
DOI: 10.1109/TBDATA.2024.3524081