LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation
Main authors: | , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Recent works have demonstrated the benefits of capturing long-distance dependencies in graphs with deeper graph neural networks (GNNs). However, deeper GNNs suffer from a long-standing scalability challenge due to the neighborhood explosion problem in large-scale graphs. In this work, we propose to capture long-distance dependencies in graphs with shallower models instead of deeper ones, which leads to a much more efficient model, LazyGNN, for graph representation learning. Moreover, we demonstrate that LazyGNN is compatible with existing scalable approaches (such as sampling methods) for further acceleration through the development of mini-batch LazyGNN. Comprehensive experiments demonstrate its superior prediction performance and scalability on large-scale benchmarks. The implementation of LazyGNN is available at https://github.com/RXPHD/Lazy_GNN. |
DOI: | 10.48550/arxiv.2302.01503 |
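
The abstract's core idea (a shallow model that captures long-range dependencies by lazily reusing propagation results across training iterations) can be sketched in a few lines. The PyTorch sketch below is an illustrative assumption based only on the abstract, not the authors' implementation (see the repository above for that); the names `LazyProp`, `num_hops`, and `alpha` are hypothetical.

```python
# Hypothetical sketch of "lazy propagation": a shallow propagation module that
# warm-starts from features cached in the previous training iteration, so
# long-range information accumulates across iterations without a deep stack
# of layers. Names and hyperparameters are illustrative, not the paper's API.
import torch
import torch.nn as nn


class LazyProp(nn.Module):
    def __init__(self, num_hops: int = 2, alpha: float = 0.9):
        super().__init__()
        self.num_hops = num_hops  # only a few hops per forward pass (shallow)
        self.alpha = alpha        # mixing weight for the cached state
        self.cached = None        # propagation state reused lazily across iterations

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # Warm-start from last iteration's propagated features, if available.
        if self.cached is None:
            h = x
        else:
            h = self.alpha * self.cached + (1.0 - self.alpha) * x
        for _ in range(self.num_hops):
            h = adj @ h           # one propagation hop over the graph
        self.cached = h.detach()  # cache without tracking gradients
        return h


# Smoke test on a toy graph: repeated calls reuse the cached state, so the
# effective receptive field grows across iterations even with num_hops=2.
x = torch.randn(5, 8)   # 5 nodes, 8 features
adj = torch.eye(5)      # placeholder for a normalized adjacency matrix
prop = LazyProp()
h1 = prop(x, adj)       # first call: no cache yet
h2 = prop(x, adj)       # second call: warm-started from h1
```

The design point this illustrates: because the cached state from earlier iterations already encodes information from farther hops, each iteration needs only a few propagation steps, avoiding the neighborhood explosion that a deep layer stack incurs on large graphs.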