Infinite Width Graph Neural Networks for Node Regression/Classification

Format: Article
Language: English
Online access: Order full text

Abstract: This work analyzes Graph Neural Networks, a generalization of fully-connected deep neural networks to graph-structured data, as their width, that is, the number of units in each fully-connected layer, tends to infinity. Infinite-width neural networks connect deep learning to Gaussian processes and kernels, two machine learning frameworks with long traditions and extensive theoretical foundations. Gaussian processes and kernels have far fewer hyperparameters than neural networks and can be used for uncertainty estimation, making them more user-friendly in applications. This work extends the growing body of research connecting Gaussian processes and kernels to neural networks. The kernel and Gaussian process closed forms are derived for a variety of architectures, namely the standard Graph Neural Network, the Graph Neural Network with Skip-Concatenate connections, and the Graph Attention Neural Network. All architectures are evaluated on a variety of datasets on the task of transductive node regression and classification. Additionally, a spectral sparsification method known as Effective Resistance is used to improve runtime and memory requirements. Extending the setting to inductive graph learning tasks (graph regression/classification) is straightforward and is briefly discussed in Section 3.5.

DOI: 10.48550/arxiv.2310.08176
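
The closed forms mentioned in the abstract follow the usual infinite-width recipe: a covariance matrix over nodes is propagated through aggregation, linear, and nonlinearity steps. The sketch below is a minimal illustration for a standard GNN with mean-style aggregation and ReLU activations, using the well-known degree-1 arc-cosine kernel expectation (Cho & Saul, 2009); the function names, the symmetric normalization, and the variance scaling are illustrative assumptions, not the paper's exact derivation, which also covers the Skip-Concatenate and attention variants.

```python
import numpy as np

def relu_expectation(K):
    """E[relu(u) relu(v)] for (u, v) ~ N(0, K): the degree-1 arc-cosine kernel."""
    d = np.sqrt(np.diag(K))
    outer = np.outer(d, d) + 1e-12            # guard against zero variance
    theta = np.arccos(np.clip(K / outer, -1.0, 1.0))
    return outer / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def gnn_gp_kernel(X, A_hat, depth=2, sigma_w=1.0):
    """Covariance of an infinite-width GNN: aggregate -> dense -> ReLU, `depth` times.

    X: (n, d) node features; A_hat: (n, n) normalized adjacency (assumed given).
    """
    K = X @ X.T / X.shape[1]                  # input covariance
    for _ in range(depth):
        K = A_hat @ K @ A_hat.T               # covariance after neighborhood aggregation
        K = sigma_w**2 * relu_expectation(K)  # covariance after an infinitely wide ReLU layer
    return K

# Toy usage: a 4-node path graph with self-loops and symmetric normalization.
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
A += np.eye(4)
D_inv_sqrt = np.diag(A.sum(1) ** -0.5)
K = gnn_gp_kernel(np.random.randn(4, 3), D_inv_sqrt @ A @ D_inv_sqrt)
```

The resulting matrix K can then be used as the covariance of a Gaussian process for transductive node regression, e.g. by conditioning on the labeled nodes in the standard GP-posterior fashion.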
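
The abstract also refers to spectral sparsification via effective resistance, in the spirit of Spielman and Srivastava: edges are sampled with probability proportional to their effective resistance, which approximately preserves the graph Laplacian while reducing the edge count. The following is a minimal dense sketch, assuming a connected, unweighted graph and computing resistances from the Laplacian pseudoinverse; a practical implementation would approximate the resistances instead of forming the pseudoinverse, and this is not presented as the paper's exact procedure.

```python
import numpy as np

def sparsify_by_effective_resistance(A, q, rng=None):
    """Sample q edges w.p. proportional to effective resistance; reweight to stay unbiased.

    A: (n, n) symmetric adjacency of a connected, unweighted graph.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = A.shape[0]
    L_pinv = np.linalg.pinv(np.diag(A.sum(1)) - A)  # Laplacian pseudoinverse
    u, v = np.triu_indices(n, k=1)
    mask = A[u, v] > 0                              # keep only existing edges
    u, v = u[mask], v[mask]
    # Effective resistance of edge (u, v): R_uv = L+_uu + L+_vv - 2 L+_uv.
    R = L_pinv[u, u] + L_pinv[v, v] - 2 * L_pinv[u, v]
    p = R / R.sum()                                 # sampling distribution over edges
    idx = rng.choice(len(u), size=q, replace=True, p=p)
    B = np.zeros_like(A, dtype=float)
    for i in idx:
        w = 1.0 / (q * p[i])                        # importance weight: E[L_B] = L
        B[u[i], v[i]] += w
        B[v[i], u[i]] += w
    return B
```

Running the kernel recursion on the sparsified adjacency rather than the full one is what yields the runtime and memory savings described in the abstract.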