Infinite Width Graph Neural Networks for Node Regression/ Classification

This work analyzes Graph Neural Networks, a generalization of fully-connected deep neural networks to graph-structured data, as their width, i.e. the number of nodes in each fully-connected layer, increases to infinity. Infinite-width neural networks connect deep learning to Gaussian processes and kernels, two machine learning frameworks with long traditions and extensive theoretical foundations. Gaussian processes and kernels have far fewer hyperparameters than neural networks and can be used for uncertainty estimation, making them more user-friendly in applications. This work extends the growing body of research connecting Gaussian processes and kernels to neural networks. Closed forms of the kernel and the Gaussian process are derived for a variety of architectures, namely the standard Graph Neural Network, the Graph Neural Network with Skip-Concatenate connections, and the Graph Attention Neural Network. All architectures are evaluated on a variety of datasets on the tasks of transductive node regression and classification. Additionally, a spectral sparsification method known as Effective Resistance is used to improve runtime and memory requirements. Extending the setting to inductive graph learning tasks (graph regression/classification) is straightforward and is briefly discussed in Section 3.5.
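The core recursion behind such infinite-width kernels is short enough to sketch. The NumPy snippet below is an illustrative reconstruction of the generic recipe for a ReLU message-passing layer (aggregate with a normalized adjacency, then apply the closed-form ReLU expectation), not the paper's exact closed forms; the normalization choice, depth, and the helper names gnn_gp_kernel and gp_posterior_mean are assumptions made here for illustration.

import numpy as np

def relu_expectation(K):
    # Arc-cosine kernel of degree 1: E[relu(u) relu(v)] for (u, v)
    # jointly Gaussian with covariance K (the standard NNGP ReLU step).
    d = np.sqrt(np.clip(np.diag(K), 1e-12, None))
    cos = np.clip(K / np.outer(d, d), -1.0, 1.0)
    theta = np.arccos(cos)
    return np.outer(d, d) * (np.sin(theta) + (np.pi - theta) * cos) / (2.0 * np.pi)

def gnn_gp_kernel(A, X, depth=2):
    # Covariance over nodes of an infinite-width GNN whose layers compute
    # H' = relu(S H W), with S the normalized adjacency with self-loops.
    n = A.shape[0]
    S = A + np.eye(n)
    deg = S.sum(axis=1)
    S = S / np.sqrt(np.outer(deg, deg))        # symmetric normalization (an assumption)
    K = S @ (X @ X.T / X.shape[1]) @ S.T       # input layer: aggregate, then linear
    for _ in range(depth - 1):
        K = S @ relu_expectation(K) @ S.T      # ReLU in expectation, then aggregate
    return K

def gp_posterior_mean(K, train_idx, y_train, noise=1e-2):
    # Transductive node regression: GP posterior mean at every node,
    # given labels on train_idx, under the GNN-GP prior K.
    K_LL = K[np.ix_(train_idx, train_idx)]
    return K[:, train_idx] @ np.linalg.solve(
        K_LL + noise * np.eye(len(train_idx)), y_train)

With K = gnn_gp_kernel(A, X), transductive node regression reduces to a single linear solve on the labeled block, which is where the kernel/GP formulation pays off: there is no training loop, and predictive variances are available from the same matrices.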
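On the Effective Resistance point: the abstract only names the method, so the sketch below shows the textbook Spielman–Srivastava sampler that the term usually refers to; whether the paper uses exactly this variant, and with what sample count q, is an assumption here.

def effective_resistance_sparsify(A, q, seed=0):
    # Spectral sparsification: sample q edges with replacement, with
    # probability proportional to weight x effective resistance, and
    # reweight the survivors so the Laplacian is preserved in expectation.
    # Dense pseudoinverse for clarity; a scalable version would approximate
    # resistances with Johnson-Lindenstrauss sketches instead.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    L = np.diag(A.sum(axis=1)) - A             # graph Laplacian
    Lp = np.linalg.pinv(L)                     # Moore-Penrose pseudoinverse
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if A[i, j] > 0]
    w = np.array([A[i, j] for i, j in edges])
    r = np.array([Lp[i, i] + Lp[j, j] - 2.0 * Lp[i, j] for i, j in edges])
    p = w * r / np.sum(w * r)                  # sampling distribution over edges
    counts = rng.multinomial(q, p)
    H = np.zeros_like(A, dtype=float)
    for (i, j), c, pe, we in zip(edges, counts, p, w):
        if c > 0:
            H[i, j] = H[j, i] = c * we / (q * pe)
    return H                                    # sparser weighted adjacency

The practical payoff in this setting would be replacing A with the sparsified H before running the kernel recursion above, cutting the cost of the repeated aggregation products while approximately preserving the graph's spectral structure.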

Bibliographic Details

Main Author: Cobanoglu, Yunus
Format: Article
Language: English
Subjects: Computer Science - Learning
Online Access: Full text at https://arxiv.org/abs/2310.08176
DOI: 10.48550/arxiv.2310.08176
Published: 2023-10-12
Source: arXiv.org