Multigraph Message Passing with Bi-Directional Multi-Edge Aggregations

Graph Neural Networks (GNNs) have seen significant advances in recent years, yet their application to multigraphs, where parallel edges exist between the same pair of nodes, remains under-explored. Standard GNNs, designed for simple graphs, compute node representations by combining all connected edges at once, without distinguishing between edges from different neighbors. A few GNN architectures have been proposed specifically for multigraphs, yet they perform only node-level aggregation in their message passing layers, which limits their expressive power. Furthermore, these approaches either lack permutation equivariance when a strict total edge ordering is absent, or fail to preserve the topological structure of the multigraph. To address these shortcomings, we propose MEGA-GNN, a unified framework for message passing on multigraphs that can effectively perform diverse graph learning tasks. Our approach introduces a two-stage aggregation process in the message passing layers: first, parallel edges are aggregated, followed by a node-level aggregation of messages from distinct neighbors. We show that MEGA-GNN is not only permutation equivariant but also universal given a strict total ordering on the edges. Experiments show that MEGA-GNN outperforms state-of-the-art solutions by up to 13% on Anti-Money Laundering datasets and matches their accuracy on real-world phishing classification datasets in terms of minority-class F1 score.
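The two-stage aggregation described in the abstract can be sketched in a few lines of PyTorch. The snippet below only illustrates the idea (aggregate parallel edges per neighbor pair first, then aggregate over distinct neighbors); it is not the authors' MEGA-GNN implementation. The function name, the precomputed edge_group indices, the message definition, and the use of sum aggregation in both stages are assumptions made for the example.

    import torch

    def two_stage_aggregation(x, edge_index, edge_attr, edge_group, num_groups):
        # x:          [num_nodes, d]  node features
        # edge_index: [2, num_edges]  (src, dst) node indices; parallel edges repeat the same pair
        # edge_attr:  [num_edges, d]  edge features
        # edge_group: [num_edges]     id of the (src, dst) pair each edge belongs to
        # num_groups: number of distinct (src, dst) pairs
        src, dst = edge_index
        d = x.size(1)

        # Per-edge messages (a simple choice: source node feature plus edge feature).
        msg = x[src] + edge_attr

        # Stage 1: aggregate parallel edges -> one message per distinct (src, dst) pair.
        # Sum aggregation is used here for brevity; any permutation-invariant reducer would do.
        group_msg = torch.zeros(num_groups, d, dtype=msg.dtype).index_add_(0, edge_group, msg)

        # Destination node of each group (every edge in a group shares the same dst).
        group_dst = torch.zeros(num_groups, dtype=torch.long).scatter_(0, edge_group, dst)

        # Stage 2: node-level aggregation of the per-neighbor messages.
        out = torch.zeros_like(x).index_add_(0, group_dst, group_msg)
        return out

In practice the hypothetical edge_group indices can be derived from edge_index, for example with torch.unique(edge_index.t(), dim=0, return_inverse=True), so that all parallel edges between the same (src, dst) pair share one group id.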

Bibliographic Details
Published in: arXiv.org, 2024-12
Main authors: Bilgi, H Çağrı; Chen, Lydia Y; Atasu, Kubilay
Format: Article
Language: English
Identifier: EISSN 2331-8422
Publisher: Cornell University Library, arXiv.org (Ithaca)
Subjects: Cognitive tasks; Graph neural networks; Graph theory; Graphical representations; Machine learning; Message passing; Nodes; Permutations
Online access: Full text