GHNN: Graph Harmonic Neural Networks for semi-supervised graph-level classification

Bibliographic details

Published in: Neural networks, 2022-07, Vol. 151, p. 70-79
Authors: Ju, Wei; Luo, Xiao; Ma, Zeyu; Yang, Junwei; Deng, Minghua; Zhang, Ming
Format: Article
Language: English
Keywords: Graph classification; Graph kernels; Graph neural networks; Semi-supervised learning
DOI: 10.1016/j.neunet.2022.03.018
ISSN: 0893-6080
EISSN: 1879-2782
PMID: 35398673
Publisher: Elsevier Ltd
Source: Elsevier ScienceDirect Journals Complete - AutoHoldings
Record ID: cdi_proquest_miscellaneous_2649254980
Online access: Full text
Detailed description

Graph classification aims to predict properties of whole graphs and has attracted growing attention in the graph learning community. The problem has been studied extensively in the literature on both graph convolutional networks and graph kernels: graph convolutional networks learn effective node representations via message passing, mining graph topology implicitly, whereas graph kernels exploit graph structural knowledge explicitly for classification. Because labeled data are scarce in real-world applications, semi-supervised algorithms are desirable for this problem. In this paper, we propose the Graph Harmonic Neural Network (GHNN), which combines the advantages of both worlds to fully leverage unlabeled data and thus overcome label scarcity in semi-supervised scenarios. Specifically, GHNN consists of a graph convolutional network (GCN) module and a graph kernel network (GKN) module that explore graph topology from complementary perspectives. To exploit the unlabeled data, we develop a novel harmonic contrastive loss and a harmonic consistency loss that harmonize the training of the two modules by giving priority to high-quality unlabeled data, thereby reconciling the predictions of both. In this manner, the two modules mutually enhance each other and sufficiently explore the graph topology of both labeled and unlabeled data. Extensive experiments on a variety of benchmarks demonstrate the effectiveness of our approach over competitive baselines.
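
The description above conveys the core training idea: a GCN branch and a graph-kernel branch are trained jointly, and on unlabeled graphs their predictions are pulled toward agreement, with confident ("high-quality") graphs weighted more heavily. Below is a minimal sketch of that consistency term only, written in plain PyTorch; it is not the authors' GHNN implementation. The module names (GCNBranch, KernelBranch), the dense-adjacency message passing, the confidence threshold, and the symmetric-KL form of the loss are all illustrative assumptions, and the harmonic contrastive loss and the supervised loss on labeled graphs are omitted.

```python
# Hypothetical sketch of the dual-branch consistency idea described in the
# abstract above; NOT the authors' GHNN code. Module design, confidence
# threshold, and the symmetric-KL loss are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNBranch(nn.Module):
    """Toy message-passing branch: dense normalized adjacency, mean-pool readout."""

    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, hid_dim)
        self.cls = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):                  # x: [N, F], adj: [N, N]
        h = F.relu(adj @ self.lin1(x))          # first propagation step
        h = F.relu(adj @ self.lin2(h))          # second propagation step
        g = h.mean(dim=0)                       # readout: mean over nodes
        return self.cls(g)                      # graph-level logits


class KernelBranch(nn.Module):
    """Stand-in for the graph kernel network: classifies a precomputed
    structural/kernel feature vector for the whole graph."""

    def __init__(self, feat_dim, hid_dim, n_classes):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, hid_dim), nn.ReLU(), nn.Linear(hid_dim, n_classes)
        )

    def forward(self, kernel_feat):             # kernel_feat: [feat_dim]
        return self.mlp(kernel_feat)


def consistency_loss(logits_a, logits_b, threshold=0.8):
    """Pull the two branches' predictions together on an unlabeled graph,
    but only when at least one branch is confident -- a guess at the
    'priority to high-quality unlabeled data' idea in the abstract."""
    p_a = F.softmax(logits_a, dim=-1)
    p_b = F.softmax(logits_b, dim=-1)
    conf = torch.maximum(p_a.max(), p_b.max())  # confidence of the better branch
    if conf < threshold:                        # low-quality graph: contribute nothing
        return logits_a.new_zeros(())
    kl_ab = F.kl_div(p_a.log(), p_b, reduction="sum")
    kl_ba = F.kl_div(p_b.log(), p_a, reduction="sum")
    return conf * 0.5 * (kl_ab + kl_ba)         # confidence-weighted symmetric KL


if __name__ == "__main__":
    torch.manual_seed(0)
    n_nodes, in_dim, kernel_dim, n_classes = 6, 8, 16, 2
    gcn = GCNBranch(in_dim, 32, n_classes)
    gkn = KernelBranch(kernel_dim, 32, n_classes)
    x = torch.randn(n_nodes, in_dim)            # node features of one unlabeled graph
    adj = torch.eye(n_nodes)                    # placeholder normalized adjacency
    kernel_feat = torch.randn(kernel_dim)       # placeholder kernel features
    loss = consistency_loss(gcn(x, adj), gkn(kernel_feat))
    print(float(loss))
```

In the paper's full objective, a term of this kind sits alongside the harmonic contrastive loss and the supervised loss on labeled graphs, so the two modules reinforce each other rather than drifting apart.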