Towards Long-Tailed Recognition for Graph Classification via Collaborative Experts
Graph classification, which aims to learn graph-level representations for effective class assignment, has achieved outstanding results, but this success relies heavily on high-quality datasets with balanced class distributions. In practice, most real-world graph data naturally follows a long-tailed form, where the head classes contain far more samples than the tail classes. Graph-level classification over long-tailed data is therefore essential to study, yet it remains largely unexplored. Moreover, most existing long-tailed learning methods in computer vision fail to jointly optimize representation learning and classifier training, and they neglect the mining of hard-to-classify classes. Directly applying these methods to graphs may lead to sub-optimal performance, since a model trained on graphs is more sensitive to the long-tailed distribution due to complex topological characteristics. Hence, in this paper, we propose a novel long-tailed graph-level classification framework via Collaborative Multi-expert Learning (CoMe) to tackle the problem. To balance the contributions of head and tail classes, we first develop balanced contrastive learning from the perspective of representation learning, and then design individual-expert classifier training based on hard class mining. In addition, we perform gated fusion and disentangled knowledge distillation among the multiple experts to promote collaboration within the multi-expert framework. Comprehensive experiments on seven widely used benchmark datasets demonstrate the superiority of our method CoMe over state-of-the-art baselines.
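The abstract names several building blocks: balanced contrastive representation learning, individual expert classifiers trained with hard class mining, and gated fusion plus disentangled knowledge distillation across experts. Below is a minimal PyTorch sketch of just two of those ideas, a class-balanced supervised contrastive loss and gated fusion over multiple expert heads. It is not the authors' CoMe implementation: the encoder here is a toy MLP over pre-pooled graph features (a real graph model would use a GNN with graph pooling), and all names and hyperparameters (`MultiExpertClassifier`, `num_experts=3`, the temperature) are hypothetical stand-ins. Hard class mining and disentangled distillation are omitted.

```python
# Hypothetical sketch only: a class-balanced supervised contrastive loss and
# gated fusion over multiple expert heads. Not the authors' CoMe code; a real
# graph classifier would use a GNN encoder with graph pooling instead of the
# toy MLP encoder used here.
import torch
import torch.nn as nn
import torch.nn.functional as F


def class_balanced_supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss averaged per class, so every class in the
    batch contributes equally to the objective regardless of its size."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature                       # pairwise similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float("-inf"))     # drop self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    # per-anchor loss: mean negative log-probability over that anchor's positives
    per_anchor = -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts)
    valid = pos_mask.any(dim=1)                         # anchors with >= 1 positive
    per_class = [per_anchor[valid & (labels == c)].mean()
                 for c in labels[valid].unique()]
    return torch.stack(per_class).mean()                # classes weighted equally


class MultiExpertClassifier(nn.Module):
    """Shared encoder, several expert heads, and a gate that fuses their logits."""

    def __init__(self, in_dim, hidden_dim, num_classes, num_experts=3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU(),
                                     nn.Linear(hidden_dim, hidden_dim))
        self.experts = nn.ModuleList(
            [nn.Linear(hidden_dim, num_classes) for _ in range(num_experts)])
        self.gate = nn.Linear(hidden_dim, num_experts)

    def forward(self, x):
        h = self.encoder(x)                              # graph-level embedding
        expert_logits = torch.stack([e(h) for e in self.experts], dim=1)
        weights = F.softmax(self.gate(h), dim=1)         # (batch, num_experts)
        fused = (weights.unsqueeze(-1) * expert_logits).sum(dim=1)
        return h, expert_logits, fused


if __name__ == "__main__":
    torch.manual_seed(0)
    x = torch.randn(32, 64)                # stand-in for pooled graph features
    y = torch.randint(0, 5, (32,))
    model = MultiExpertClassifier(in_dim=64, hidden_dim=128, num_classes=5)
    h, per_expert, fused = model(x)
    loss = F.cross_entropy(fused, y) + class_balanced_supcon_loss(h, y)
    loss.backward()
    print(f"toy loss: {loss.item():.4f}")
```

The per-class averaging in the loss is one common way to keep head classes from dominating a contrastive objective; the gate is the simplest mixture-of-experts style fusion and merely stands in for the paper's learned gated fusion. In the paper, the per-expert logits would additionally receive individual training signals based on hard class mining and be distilled across experts.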
Published in: | IEEE Transactions on Big Data, 2023-12, Vol. 9 (6), p. 1683-1696 |
---|---|
Main authors: | Yi, Si-Yu; Mao, Zhengyang; Ju, Wei; Zhou, Yong-Dao; Liu, Luchen; Luo, Xiao; Zhang, Ming |
Format: | Article |
Language: | eng |
Keywords: | Balanced contrastive learning; Balancing; class-imbalanced learning; Classification; Classifiers; Collaboration; Datasets; Distillation; Ensemble learning; Graphical representations; Graphs; hard class extraction; Machine learning; multi-expert learning; Optimization; Predictive models; Representation learning; Tail; Task analysis; Training |
ISSN: | 2332-7790 |
EISSN: | 2372-2096 |
DOI: | 10.1109/TBDATA.2023.3313029 |
Source: | IEEE Electronic Library (IEL) |
Online access: | Order full text |