Self-Cooperation Knowledge Distillation for Novel Class Discovery

Novel Class Discovery (NCD) aims to discover unknown and novel classes in an unlabeled set by leveraging knowledge already learned about known classes. Existing works focus on instance-level or class-level knowledge representation and build a shared representation space to achieve performance improvements. However, a long-neglected issue is the potential imbalanced number of samples from known and novel classes, pushing the model towards dominant classes. Therefore, these methods suffer from a challenging trade-off between reviewing known classes and discovering novel classes. Based on this observation, we propose a Self-Cooperation Knowledge Distillation (SCKD) method to utilize each training sample (whether known or novel, labeled or unlabeled) for both review and discovery. Specifically, the model's feature representations of known and novel classes are used to construct two disjoint representation spaces. Through spatial mutual information, we design a self-cooperation learning to encourage model learning from the two feature representation spaces from itself. Extensive experiments on six datasets demonstrate that our method can achieve significant performance improvements, achieving state-of-the-art performance.
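The abstract only sketches the method at a high level. Below is a minimal illustrative sketch (not the authors' SCKD implementation) of the general idea it describes: projecting a mixed batch of backbone features into two disjoint spaces, one for known and one for novel classes, and adding a loss that makes the two spaces learn from each other so every sample serves both review and discovery. The class name SelfCooperationLoss, the linear projection heads, the feature dimensions, and the use of a simple similarity-matrix agreement term in place of the paper's spatial mutual information are all assumptions made for illustration.

```python
# Minimal illustrative sketch of a self-cooperation style objective.
# NOT the authors' SCKD implementation: the projection heads, the
# similarity-matrix agreement term standing in for the paper's "spatial
# mutual information", and all hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfCooperationLoss(nn.Module):
    """Project backbone features into two disjoint spaces (known / novel)
    and encourage the pairwise sample-similarity structure of the two
    spaces to agree, so each sample contributes to both 'review' of known
    classes and 'discovery' of novel ones."""

    def __init__(self, feat_dim: int = 512, proj_dim: int = 128):
        super().__init__()
        self.known_head = nn.Linear(feat_dim, proj_dim)   # known-class space
        self.novel_head = nn.Linear(feat_dim, proj_dim)   # novel-class space

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, feat_dim) representations of a mixed batch
        # containing both labeled (known) and unlabeled (novel) samples.
        z_known = F.normalize(self.known_head(features), dim=1)
        z_novel = F.normalize(self.novel_head(features), dim=1)

        # Pairwise similarity matrices of the same batch in the two spaces.
        sim_known = z_known @ z_known.t()
        sim_novel = z_novel @ z_novel.t()

        # Cross-space agreement: each space is asked to reproduce the
        # relational structure of the other (an illustrative stand-in for
        # the mutual-information term described in the abstract).
        return (F.mse_loss(sim_known, sim_novel.detach())
                + F.mse_loss(sim_novel, sim_known.detach()))


if __name__ == "__main__":
    loss_fn = SelfCooperationLoss(feat_dim=512, proj_dim=128)
    feats = torch.randn(16, 512)           # dummy backbone features
    print(loss_fn(feats).item())           # scalar self-cooperation loss
```

In this sketch the two heads share the same backbone features, so the "teacher" and "student" are the model itself, which is what the self-cooperation framing in the abstract suggests; the paper's actual loss and spaces may differ.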


Bibliographic Details
Main Authors: Wang, Yuzheng; Chen, Zhaoyu; Yang, Dingkang; Sun, Yunquan; Qi, Lizhe
Format: Article
Language: English
Subjects: Computer Science - Computer Vision and Pattern Recognition
creator Wang, Yuzheng; Chen, Zhaoyu; Yang, Dingkang; Sun, Yunquan; Qi, Lizhe
description Novel Class Discovery (NCD) aims to discover unknown and novel classes in an unlabeled set by leveraging knowledge already learned about known classes. Existing works focus on instance-level or class-level knowledge representation and build a shared representation space to achieve performance improvements. However, a long-neglected issue is the potential imbalanced number of samples from known and novel classes, pushing the model towards dominant classes. Therefore, these methods suffer from a challenging trade-off between reviewing known classes and discovering novel classes. Based on this observation, we propose a Self-Cooperation Knowledge Distillation (SCKD) method to utilize each training sample (whether known or novel, labeled or unlabeled) for both review and discovery. Specifically, the model's feature representations of known and novel classes are used to construct two disjoint representation spaces. Through spatial mutual information, we design a self-cooperation learning to encourage model learning from the two feature representation spaces from itself. Extensive experiments on six datasets demonstrate that our method can achieve significant performance improvements, achieving state-of-the-art performance.
doi_str_mv 10.48550/arxiv.2407.01930
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2407.01930
language eng
recordid cdi_arxiv_primary_2407_01930
source arXiv.org
subjects Computer Science - Computer Vision and Pattern Recognition
title Self-Cooperation Knowledge Distillation for Novel Class Discovery
url https://arxiv.org/abs/2407.01930