Concurrent Subsidiary Supervision for Unsupervised Source-Free Domain Adaptation
The prime challenge in unsupervised domain adaptation (DA) is to mitigate the domain shift between the source and target domains. Prior DA works show that pretext tasks can be used to mitigate this domain shift by learning domain-invariant representations. However, in practice, we find that most existing pretext tasks are ineffective against other established techniques. Thus, we theoretically analyze how and when a subsidiary pretext task can be leveraged to assist the goal task of a given DA problem and develop objective subsidiary-task suitability criteria. Based on these criteria, we devise a novel process of sticker intervention and cast sticker classification as a supervised subsidiary DA problem concurrent to the goal-task unsupervised DA. Our approach not only improves goal-task adaptation performance but also facilitates privacy-oriented source-free DA, i.e., without concurrent source-target access. Experiments on the standard Office-31, Office-Home, DomainNet, and VisDA benchmarks demonstrate our superiority for both single-source and multi-source source-free DA. Our approach also complements existing non-source-free works, achieving leading performance.
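The abstract describes pasting synthetic "stickers" onto images (sticker intervention) and training a sticker-classification head as a supervised subsidiary task concurrently with the unsupervised goal task, without access to source data. The following is a minimal sketch of that idea; the sticker set, the two-head model, the entropy-based goal objective, and the loss weight `lam` are illustrative assumptions, not the authors' implementation.

```python
# Minimal PyTorch-style sketch of concurrent subsidiary supervision:
# paste a sticker onto each target image (sticker intervention) and train a
# sticker-classification head alongside the goal-task head on the shared
# backbone. All names and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_STICKERS = 4          # assumed number of distinct sticker classes
NUM_GOAL_CLASSES = 31     # e.g., Office-31 goal-task classes


def apply_sticker(images, stickers):
    """Paste a randomly chosen sticker patch onto each image; return sticker labels."""
    b = images.size(0)
    labels = torch.randint(NUM_STICKERS, (b,), device=images.device)
    ph, pw = stickers.shape[-2:]
    out = images.clone()
    out[:, :, :ph, :pw] = stickers[labels]   # top-left placement for simplicity
    return out, labels


class TwoHeadModel(nn.Module):
    """Shared backbone with a goal-task head and a subsidiary sticker head."""

    def __init__(self, backbone, feat_dim):
        super().__init__()
        self.backbone = backbone
        self.goal_head = nn.Linear(feat_dim, NUM_GOAL_CLASSES)
        self.sticker_head = nn.Linear(feat_dim, NUM_STICKERS)

    def forward(self, x):
        feats = self.backbone(x)
        return self.goal_head(feats), self.sticker_head(feats)


def target_adaptation_step(model, optimizer, target_images, stickers, lam=0.3):
    """One source-free adaptation step on an unlabeled target batch."""
    stickered, sticker_labels = apply_sticker(target_images, stickers)
    goal_logits, sticker_logits = model(stickered)
    # Supervised subsidiary loss: sticker labels are known by construction.
    sub_loss = F.cross_entropy(sticker_logits, sticker_labels)
    # Unsupervised goal-task objective; entropy minimization stands in for
    # whatever self-training criterion is used on the goal task.
    probs = goal_logits.softmax(dim=1)
    goal_loss = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    loss = goal_loss + lam * sub_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

The key property this sketch illustrates is that sticker labels are known by construction on unlabeled target data, so the subsidiary head receives full supervision even when the goal task does not, and no source images are needed during adaptation.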
Saved in:
Main authors: | Kundu, Jogendra Nath; Bhambri, Suvaansh; Kulkarni, Akshay; Sarkar, Hiran; Jampani, Varun; Babu, R. Venkatesh |
---|---|
Format: | Article |
Language: | eng |
Subjects: | Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning |
Online access: | Order full text |
creator | Kundu, Jogendra Nath; Bhambri, Suvaansh; Kulkarni, Akshay; Sarkar, Hiran; Jampani, Varun; Babu, R. Venkatesh |
description | The prime challenge in unsupervised domain adaptation (DA) is to mitigate the domain shift between the source and target domains. Prior DA works show that pretext tasks can be used to mitigate this domain shift by learning domain-invariant representations. However, in practice, we find that most existing pretext tasks are ineffective against other established techniques. Thus, we theoretically analyze how and when a subsidiary pretext task can be leveraged to assist the goal task of a given DA problem and develop objective subsidiary-task suitability criteria. Based on these criteria, we devise a novel process of sticker intervention and cast sticker classification as a supervised subsidiary DA problem concurrent to the goal-task unsupervised DA. Our approach not only improves goal-task adaptation performance but also facilitates privacy-oriented source-free DA, i.e., without concurrent source-target access. Experiments on the standard Office-31, Office-Home, DomainNet, and VisDA benchmarks demonstrate our superiority for both single-source and multi-source source-free DA. Our approach also complements existing non-source-free works, achieving leading performance. |
doi_str_mv | 10.48550/arxiv.2207.13247 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.2207.13247 |
language | eng |
recordid | cdi_arxiv_primary_2207_13247 |
source | arXiv.org |
subjects | Computer Science - Computer Vision and Pattern Recognition; Computer Science - Learning |
title | Concurrent Subsidiary Supervision for Unsupervised Source-Free Domain Adaptation |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-20T20%3A07%3A34IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Concurrent%20Subsidiary%20Supervision%20for%20Unsupervised%20Source-Free%20Domain%20Adaptation&rft.au=Kundu,%20Jogendra%20Nath&rft.date=2022-07-26&rft_id=info:doi/10.48550/arxiv.2207.13247&rft_dat=%3Carxiv_GOX%3E2207_13247%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true |