Class-rebalanced Wasserstein distance for multi-source domain adaptation

In machine learning, multi-source domain adaptation (MSDA) handles multiple datasets collected from different distributions by extracting domain-invariant knowledge. However, current studies mainly employ features and raw labels in the joint space to perform domain alignment, neglecting the intrinsic structure of the label distribution, which can harm adaptation performance. Therefore, to make better use of label information when aligning the joint feature-label distribution, we propose a rebalancing scheme, class-rebalanced Wasserstein distance (CRWD), for unsupervised MSDA under class-wise imbalance and data correlation. Built on the optimal transport for domain adaptation (OTDA) framework, CRWD mitigates the impact of the biased label structure by rectifying the Wasserstein mapping from source to target space. Technically, the class proportions are used to encourage transport between minor classes and principal components, which reweights the optimal transport plan and reinforces the Mahalanobis ground metric to better measure the differences among domains. In addition, the scheme measures both inter-domain and intra-source discrepancies to enhance adaptation. Extensive experiments on various benchmarks show that CRWD has competitive advantages.
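Two ingredients from the abstract lend themselves to a short illustration: reweighting the source marginal of an optimal transport plan by class proportions, and using a Mahalanobis ground metric between domains. The NumPy sketch below is not the paper's CRWD implementation; the equal-mass-per-class rebalancing rule, the plain Sinkhorn solver, and all function names are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's CRWD method) of a class-rebalanced
# source marginal for an optimal transport plan and a Mahalanobis ground metric.
import numpy as np


def mahalanobis_cost(Xs, Xt):
    """Pairwise squared Mahalanobis cost using the pooled inverse covariance."""
    pooled = np.vstack([Xs, Xt])
    cov = np.cov(pooled, rowvar=False) + 1e-6 * np.eye(pooled.shape[1])
    A = np.linalg.inv(cov)
    diff = Xs[:, None, :] - Xt[None, :, :]               # shape (ns, nt, d)
    return np.einsum("ijd,de,ije->ij", diff, A, diff)


def class_rebalanced_marginal(ys):
    """Assumed rebalancing rule: every source class receives the same total mass."""
    classes, counts = np.unique(ys, return_counts=True)
    count_of = dict(zip(classes.tolist(), counts.tolist()))
    weights = np.array([1.0 / (len(classes) * count_of[int(y)]) for y in ys])
    return weights / weights.sum()


def sinkhorn_plan(a, b, M, reg=0.05, n_iter=500):
    """Entropic-regularized OT plan between source marginal a and target marginal b."""
    K = np.exp(-M / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]


# Toy usage: an imbalanced synthetic source domain and an unlabeled target domain.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(60, 5))
ys = np.array([0] * 50 + [1] * 10)                        # 50 majority vs. 10 minority samples
Xt = rng.normal(loc=0.5, size=(40, 5))

M = mahalanobis_cost(Xs, Xt)
M = M / M.max()                                           # scale costs so exp(-M/reg) does not underflow
a = class_rebalanced_marginal(ys)                         # minority class keeps half of the source mass
b = np.full(Xt.shape[0], 1.0 / Xt.shape[0])
plan = sinkhorn_plan(a, b, M)
Xs_mapped = (plan / plan.sum(axis=1, keepdims=True)) @ Xt # barycentric mapping into the target domain
```

With the rebalanced marginal in place of uniform source weights, the transport plan must move as much total mass out of the ten-sample minority class as out of the fifty-sample majority class, which is the intuition behind rebalancing the Wasserstein coupling.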


Bibliographic Details
Published in: Applied intelligence (Dordrecht, Netherlands), 2023-04, Vol.53 (7), p.8024-8038
Main authors: Wang, Qi; Wang, Shengsheng; Wang, Bilin
Format: Article
Language: English
Subjects:
Online Access: Full text
doi_str_mv 10.1007/s10489-022-03810-y
format Article
eissn 1573-7497
publisher New York: Springer US
orcidid https://orcid.org/0000-0002-8503-8061
fulltext fulltext
identifier ISSN: 0924-669X
ispartof Applied intelligence (Dordrecht, Netherlands), 2023-04, Vol.53 (7), p.8024-8038
issn 0924-669X
1573-7497
language eng
recordid cdi_proquest_journals_2787065607
source SpringerLink Journals - AutoHoldings
subjects Adaptation
Artificial Intelligence
Computer Science
Data correlation
Datasets
Domains
Feature extraction
Generators
Knowledge
Machine learning
Machines
Manufacturing
Mechanical Engineering
Processes
Transportation planning
title Class-rebalanced wasserstein distance for multi-source domain adaptation
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-21T03%3A42%3A35IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Class-rebalanced%20wasserstein%20distance%20for%20multi-source%20domain%20adaptation&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Wang,%20Qi&rft.date=2023-04-01&rft.volume=53&rft.issue=7&rft.spage=8024&rft.epage=8038&rft.pages=8024-8038&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-022-03810-y&rft_dat=%3Cproquest_cross%3E2787065607%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2787065607&rft_id=info:pmid/&rfr_iscdi=true