Combating Data Imbalances in Federated Semi-supervised Learning with Dual Regulators
Federated learning has become a popular method to learn from decentralized heterogeneous data. Federated semi-supervised learning (FSSL) emerges to train models from a small fraction of labeled data due to label scarcity on decentralized clients. Existing FSSL methods assume independent and identically distributed (IID) labeled data across clients and consistent class distribution between labeled and unlabeled data within a client. This work studies a more practical and challenging scenario of FSSL, where data distribution is different not only across clients but also within a client between labeled and unlabeled data. To address this challenge, we propose a novel FSSL framework with dual regulators, FedDure. FedDure lifts the previous assumption with a coarse-grained regulator (C-reg) and a fine-grained regulator (F-reg): C-reg regularizes the updating of the local model by tracking the learning effect on labeled data distribution; F-reg learns an adaptive weighting scheme tailored for unlabeled instances in each client. We further formulate the client model training as bi-level optimization that adaptively optimizes the model in the client with two regulators. Theoretically, we show the convergence guarantee of the dual regulators. Empirically, we demonstrate that FedDure is superior to the existing methods across a wide range of settings, notably by more than 11% on CIFAR-10 and CINIC-10 datasets.
Saved in:
Published in: | arXiv.org, 2024-03 |
---|---|
Main authors: | Bai, Sikai; Li, Shuaicheng; Zhuang, Weiming; Zhang, Jie; Guo, Song; Yang, Kunlin; Hou, Jun; Zhang, Shuai; Gao, Junyu; Shuai Yi |
Format: | Article |
Language: | eng |
Subjects: | Clients; Optimization; Semi-supervised learning |
Online access: | Full text |
container_title | arXiv.org |
---|---|
creator | Bai, Sikai; Li, Shuaicheng; Zhuang, Weiming; Zhang, Jie; Guo, Song; Yang, Kunlin; Hou, Jun; Zhang, Shuai; Gao, Junyu; Shuai Yi |
description | Federated learning has become a popular method to learn from decentralized heterogeneous data. Federated semi-supervised learning (FSSL) emerges to train models from a small fraction of labeled data due to label scarcity on decentralized clients. Existing FSSL methods assume independent and identically distributed (IID) labeled data across clients and consistent class distribution between labeled and unlabeled data within a client. This work studies a more practical and challenging scenario of FSSL, where data distribution is different not only across clients but also within a client between labeled and unlabeled data. To address this challenge, we propose a novel FSSL framework with dual regulators, FedDure. FedDure lifts the previous assumption with a coarse-grained regulator (C-reg) and a fine-grained regulator (F-reg): C-reg regularizes the updating of the local model by tracking the learning effect on labeled data distribution; F-reg learns an adaptive weighting scheme tailored for unlabeled instances in each client. We further formulate the client model training as bi-level optimization that adaptively optimizes the model in the client with two regulators. Theoretically, we show the convergence guarantee of the dual regulators. Empirically, we demonstrate that FedDure is superior to the existing methods across a wide range of settings, notably by more than 11% on CIFAR-10 and CINIC-10 datasets. |
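The abstract's dual-regulator idea can be illustrated with a deliberately simplified toy: a fine-grained weight per unlabeled instance and a coarse-grained scale on the unlabeled update driven by labeled-data fit. This is a hypothetical sketch only; the paper's actual C-reg and F-reg are learned inside a bi-level optimization, not the closed-form confidence heuristics used here, and all names (`f_weights`, `c_scale`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy client: a linear binary classifier w on 2-D features, with a small
# labeled set and a larger unlabeled set (mimicking label scarcity).
w = np.zeros(2)
X_lab = rng.normal(size=(8, 2))
y_lab = (X_lab[:, 0] > 0).astype(float)
X_unl = rng.normal(size=(32, 2))

for step in range(100):
    # Stand-in for F-reg: a per-instance weight for each unlabeled point,
    # here just confidence of the current pseudo-label (in [0, 1]).
    p_unl = sigmoid(X_unl @ w)
    pseudo = (p_unl > 0.5).astype(float)
    f_weights = np.abs(p_unl - 0.5) * 2.0

    # Weighted pseudo-label gradient on unlabeled data.
    g_unl = X_unl.T @ (f_weights * (p_unl - pseudo)) / len(X_unl)

    # Stand-in for C-reg: shrink the unlabeled update when the labeled
    # loss is high, i.e. track the learning effect on labeled data.
    p_lab = sigmoid(X_lab @ w)
    lab_loss = -np.mean(y_lab * np.log(p_lab + 1e-9)
                        + (1 - y_lab) * np.log(1 - p_lab + 1e-9))
    c_scale = 1.0 / (1.0 + lab_loss)

    g_lab = X_lab.T @ (p_lab - y_lab) / len(X_lab)
    w -= 0.5 * (g_lab + c_scale * g_unl)

# Training accuracy on the labeled points after local updates.
acc = float(np.mean((sigmoid(X_lab @ w) > 0.5) == y_lab.astype(bool)))
```

The design point the sketch mirrors is the coupling: the unlabeled-data update is never applied raw, but always modulated both per instance (F-reg) and globally per step (C-reg) by feedback from the labeled distribution.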
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-03 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_2836090540 |
source | Free E-Journals |
subjects | Clients; Optimization; Semi-supervised learning |
title | Combating Data Imbalances in Federated Semi-supervised Learning with Dual Regulators |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-10T16%3A44%3A12IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Combating%20Data%20Imbalances%20in%20Federated%20Semi-supervised%20Learning%20with%20Dual%20Regulators&rft.jtitle=arXiv.org&rft.au=Bai,%20Sikai&rft.date=2024-03-11&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2836090540%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2836090540&rft_id=info:pmid/&rfr_iscdi=true |