Contribution‐based Federated Learning client selection

Detailed Description

Bibliographic Details
Published in: International journal of intelligent systems 2022-10, Vol.37 (10), p.7235-7260
Main authors: Lin, Weiwei, Xu, Yinhai, Liu, Bo, Li, Dongdong, Huang, Tiansheng, Shi, Fang
Format: Article
Language: eng
Subjects:
Online access: Full text
container_end_page 7260
container_issue 10
container_start_page 7235
container_title International journal of intelligent systems
container_volume 37
creator Lin, Weiwei
Xu, Yinhai
Liu, Bo
Li, Dongdong
Huang, Tiansheng
Shi, Fang
description Federated Learning (FL), as a privacy‐preserving machine learning paradigm, has been thrust into the limelight. As a result of the physical bandwidth constraint, only a small number of clients are selected for each round of FL training. However, existing client selection solutions (e.g., the vanilla random selection) typically ignore the heterogeneous data value of the clients. In this paper, we propose the contribution‐based selection algorithm (Contribution‐Based Exponential‐weight algorithm for Exploration and Exploitation, CBE3), which dynamically updates the selection weights according to the impact of clients' data. As a novel component of CBE3, a scaling factor, which helps maintain a good balance between global model accuracy and convergence speed, is proposed to improve the algorithm's adaptability. Theoretically, we prove the regret bound of the proposed CBE3 algorithm, which bounds the performance gap between CBE3 and the optimal choice. Empirically, extensive experiments conducted on Non‐Independent Identically Distributed data demonstrate the superior performance of CBE3, with up to 10% accuracy improvement compared with K‐Center and Greedy and up to 100% faster convergence compared with the Random algorithm.
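
Illustrative sketch (not part of the record): the abstract frames CBE3 as a contribution-based variant of the EXP3 bandit algorithm that re-weights clients by the observed value of their data. The Python sketch below shows only the generic EXP3 select-and-update loop such a method builds on; the class name Exp3ClientSelector, the single-client-per-round simplification, the gamma exploration rate, and the reward definition (a per-round contribution normalized to [0, 1]) are assumptions for illustration, not the paper's CBE3, which additionally introduces a scaling factor and selects several clients per round.

import math
import random


class Exp3ClientSelector:
    """Minimal EXP3-style selector; a hypothetical sketch, not the paper's CBE3."""

    def __init__(self, num_clients, gamma=0.1):
        self.n = num_clients                  # number of candidate clients
        self.gamma = gamma                    # exploration rate in (0, 1]
        self.weights = [1.0] * num_clients    # one selection weight per client

    def _probabilities(self):
        # Mix the normalized weights with uniform exploration.
        total = sum(self.weights)
        return [(1.0 - self.gamma) * w / total + self.gamma / self.n
                for w in self.weights]

    def select(self):
        # Sample one client index from the mixed distribution.
        probs = self._probabilities()
        client = random.choices(range(self.n), weights=probs, k=1)[0]
        return client, probs

    def update(self, client, reward, probs):
        # Importance-weighted reward estimate keeps the update unbiased;
        # `reward` must be a contribution signal normalized to [0, 1].
        estimated = reward / probs[client]
        self.weights[client] *= math.exp(self.gamma * estimated / self.n)


# Hypothetical usage in a federated training loop:
#   selector = Exp3ClientSelector(num_clients=100, gamma=0.1)
#   for _ in range(num_rounds):
#       client, probs = selector.select()
#       reward = evaluate_contribution(client)   # placeholder, in [0, 1]
#       selector.update(client, reward, probs)
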
doi_str_mv 10.1002/int.22879
format Article
fulltext fulltext
identifier ISSN: 0884-8173
ispartof International journal of intelligent systems, 2022-10, Vol.37 (10), p.7235-7260
issn 0884-8173
1098-111X
language eng
recordid cdi_proquest_journals_2706177206
source Wiley Online Library Journals Frontfile Complete
subjects Algorithms
client
Clients
contribution‐based, EXP3
Convergence
Decision theory
Federated Learning
global models
Intelligent systems
Machine learning
Model accuracy
Non‐Independent Identically Distributed
Scaling factors
title Contribution‐based Federated Learning client selection
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-04T19%3A17%3A01IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Contribution%E2%80%90based%20Federated%20Learning%20client%20selection&rft.jtitle=International%20journal%20of%20intelligent%20systems&rft.au=Lin,%20Weiwei&rft.date=2022-10&rft.volume=37&rft.issue=10&rft.spage=7235&rft.epage=7260&rft.pages=7235-7260&rft.issn=0884-8173&rft.eissn=1098-111X&rft_id=info:doi/10.1002/int.22879&rft_dat=%3Cproquest_cross%3E2706177206%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2706177206&rft_id=info:pmid/&rfr_iscdi=true