Fed-RAC: Resource-Aware Clustering for Tackling Heterogeneity of Participants in Federated Learning

Bibliographic Details
Published in: IEEE Transactions on Parallel and Distributed Systems, 2024-07, Vol. 35 (7), pp. 1207-1220
Authors: Mishra, Rahul; Gupta, Hari Prabhat; Banga, Garvit; Das, Sajal K.
Format: Article
Language: English
Online access: Order full text
Abstract: Federated Learning is a training framework that enables multiple participants to collaboratively train a shared model while preserving data privacy. The heterogeneity of the participants' devices and networking resources delays training and aggregation. This paper introduces a novel approach to federated learning that incorporates resource-aware clustering to address the challenges posed by the diverse devices and networking resources among participants. Unlike static clustering approaches, the paper proposes a dynamic method that determines the optimal number of clusters using the Dunn Index, enabling adaptability to varying levels of heterogeneity among participants and ensuring a responsive, customized clustering. Next, the paper goes beyond empirical observation by providing a mathematical derivation of the number of communication rounds required for convergence within each cluster. Further, a participant assignment mechanism ensures that devices and networking resources are allocated optimally. The approach also incorporates a leader-follower technique based on knowledge distillation, which improves the performance of lightweight models within clusters. Finally, experiments validate the approach and compare it with the state of the art: the results demonstrate an accuracy improvement of over 3% compared to the closest competitor and a reduction in communication rounds of around 10%.
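The abstract's dynamic selection of the cluster count via the Dunn Index can be illustrated with a small sketch. This is not the paper's implementation — the function names and the idea of comparing candidate partitions by key `k` are assumptions for illustration; the Dunn Index itself is the standard ratio of the minimum inter-cluster distance to the maximum intra-cluster diameter, with the best `k` being the one that maximises it.

```python
# Illustrative sketch (not the authors' code): pick the number of clusters
# whose partition maximises the Dunn Index.
import math


def dunn_index(clusters):
    """Dunn Index = (min distance between points of different clusters)
    / (max pairwise distance within any single cluster)."""
    # Largest intra-cluster diameter across all clusters.
    max_diam = max(
        (math.dist(p, q) for c in clusters for p in c for q in c),
        default=0.0,
    )
    # Smallest separation between points belonging to different clusters.
    min_sep = min(
        math.dist(p, q)
        for i, ci in enumerate(clusters)
        for cj in clusters[i + 1:]
        for p in ci
        for q in cj
    )
    return min_sep / max_diam if max_diam > 0 else float("inf")


def pick_num_clusters(candidate_partitions):
    """Given {k: partition_into_k_clusters}, return the k with the
    highest Dunn Index — a compact, well-separated clustering."""
    return max(candidate_partitions,
               key=lambda k: dunn_index(candidate_partitions[k]))
```

For example, two well-separated point groups yield a much higher Dunn Index when kept as two clusters than when one group is split further, so `pick_num_clusters` selects `k = 2` in that case.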
DOI: 10.1109/TPDS.2024.3379933
ISSN: 1045-9219
EISSN: 1558-2183
Source: IEEE Electronic Library (IEL)
Subjects: Adaptation models; Clustering; Clusters; Computational modeling; Distillation; Federated learning; Heterogeneity; Leader-follower technique; Mathematical models; Optimization; Performance evaluation; Resource-aware clustering; Servers; Training