Client-Side Optimization Strategies for Communication-Efficient Federated Learning


Bibliographic Details
Published in: IEEE Communications Magazine, 2022-07, Vol. 60 (7), p. 60-66
Main authors: Mills, Jed; Hu, Jia; Min, Geyong
Format: Magazine article
Language: English
Online access: Order full text
Description:
Federated learning (FL) is a swiftly evolving field within machine learning for collaboratively training models at the network edge in a privacy-preserving fashion, without training data leaving the devices where it was generated. The privacy-preserving nature of FL shows great promise for applications with sensitive data such as healthcare, finance, and social media. However, there are barriers to real-world FL at the wireless network edge, stemming from massive wireless parallelism and the high communication costs of model transmission. The communication cost of FL is heavily impacted by the heterogeneous distribution of data across clients, and some cutting-edge works attempt to address this problem using novel client-side optimization strategies. In this article, we provide a tutorial on model training in FL, and survey the recent developments in client-side optimization and how they relate to the communication properties of FL. We then perform a set of comparison experiments on a representative subset of these strategies, gaining insights into their communication-convergence trade-offs. Finally, we highlight challenges to client-side optimization and provide suggestions for future developments in FL at the wireless edge.
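The description above outlines FL's core training pattern: clients optimize the model locally on private data, and only model parameters (never raw data) are sent to a server that averages them. As a rough illustration of that pattern — a minimal sketch of federated averaging, not the article's own method — the following toy example trains a linear model across heterogeneous (non-IID) synthetic clients; all data, the model, and the hyperparameters are invented for illustration:

```python
# Minimal FedAvg sketch: clients run local gradient descent on private data;
# only model parameters are exchanged with the server, which averages them.
# All data, the linear model, and hyperparameters are illustrative assumptions.
import numpy as np

def local_update(w, X, y, lr=0.05, epochs=5):
    """Client-side optimization: a few epochs of full-batch gradient
    descent on the client's own data (squared-error linear model)."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(w_global, clients):
    """One communication round: broadcast the global model, collect each
    client's updated parameters, and average them weighted by data size."""
    updates = [local_update(w_global, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return np.average(np.stack(updates), axis=0, weights=sizes)

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

# Heterogeneous (non-IID) clients: each draws inputs from a different
# distribution, mimicking the skewed per-client data the article discusses.
clients = []
for shift in (-1.0, 0.0, 2.0):
    X = rng.normal(shift, 1.0, size=(50, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(50):  # 50 communication rounds
    w = fedavg_round(w, clients)
```

With noiseless labels every client shares the same optimum, so the averaged model converges toward `w_true`; with genuinely conflicting client objectives (the non-IID setting the article studies), plain averaging converges more slowly, which is what the surveyed client-side strategies try to mitigate.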
DOI: 10.1109/MCOM.005.210108
Publisher: New York: IEEE
ISSN: 0163-6804
EISSN: 1558-1896
Source: IEEE Electronic Library (IEL)
Subjects: Collaborative work; Communication; Computational modeling; Convergence; Data models; Electronic devices; Federated learning; Machine learning; Optimization; Privacy; Servers; Wireless communication; Wireless communications; Wireless networks