Byzantine-Resilient Federated Learning At Edge
Both Byzantine resilience and communication efficiency have attracted tremendous attention recently for their significance in edge federated learning. However, most existing algorithms may fail when dealing with real-world irregular data that behaves in a heavy-tailed manner. To address this issue, we study the stochastic convex and non-convex optimization problem for federated learning at edge and show how to handle heavy-tailed data while retaining the Byzantine resilience, communication efficiency and the optimal statistical error rates simultaneously. Specifically, we first present a Byzantine-resilient distributed gradient descent algorithm that can handle the heavy-tailed data and meanwhile converge under the standard assumptions. To reduce the communication overhead, we further propose another algorithm that incorporates gradient compression techniques to save communication costs during the learning process. Theoretical analysis shows that our algorithms achieve order-optimal statistical error rate in presence of Byzantine devices. Finally, we conduct extensive experiments on both synthetic and real-world datasets to verify the efficacy of our algorithms.
Saved in:
Published in: | IEEE transactions on computers 2023-09, Vol.72 (9), p.1-14 |
---|---|
Main authors: | Tao, Youming; Cui, Sijia; Xu, Wenlu; Yin, Haofei; Yu, Dongxiao; Liang, Weifa; Cheng, Xiuzhen |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Byzantine resilience; Communication; communication efficiency; Computational geometry; Convexity; Cost analysis; Distributed databases; edge intelligent systems; Error analysis; Federated learning; Heavily-tailed distribution; Machine learning; Optimization; Resilience; Servers; Training |
Online access: | Order full text |
container_end_page | 14 |
---|---|
container_issue | 9 |
container_start_page | 1 |
container_title | IEEE transactions on computers |
container_volume | 72 |
creator | Tao, Youming; Cui, Sijia; Xu, Wenlu; Yin, Haofei; Yu, Dongxiao; Liang, Weifa; Cheng, Xiuzhen |
description | Both Byzantine resilience and communication efficiency have attracted tremendous attention recently for their significance in edge federated learning. However, most existing algorithms may fail when dealing with real-world irregular data that behaves in a heavy-tailed manner. To address this issue, we study the stochastic convex and non-convex optimization problem for federated learning at edge and show how to handle heavy-tailed data while retaining the Byzantine resilience, communication efficiency and the optimal statistical error rates simultaneously. Specifically, we first present a Byzantine-resilient distributed gradient descent algorithm that can handle the heavy-tailed data and meanwhile converge under the standard assumptions. To reduce the communication overhead, we further propose another algorithm that incorporates gradient compression techniques to save communication costs during the learning process. Theoretical analysis shows that our algorithms achieve order-optimal statistical error rate in presence of Byzantine devices. Finally, we conduct extensive experiments on both synthetic and real-world datasets to verify the efficacy of our algorithms. |
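The description above names two techniques: a Byzantine-resilient gradient aggregation step that also tames heavy-tailed gradient noise, and a gradient-compression step that cuts communication cost. The sketch below illustrates the general idea with common, standard choices — norm clipping for heavy-tailed gradients, coordinate-wise median as the robust aggregator, and top-k sparsification as the compressor. These are illustrative assumptions only; the paper's actual estimators, aggregation rule, and compression scheme may differ.

```python
import numpy as np

def clip_gradient(g, tau):
    """Truncate a gradient to norm at most tau -- one standard way to
    control heavy-tailed gradient noise before aggregation."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else g * (tau / norm)

def robust_aggregate(grads):
    """Coordinate-wise median over device gradients: a classic
    Byzantine-resilient aggregator that tolerates a minority of
    devices sending arbitrary vectors."""
    return np.median(np.stack(grads), axis=0)

def top_k_compress(g, k):
    """Keep only the k largest-magnitude coordinates, zeroing the
    rest -- a common gradient-compression scheme for saving
    communication."""
    out = np.zeros_like(g)
    idx = np.argsort(np.abs(g))[-k:]
    out[idx] = g[idx]
    return out

# Toy round: four honest devices agree, one Byzantine device sends
# a huge adversarial vector that clipping and the median neutralize.
honest = [np.array([1.0, 2.0, 3.0]) for _ in range(4)]
byzantine = [np.array([1e6, -1e6, 1e6])]
grads = [clip_gradient(g, tau=10.0) for g in honest + byzantine]
agg = robust_aggregate(grads)        # close to the honest gradient
sparse = top_k_compress(agg, k=2)    # what a device would transmit
```

In this toy round the clipped Byzantine vector cannot move the per-coordinate median away from the honest gradient, which is the intuition behind order-optimal error rates in the presence of Byzantine devices.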
doi_str_mv | 10.1109/TC.2023.3257510 |
format | Article |
fulltext | fulltext_linktorsrc |
identifier | ISSN: 0018-9340 |
ispartof | IEEE transactions on computers, 2023-09, Vol.72 (9), p.1-14 |
issn | 0018-9340; 1557-9956 |
language | eng |
recordid | cdi_crossref_primary_10_1109_TC_2023_3257510 |
source | IEEE Electronic Library (IEL) |
subjects | Algorithms; Byzantine resilience; Communication; communication efficiency; Computational geometry; Convexity; Cost analysis; Distributed databases; edge intelligent systems; Error analysis; Federated learning; Heavily-tailed distribution; Machine learning; Optimization; Resilience; Servers; Training |
title | Byzantine-Resilient Federated Learning At Edge |