Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues

Federated learning (FL) offers a promising solution for effectively leveraging the data scattered across the distributed cloud system. Despite its potential, the huge communication overhead greatly burdens the distributed cloud system. Federated distillation (FD) is a novel distributed learning technique with low communication cost, in which the clients communicate only the model logits rather than the model parameters. However, FD faces challenges related to data heterogeneity and security, and its conventional aggregation method is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of the distributed cloud system. To address these issues, we propose a blockchain-based framework to achieve secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data distribution heterogeneity and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/workers selection strategy is devised to optimize the task allocation among clients. Experiments are conducted to evaluate the effectiveness of the proposed framework.

Detailed Description

Bibliographic Details
Published in: IEEE network, 2024-07, Vol.38 (4), p.151-157
Main authors: Wang, Xiaodong; Guan, Zhitao; Wu, Longfei; Gai, Keke
Format: Article
Language: English
container_end_page 157
container_issue 4
container_start_page 151
container_title IEEE network
container_volume 38
creator Wang, Xiaodong
Guan, Zhitao
Wu, Longfei
Gai, Keke
description Federated learning (FL) offers a promising solution for effectively leveraging the data scattered across the distributed cloud system. Despite its potential, the huge communication overhead greatly burdens the distributed cloud system. Federated distillation (FD) is a novel distributed learning technique with low communication cost, in which the clients communicate only the model logits rather than the model parameters. However, FD faces challenges related to data heterogeneity and security. Additionally, the conventional aggregation method in FD is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of distributed cloud system. To address these issues, we propose a blockchain-based framework to achieve secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data distribution heterogeneity and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/workers selection strategy is devised to optimize the task allocation among clients. Experimental evaluations are conducted to evaluate the effectiveness of the proposed framework.
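The abstract describes federated distillation, in which clients upload per-sample model logits on a shared public set instead of model parameters, and notes that conventional (mean-based) aggregation is vulnerable to malicious uploads. A minimal sketch of that idea, assuming a typical FD setup: all names are illustrative, and the coordinate-wise median is shown only as one possible robust alternative, not the paper's specific aggregation method.

```python
# Each client upload is a list of per-sample logit vectors for a shared
# public dataset: uploads[c][i][j] = client c's logit for sample i, class j.

def mean_aggregate(client_logits):
    """Conventional FD aggregation: element-wise mean of client logits."""
    n = len(client_logits)
    num_samples = len(client_logits[0])
    return [
        [sum(c[i][j] for c in client_logits) / n
         for j in range(len(client_logits[0][i]))]
        for i in range(num_samples)
    ]

def median_aggregate(client_logits):
    """Coordinate-wise median: one simple robust alternative that dampens
    the influence of a single malicious upload."""
    def median(xs):
        s = sorted(xs)
        m = len(s) // 2
        return s[m] if len(s) % 2 else (s[m - 1] + s[m]) / 2
    num_samples = len(client_logits[0])
    return [
        [median([c[i][j] for c in client_logits])
         for j in range(len(client_logits[0][i]))]
        for i in range(num_samples)
    ]

# Three honest clients and one malicious client, 1 public sample, 2 classes.
honest = [[[1.0, 0.0]], [[0.9, 0.1]], [[1.1, -0.1]]]
malicious = [[[-100.0, 100.0]]]
uploads = honest + malicious

print(mean_aggregate(uploads))    # the mean is dragged toward the outlier
print(median_aggregate(uploads))  # the median stays near the honest consensus
```

The contrast between the two print lines illustrates why the paper replaces plain mean aggregation with a robustness-enhanced method.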
doi_str_mv 10.1109/MNET.2024.3369406
format Article
fulltext fulltext_linktorsrc
identifier ISSN: 0890-8044
ispartof IEEE network, 2024-07, Vol.38 (4), p.151-157
issn 0890-8044
1558-156X
language eng
recordid cdi_crossref_primary_10_1109_MNET_2024_3369406
source IEEE Electronic Library (IEL)
subjects Blockchains
Cloud computing
Computational modeling
Data models
Distributed computing
Federated learning
Security
Servers
Training
title Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-30T12%3A39%3A56IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-crossref_RIE&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Toward%20Secure%20and%20Robust%20Federated%20Distillation%20in%20Distributed%20Cloud:%20Challenges%20and%20Design%20Issues&rft.jtitle=IEEE%20network&rft.au=Wang,%20Xiaodong&rft.date=2024-07&rft.volume=38&rft.issue=4&rft.spage=151&rft.epage=157&rft.pages=151-157&rft.issn=0890-8044&rft.eissn=1558-156X&rft.coden=IENEET&rft_id=info:doi/10.1109/MNET.2024.3369406&rft_dat=%3Ccrossref_RIE%3E10_1109_MNET_2024_3369406%3C/crossref_RIE%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rft_ieee_id=10443954&rfr_iscdi=true