The Effect of Training Parameters and Mechanisms on Decentralized Federated Learning based on MNIST Dataset

Federated Learning is an algorithm suited for training models on decentralized data, but the requirement of a central "server" node is a bottleneck. In this document, we first introduce the notion of Decentralized Federated Learning (DFL). We then perform various experiments on different setups, such as changing the model aggregation frequency, switching from independent and identically distributed (IID) dataset partitioning to non-IID partitioning with partial global sharing, using different optimization methods across clients, and breaking models into segments with partial sharing. All experiments are run on the MNIST handwritten digits dataset. We observe that these altered training procedures are generally robust, albeit non-optimal. We also observe failures in training when the variance between model weights is too large. The open-source experiment code is available on GitHub at https://github.com/zhzhang2018/DecentralizedFL.
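
The abstract describes decentralized model aggregation without a central server and varying the aggregation frequency. The sketch below is an illustration only, not the authors' implementation (their code is at the GitHub link above): clients on an assumed ring topology run local steps and periodically average parameters with their neighbors. The client count, topology, toy local update, and the name aggregation_interval are assumptions made for this example.

import numpy as np

# Illustrative decentralized aggregation: every client keeps its own parameters
# and, every few local steps, averages them with its graph neighbors.
# All names and the toy "training" step below are hypothetical.

num_clients = 4
dim = 10                                   # toy parameter-vector length
rng = np.random.default_rng(0)

# Ring topology: client i talks to clients i-1 and i+1 (no central server).
neighbors = {i: [(i - 1) % num_clients, (i + 1) % num_clients]
             for i in range(num_clients)}

# Each client starts from different random parameters.
weights = [rng.normal(size=dim) for _ in range(num_clients)]

def local_step(w, lr=0.1):
    # Stand-in for one local training step; here it just shrinks the weights.
    return w - lr * w

aggregation_interval = 5                   # the "aggregation frequency" knob
for step in range(1, 21):
    weights = [local_step(w) for w in weights]
    if step % aggregation_interval == 0:
        # Synchronous neighbor averaging: each client mixes its own weights
        # with its neighbors' weights from before this aggregation round.
        weights = [
            np.mean([weights[i]] + [weights[j] for j in neighbors[i]], axis=0)
            for i in range(num_clients)
        ]

# With frequent enough aggregation the clients drift toward consensus.
print(max(np.linalg.norm(weights[0] - w) for w in weights))

Increasing aggregation_interval lets the local models drift further apart between averaging rounds, which relates to the paper's observation that training can fail when the variance between model weights becomes too large.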

Bibliographic Details
Main authors: Zhang, Zhuofan; Zhou, Mi; Niu, Kaicheng; Abdallah, Chaouki
Format: Article
Language: English
Date: 2021-08-07
Subjects: Computer Science - Learning
DOI: 10.48550/arxiv.2108.03508
Source: arXiv.org
Online access: https://arxiv.org/abs/2108.03508