Federated Learning of Neural ODE Models with Different Iteration Counts
Federated learning is a distributed machine learning approach in which clients train models locally on their own data and upload the trained models to a server, so that training results are shared without uploading the raw data. Federated learning faces challenges such as communication size reduction and client heterogeneity: the former mitigates communication overheads, and the latter allows clients to choose models suited to their available compute resources. To address these challenges, we utilize Neural ODE based models for federated learning. The proposed flexible federated learning approach reduces communication size while aggregating models with different iteration counts or depths. Our contribution is an experimental demonstration that the proposed federated learning can aggregate models with different iteration counts or depths, together with a comparison against a different federated learning approach in terms of accuracy. Furthermore, we show that our approach reduces communication size by up to 89.4% compared with a baseline ResNet model on the CIFAR-10 dataset.
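The abstract's key premise is that a Neural ODE block applies one shared set of weights at every iteration, so models unrolled for different iteration counts have identical parameter shapes and can be averaged directly, while the per-round upload stays small. Below is a minimal PyTorch-style sketch of that idea; the names (`ODEBlock`, `fedavg`), the conv/ReLU form of the residual function, and the 1/N Euler step size are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: weight-shared ODE-style block + FedAvg over clients
# that run different iteration counts. Names and formulation are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ODEBlock(nn.Module):
    """A weight-shared conv block unrolled `iterations` times with an explicit
    Euler step (x <- x + h * f(x)), following the standard ResNet/Neural ODE
    correspondence. The iteration count changes compute depth, not shapes."""
    def __init__(self, channels: int, iterations: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.iterations = iterations  # local hyperparameter, never uploaded

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = 1.0 / self.iterations  # assumed step size, so different iteration
                                   # counts approximate the same underlying ODE
        for _ in range(self.iterations):
            x = x + h * F.relu(self.conv(x))
        return x

def fedavg(state_dicts):
    """Unweighted FedAvg. It applies unchanged to clients with different
    iteration counts because all clients upload identically shaped tensors."""
    return {k: torch.stack([sd[k].float() for sd in state_dicts]).mean(dim=0)
            for k in state_dicts[0]}

# Clients pick iteration counts to match their compute budgets; each uploads
# only one conv layer's worth of weights regardless of its effective depth.
clients = [ODEBlock(channels=16, iterations=n) for n in (2, 4, 8)]
aggregated = fedavg([c.state_dict() for c in clients])
for c in clients:
    c.load_state_dict(aggregated)  # broadcast the averaged model back
```

In this formulation the iteration count is a purely local hyperparameter rather than part of the communicated model, which is what makes aggregation across heterogeneous clients possible.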
Saved in:
Published in: | IEICE Transactions on Information and Systems 2024/06/01, Vol.E107.D(6), pp.781-791 |
---|---|
Main authors: | HOSHINO, Yuto; KAWAKAMI, Hiroki; MATSUTANI, Hiroki |
Format: | Article |
Language: | eng |
Subjects: | Clients; Communication; Federated learning; Heterogeneity; Machine learning; neural networks; neural ODE; Servers; Size reduction |
Online access: | Full text |
container_end_page | 791 |
---|---|
container_issue | 6 |
container_start_page | 781 |
container_title | IEICE Transactions on Information and Systems |
container_volume | E107.D |
creator | HOSHINO, Yuto; KAWAKAMI, Hiroki; MATSUTANI, Hiroki |
description | Federated learning is a distributed machine learning approach in which clients train models locally on their own data and upload the trained models to a server, so that training results are shared without uploading the raw data. Federated learning faces challenges such as communication size reduction and client heterogeneity: the former mitigates communication overheads, and the latter allows clients to choose models suited to their available compute resources. To address these challenges, we utilize Neural ODE based models for federated learning. The proposed flexible federated learning approach reduces communication size while aggregating models with different iteration counts or depths. Our contribution is an experimental demonstration that the proposed federated learning can aggregate models with different iteration counts or depths, together with a comparison against a different federated learning approach in terms of accuracy. Furthermore, we show that our approach reduces communication size by up to 89.4% compared with a baseline ResNet model on the CIFAR-10 dataset. |
doi_str_mv | 10.1587/transinf.2023EDP7176 |
format | Article |
identifier | ISSN: 0916-8532 |
ispartof | IEICE Transactions on Information and Systems, 2024/06/01, Vol.E107.D(6), pp.781-791 |
issn | 0916-8532 1745-1361 |
language | eng |
source | J-STAGE Free; EZB Electronic Journals Library |
subjects | Clients; Communication; Federated learning; Heterogeneity; Machine learning; neural networks; neural ODE; Servers; Size reduction |
title | Federated Learning of Neural ODE Models with Different Iteration Counts |