Overcoming Forgetting in Federated Learning on Non-IID Data
We tackle the problem of Federated Learning in the non-i.i.d. case, in which local models drift apart, inhibiting learning. Building on an analogy with Lifelong Learning, we adapt a solution for catastrophic forgetting to Federated Learning. We add a penalty term to the loss function, compelling all local models to converge to a shared optimum. We show that this can be done in a communication-efficient way (adding no further privacy risks) that scales with the number of nodes in the distributed setting. Our experiments show that this method is superior to competing ones for image recognition on the MNIST dataset.
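To make the abstract's core idea concrete, below is a minimal sketch of an EWC-style drift penalty inside a federated round. This is an illustration in the spirit of the abstract, not the authors' exact algorithm: the logistic-regression loss, the diagonal importance weights `fisher_diag`, the hyperparameter `lam`, and the plain FedAvg averaging are all assumptions made for the example. Each client minimizes its local loss plus a quadratic term (lam/2) * sum_j F_j (w_j - w*_j)^2 that pulls its parameters back toward the shared model w*, damping the drift that non-i.i.d. data would otherwise cause.

```python
import numpy as np

def local_loss_grad(w, X, y):
    """Gradient of a plain logistic-regression loss on one client's data."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def penalized_step(w, w_global, fisher_diag, X, y, lam, lr=0.1):
    """One local gradient step; lam * fisher_diag * (w - w_global) is the
    gradient of the quadratic drift penalty pulling the client back toward
    the shared model."""
    grad = local_loss_grad(w, X, y) + lam * fisher_diag * (w - w_global)
    return w - lr * grad

def federated_round(w_global, clients, lam=1.0, local_steps=10):
    """Each client trains locally under the penalty, then the server averages
    the results (FedAvg-style aggregation, assumed here for simplicity)."""
    updated = []
    for X, y, fisher_diag in clients:
        w = w_global.copy()
        for _ in range(local_steps):
            w = penalized_step(w, w_global, fisher_diag, X, y, lam)
        updated.append(w)
    return np.mean(updated, axis=0)

# Tiny synthetic demo: two clients with deliberately non-i.i.d. inputs.
rng = np.random.default_rng(0)
d = 5
clients = []
for shift in (-1.0, 1.0):  # each client sees a differently shifted distribution
    X = rng.normal(loc=shift, scale=1.0, size=(100, d))
    y = (X @ np.ones(d) > 0).astype(float)
    clients.append((X, y, np.ones(d)))  # identity importance weights: an assumption

w = np.zeros(d)
for _ in range(20):
    w = federated_round(w, clients)
```

Note that only model-sized vectors cross the network in this sketch, consistent with the abstract's claim that the method is communication-efficient and adds no further privacy risks.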
Saved in:
Main Authors: | Shoham, Neta; Avidor, Tomer; Keren, Aviv; Israel, Nadav; Benditkis, Daniel; Mor-Yosef, Liron; Zeitak, Itai |
---|---|
Format: | Article |
Language: | English |
Subjects: | Computer Science - Cryptography and Security; Computer Science - Learning; Statistics - Machine Learning |
Online Access: | Order full text |
creator | Shoham, Neta ; Avidor, Tomer ; Keren, Aviv ; Israel, Nadav ; Benditkis, Daniel ; Mor-Yosef, Liron ; Zeitak, Itai |
description | We tackle the problem of Federated Learning in the non-i.i.d. case, in which local models drift apart, inhibiting learning. Building on an analogy with Lifelong Learning, we adapt a solution for catastrophic forgetting to Federated Learning. We add a penalty term to the loss function, compelling all local models to converge to a shared optimum. We show that this can be done in a communication-efficient way (adding no further privacy risks) that scales with the number of nodes in the distributed setting. Our experiments show that this method is superior to competing ones for image recognition on the MNIST dataset. |
doi_str_mv | 10.48550/arxiv.1910.07796 |
format | Article |
creationdate | 2019-10-17 |
rights | http://arxiv.org/licenses/nonexclusive-distrib/1.0 |
fulltext | fulltext_linktorsrc |
identifier | DOI: 10.48550/arxiv.1910.07796 |
language | eng |
recordid | cdi_arxiv_primary_1910_07796 |
source | arXiv.org |
subjects | Computer Science - Cryptography and Security ; Computer Science - Learning ; Statistics - Machine Learning |
title | Overcoming Forgetting in Federated Learning on Non-IID Data |
url | https://arxiv.org/abs/1910.07796 |