FedUTN: federated self-supervised learning with updating target network
Saved in:
Published in: | Applied intelligence (Dordrecht, Netherlands), 2023-05, Vol.53 (9), p.10879-10892 |
---|---|
Main authors: | Li, Simou; Mao, Yuxing; Li, Jian; Xu, Yihang; Li, Jinsen; Chen, Xueshuo; Liu, Siyang; Zhao, Xianping |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
container_end_page | 10892 |
---|---|
container_issue | 9 |
container_start_page | 10879 |
container_title | Applied intelligence (Dordrecht, Netherlands) |
container_volume | 53 |
creator | Li, Simou; Mao, Yuxing; Li, Jian; Xu, Yihang; Li, Jinsen; Chen, Xueshuo; Liu, Siyang; Zhao, Xianping |
description | Self-supervised learning (SSL) is capable of learning noteworthy representations from unlabeled data, which has mitigated the problem of insufficient labeled data to a certain extent. The original SSL methods centered on centralized data, but the growing awareness of privacy protection restricts the sharing of decentralized, unlabeled data generated by a variety of mobile devices, such as cameras, phones, and other terminals. Federated Self-supervised Learning (FedSSL) is the result of recent efforts to combine federated learning, which has traditionally been used for supervised learning, with SSL. Informed by past work, we propose a new FedSSL framework, FedUTN. This framework aims to permit each client to train a model that works well on both independent and identically distributed (IID) and non-IID data. Each party possesses two asymmetrical networks, a target network and an online network. FedUTN first aggregates the online network parameters of each terminal and then updates each terminal's target network with the aggregated parameters, a radical departure from the update technique utilized in earlier studies. In conjunction with this method, we offer a novel control algorithm to replace EMA for the training operation. After extensive trials, we demonstrate that: (1) it is feasible to utilize the aggregated online network to update the target network; (2) FedUTN's aggregation strategy is simpler, more effective, and more robust; and (3) FedUTN outperforms all other prevalent FedSSL algorithms, beating the SOTA algorithm by 0.5% ∼ 1.6% under regular experimental conditions. |
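The update rule the abstract describes — average the clients' online-network parameters on the server, then overwrite every client's *target* network with that aggregate instead of maintaining it via an EMA of the local online network — can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the function names are hypothetical, and plain lists of floats stand in for real model tensors.

```python
def aggregate_online_networks(client_online_params):
    """FedAvg-style average of the clients' online-network parameters.

    `client_online_params` is a list with one flat parameter list per
    client (a stand-in for real weight tensors).
    """
    n = len(client_online_params)
    length = len(client_online_params[0])
    return [sum(p[i] for p in client_online_params) / n for i in range(length)]


def update_target_networks(client_online_params):
    """One FedUTN-style round, as described in the abstract.

    The server averages the online networks, and every client's target
    network is replaced by the aggregated online parameters, rather
    than being updated as an EMA of that client's own online network.
    """
    aggregated = aggregate_online_networks(client_online_params)
    # Every client receives the same aggregate as its new target network.
    return {client_id: list(aggregated)
            for client_id in range(len(client_online_params))}
```

For contrast, a BYOL-style baseline would instead compute each client's new target as `tau * target + (1 - tau) * online` locally; the sketch above replaces that EMA step with the shared aggregate.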
doi_str_mv | 10.1007/s10489-022-04070-6 |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0924-669X |
ispartof | Applied intelligence (Dordrecht, Netherlands), 2023-05, Vol.53 (9), p.10879-10892 |
issn | 0924-669X; 1573-7497 |
language | eng |
recordid | cdi_proquest_journals_2815842562 |
source | SpringerLink Journals - AutoHoldings |
subjects | Algorithms; Artificial Intelligence; Computer Science; Control algorithms; Control theory; Electronic devices; Machines; Manufacturing; Mechanical Engineering; Parameters; Processes; Self-supervised learning; Terminals |
title | FedUTN: federated self-supervised learning with updating target network |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-08T18%3A12%3A44IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=FedUTN:%20federated%20self-supervised%20learning%20with%20updating%20target%20network&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Li,%20Simou&rft.date=2023-05-01&rft.volume=53&rft.issue=9&rft.spage=10879&rft.epage=10892&rft.pages=10879-10892&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-022-04070-6&rft_dat=%3Cproquest_cross%3E2815842562%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2815842562&rft_id=info:pmid/&rfr_iscdi=true |