MP-FedCL: Multiprototype Federated Contrastive Learning for Edge Intelligence

Federated learning-assisted edge intelligence enables privacy protection in modern intelligent services. However, non-independent and identically distributed (non-IID) data distributions among edge clients can impair local model performance. The existing single-prototype-based strategy represents a class by the mean of its feature space. However, feature spaces are usually not clustered, and a single prototype may not represent a class well. Motivated by this, this article proposes a multiprototype federated contrastive learning approach (MP-FedCL), which demonstrates the effectiveness of a multiprototype strategy over a single-prototype strategy under non-IID settings, including both label and feature skewness. Specifically, a multiprototype computation strategy based on k-means is first proposed to capture different embedding representations for each class space, using multiple prototypes ([Formula Omitted] centroids) to represent a class in the embedding space. In each global round, the computed prototypes and their respective model parameters are sent to the edge server for aggregation into a global prototype pool, which is then sent back to all clients to guide their local training. Finally, local training on each client minimizes its own supervised learning task and learns from the shared prototypes in the global prototype pool through supervised contrastive learning, which encourages clients to learn knowledge related to their own classes from others and reduces the absorption of unrelated knowledge in each global iteration. Experimental results on MNIST, Digit-5, Office-10, and DomainNet show that our method outperforms multiple baselines, with average test accuracy improvements of about 4.6% and 10.4% under feature and label non-IID distributions, respectively.
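The two components of MP-FedCL that the abstract describes concretely are (1) per-class multiprototype computation via k-means on local feature embeddings and (2) a supervised contrastive loss that pulls a client's embeddings toward same-class prototypes in the global pool and away from other-class prototypes. The sketch below is a minimal reconstruction from the abstract alone, not the authors' implementation; the function names, the temperature `tau`, the number of prototypes per class, the SupCon-style normalization, and the use of scikit-learn's KMeans are assumptions.

```python
# Illustrative sketch of MP-FedCL's client-side steps, inferred from the
# abstract only. Not the paper's reference code; names and hyperparameters
# (k prototypes per class, temperature tau) are assumed for illustration.
import numpy as np
from sklearn.cluster import KMeans


def compute_class_prototypes(embeddings, labels, k=3, seed=0):
    """Cluster each class's embeddings into k centroids (its prototypes).

    embeddings: (N, D) array of feature vectors from the local model.
    labels:     (N,) array of integer class labels.
    Returns {class_id: (k, D) array of prototypes}.
    """
    prototypes = {}
    for c in np.unique(labels):
        class_emb = embeddings[labels == c]
        n_clusters = min(k, len(class_emb))  # guard against very small classes
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
        km.fit(class_emb)
        prototypes[int(c)] = km.cluster_centers_
    return prototypes


def prototype_contrastive_loss(z, y, global_pool, tau=0.5):
    """Supervised contrastive-style loss for one embedding z (D,) with label y
    against the global prototype pool {class_id: (k, D) prototypes}.

    Same-class prototypes act as positives; all other prototypes as negatives.
    """
    z = z / (np.linalg.norm(z) + 1e-12)
    classes = sorted(global_pool)
    protos = np.concatenate([global_pool[c] for c in classes], axis=0)
    protos = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-12)
    proto_labels = np.concatenate(
        [np.full(len(global_pool[c]), c) for c in classes])

    sims = np.exp(protos @ z / tau)   # similarity to every prototype
    pos = sims[proto_labels == y]     # same-class prototypes (positives)
    # Average log-ratio of positives over all prototypes (SupCon-style).
    return float(-np.mean(np.log(pos / sims.sum())))


# Toy usage with one simulated client; in the full protocol the server would
# merge the prototype dictionaries from all clients into the global pool.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(200, 16))
    lab = rng.integers(0, 4, size=200)
    global_pool = compute_class_prototypes(emb, lab, k=3)
    print(prototype_contrastive_loss(emb[0], int(lab[0]), global_pool))
```

In the full training loop described in the abstract, this contrastive term would be added to each client's ordinary supervised loss, and both the prototypes and the model parameters would be uploaded to the edge server every global round.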

Bibliographic Details
Published in: IEEE Internet of Things Journal, 2024-03, Vol. 11 (5), p. 8604-8623
Authors: Qiao, Yu; Munir, Md. Shirajum; Adhikary, Apurba; Le, Huy Q.; Raha, Avi Deb; Zhang, Chaoning; Hong, Choong Seon
Format: Article
Language: English
Subjects: Centroids; Clients; Cognitive tasks; Edge computing; Embedding; Intelligence; Iterative methods; Labels; Prototypes; Supervised learning
Online access: Full text
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), Piscataway
DOI: 10.1109/JIOT.2023.3320250
ISSN: 2327-4662
Source: IEEE Electronic Library (IEL)