A semi‐supervised network based on feature embeddings for image classification

Deep learning approaches, including convolutional neural networks, are suitable for image classification tasks with well‐labelled data. Unfortunately, we do not always have sufficient labelled data. Recent methods attempt to leverage labelled and unlabelled data using fine‐tuning or transfer learning; however, these methods rely on low‐level image features. This article departs from recent works and proposes a new semi‐supervised learning network that comprises a convolutional branch and a neighbour cluster branch. We also introduce a new loss function that carefully optimizes the network according to the labelled/unlabelled data. In this way, we reduce the tendency to rely on low‐level features, which is the case in current methods. We use datasets from three different domains (hand‐written digits, natural images, and objects) to analyse the performance of our method. Experimental analysis shows that the network performs better by learning inherent discriminative features when unlabelled data are integrated into the model's training process. Our proposed approach also provides strong generalization in the context of transfer learning. Finally, this study shows that the proposed loss function optimizes the network to produce more efficient feature embeddings for domain adaptation.
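The abstract describes the method only at a high level. As a rough illustration of the general idea (not the authors' implementation), the PyTorch sketch below combines a convolutional branch that produces feature embeddings with a loss that adds a supervised cross-entropy term on labelled images to a neighbour-consistency term on unlabelled ones. The layer sizes, the nearest-neighbour stand-in for the neighbour cluster branch, and the weighting factor lambda_u are all assumptions made for this example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoBranchNet(nn.Module):
    def __init__(self, num_classes: int, embed_dim: int = 128):
        super().__init__()
        # Convolutional branch: maps an image to a feature embedding.
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, embed_dim),
        )
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = self.conv(x)                  # feature embedding
        return z, self.classifier(z)      # embedding and class logits


def combined_loss(model, x_lab, y_lab, x_unlab, lambda_u=1.0):
    # Supervised term: cross-entropy on the labelled batch.
    _, logits_l = model(x_lab)
    sup = F.cross_entropy(logits_l, y_lab)

    # Unsupervised term: pull each unlabelled embedding towards its
    # nearest neighbour in the batch (a simple stand-in for the
    # neighbour cluster branch described in the abstract).
    z_u, _ = model(x_unlab)
    z_u = F.normalize(z_u, dim=1)
    sim = z_u @ z_u.t()                   # cosine similarities
    sim.fill_diagonal_(float("-inf"))     # ignore self-similarity
    nn_idx = sim.argmax(dim=1)
    unsup = (1.0 - (z_u * z_u[nn_idx]).sum(dim=1)).mean()

    return sup + lambda_u * unsup
```

In practice one would sample a labelled and an unlabelled mini-batch at each training step and back-propagate the combined loss.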

Bibliographic Details
Published in: Expert Systems, 2022-05, Vol. 39 (4), p. n/a
Main authors: Nuhoho, Raphael Elimeli; Wenyu, Chen; Baffour, Adu Asare
Format: Article
Language: English
Subjects: Artificial neural networks; clustering; convolutional neural network; Deep learning; Domains; feature embedding; Image classification; Machine learning; semi‐supervised learning
ISSN: 0266-4720
EISSN: 1468-0394
DOI: 10.1111/exsy.12908
Publisher: Blackwell Publishing Ltd, Oxford
Online access: Full text