A semi‐supervised network based on feature embeddings for image classification

Bibliographic details
Published in: Expert Systems 2022-05, Vol. 39 (4), p. n/a
Authors: Nuhoho, Raphael Elimeli; Wenyu, Chen; Baffour, Adu Asare
Format: Article
Language: English
Online access: Full text
Description
Abstract: Deep learning approaches, including convolutional neural networks, are suitable for image classification tasks with well-labelled data. Unfortunately, sufficiently labelled data are not always available. Recent methods attempt to leverage labelled and unlabelled data using fine-tuning or transfer learning. However, these methods rely on low-level image features. This article departs from recent works and proposes a new semi-supervised learning network that comprises a convolutional branch and a neighbour-cluster branch. We also introduce a new loss function that optimizes the network according to the labelled and unlabelled data. In this way, we reduce the reliance on low-level features that characterizes current methods. We use datasets from three different domains (hand-written digits, natural images, and objects) to analyse the performance of our method. Experimental analysis shows that the network performs better when unlabelled data are integrated into training, as it learns inherently discriminative features. Our proposed approach also generalizes well in the context of transfer learning. Finally, this study shows that the proposed loss function optimizes the network to produce more effective feature embeddings for domain adaptation.
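
Since the abstract only sketches the architecture and loss at a high level, the following is a minimal, hypothetical PyTorch sketch of the general idea: a convolutional feature extractor with an embedding head, trained with a loss that combines supervised cross-entropy on labelled images with an unsupervised consistency term on unlabelled embeddings. All class names, layer sizes, the perturbation, and the lambda_u weight are illustrative assumptions, not the authors' actual formulation.

# Hypothetical sketch only; not the implementation from Nuhoho et al. (2022).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoBranchNet(nn.Module):
    def __init__(self, num_classes=10, embed_dim=64):
        super().__init__()
        # Convolutional branch: extracts features from the input image.
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Embedding head: projects features into the space used by the unsupervised term.
        self.embed = nn.Linear(64, embed_dim)
        # Classifier head: produces class logits for the supervised term.
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = self.embed(self.conv(x))   # feature embedding
        return self.classifier(z), z   # (logits, embedding)

def semi_supervised_loss(model, x_lab, y_lab, x_unlab, lambda_u=0.5):
    """Cross-entropy on labelled data plus a consistency term on unlabelled data."""
    logits_lab, _ = model(x_lab)
    sup_loss = F.cross_entropy(logits_lab, y_lab)

    # Unsupervised term: embeddings of an unlabelled image and a weakly perturbed
    # copy should stay close (a simple stand-in for a neighbour/cluster objective).
    _, z_clean = model(x_unlab)
    _, z_noisy = model(x_unlab + 0.05 * torch.randn_like(x_unlab))
    unsup_loss = F.mse_loss(z_noisy, z_clean.detach())

    return sup_loss + lambda_u * unsup_loss

A training step under this sketch would compute loss = semi_supervised_loss(model, x_lab, y_lab, x_unlab) on a mixed batch and call loss.backward() before the optimizer step; the lambda_u weight controls how strongly the unlabelled data influence the learned embeddings.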
ISSN: 0266-4720; 1468-0394
DOI: 10.1111/exsy.12908