Clothing Image Classification with a Dragonfly Algorithm Optimised Online Sequential Extreme Learning Machine
Published in: Fibres & Textiles in Eastern Europe, 2021-06, Vol. 29, No. 3(147), pp. 91-96
Main authors: , ,
Format: Article
Language: English
Online access: Full text
Abstract: This study addresses the low classification accuracy of clothing images. Using Fashion-MNIST as the clothing image dataset, we propose a clothing image classification technique based on an online sequential extreme learning machine (OSELM) optimised by the dragonfly algorithm (DA). First, the Fashion-MNIST images are converted into a numerical dataset by extracting features from the corresponding grey-scale images. Then, because the input weights and hidden-layer biases of an OSELM are generated randomly, the DA is used to optimise them, reducing the influence of random initialisation on the classification results. Finally, the optimised OSELM is applied to clothing image classification. Compared with seven other classification algorithms, the proposed DA-optimised OSELM reached 93.98% accuracy with 350 hidden nodes, outperforming the other algorithms configured with the same number of hidden nodes. A box-plot stability analysis showed that the DA-OSELM model produced no outliers, whereas some of the other models exhibited outliers or lower stability, validating the efficacy of the proposed solution.
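For readers who want to experiment with the idea described in the abstract, the sketch below illustrates the core mechanism: an extreme learning machine whose randomly generated input weights and hidden biases are tuned by a dragonfly-algorithm search instead of being left to chance. This is a minimal illustration under stated assumptions, not the authors' implementation: a batch least-squares ELM solve stands in for the online-sequential (OSELM) update, the DA behaviour coefficients and swarm settings are assumed, and a synthetic two-class problem replaces Fashion-MNIST.

```python
# Minimal sketch (assumption, not the authors' code): an extreme learning
# machine (ELM) whose input weights W and hidden biases b are tuned by a
# dragonfly-algorithm-style search. A batch least-squares solve stands in
# for the paper's online-sequential variant; hyper-parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def elm_fitness(p, d, n_hidden, X_tr, Y_tr, X_va, y_va):
    """Decode one dragonfly into (W, b), fit output weights, return accuracy."""
    W = p[:d * n_hidden].reshape(d, n_hidden)
    b = p[d * n_hidden:]
    H = 1.0 / (1.0 + np.exp(-(X_tr @ W + b)))         # sigmoid hidden layer
    beta = np.linalg.pinv(H) @ Y_tr                   # least-squares output weights
    H_va = 1.0 / (1.0 + np.exp(-(X_va @ W + b)))
    return np.mean(np.argmax(H_va @ beta, axis=1) == y_va)

def dragonfly_elm(X_tr, Y_tr, X_va, y_va, n_hidden=50, swarm=8, iters=15):
    d = X_tr.shape[1]
    dim = (d + 1) * n_hidden                          # flattened (W, b) per dragonfly
    pos = rng.uniform(-1.0, 1.0, (swarm, dim))        # dragonfly positions
    step = np.zeros_like(pos)                         # step vectors (velocities)

    def score(P):
        return np.array([elm_fitness(p, d, n_hidden, X_tr, Y_tr, X_va, y_va)
                         for p in P])

    fit = score(pos)
    s, a, c, f, e = 0.1, 0.1, 0.7, 1.0, 0.1           # behaviour weights (assumed)
    for t in range(iters):
        w = 0.9 - t * (0.5 / iters)                   # decaying inertia
        food, enemy = pos[fit.argmax()], pos[fit.argmin()]
        for i in range(swarm):
            S = -np.sum(pos[i] - pos, axis=0)         # separation
            A = step.mean(axis=0)                     # alignment
            C = pos.mean(axis=0) - pos[i]             # cohesion
            F = food - pos[i]                         # attraction to best
            E = enemy + pos[i]                        # distraction from worst
            step[i] = s * S + a * A + c * C + f * F + e * E + w * step[i]
            pos[i] = np.clip(pos[i] + step[i], -1.0, 1.0)
        fit = score(pos)
    best = pos[fit.argmax()]
    return best[:d * n_hidden].reshape(d, n_hidden), best[d * n_hidden:], fit.max()

# Toy usage on synthetic two-class data (Fashion-MNIST itself needs a loader).
X = rng.normal(size=(300, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]                                      # one-hot targets
W, b, acc = dragonfly_elm(X[:200], Y[:200], X[200:], y[200:])
print(f"validation accuracy after DA search: {acc:.3f}")
```

The design point the abstract hinges on is visible here: the least-squares solve for the output weights is deterministic, so classification quality depends entirely on the randomly drawn (W, b), and the swarm search replaces a single random draw with a guided one.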
ISSN: 1230-3666, 2300-7354
DOI: 10.5604/01.3001.0014.7793