Clothing fashion style recognition with design issue graph

Bibliographic Details
Published in: Applied Intelligence (Dordrecht, Netherlands), 2021-06, Vol. 51 (6), p. 3548-3560
Main Authors: Yue, Xiaodong; Zhang, Cheng; Fujita, Hamido; Lv, Ying
Format: Article
Language: English
Online Access: Full text
Description
Abstract: Fashion style recognition of clothing images facilitates clothing retrieval and recommendation in e-commerce. It remains a challenging task because clothing images of the same style may have diverse visual appearances. Existing fashion style recognition methods use deep neural networks to classify clothing images based on pixel-level or region-level features. However, these local-region features lack the semantics of fashion issues and make style recognition sensitive to changes in clothing appearance. To tackle this problem, we construct Design Issue Graphs (DIGs) from clothing attributes to form global, semantic representations of fashion styles, and propose a joint fashion style recognition model consisting of two convolutional neural networks based on clothing images and DIGs. Experiments on the DeepFashion data sets validate that the proposed model is effective in recognizing clothing fashion styles of diverse appearances. Integrating DIGs into Deep Convolutional Neural Networks (DCNNs) achieves improvements of 1.75%, 0.99%, 1.03%, and 1.53% for multi-style recognition and 1.22%, 2.06%, 1.58%, and 2.20% for certain-style recognition in average accuracy, precision, recall, and F1-score, respectively.
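For intuition, below is a minimal PyTorch sketch of the kind of two-branch joint model the abstract describes: one CNN over the clothing image and a second CNN over a Design Issue Graph, here represented as a per-garment attribute adjacency matrix. The ResNet-18 backbone, the feature sizes, the adjacency-matrix encoding of the DIG, and the concatenation-based fusion are all illustrative assumptions; the paper's actual DIG construction and network details are not reproduced here.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class JointStyleModel(nn.Module):
    """Sketch of a two-CNN joint model: image branch + DIG branch.

    The DIG is assumed to be given as a (num_attributes x num_attributes)
    adjacency matrix per garment; the real construction is in the paper."""

    def __init__(self, num_styles: int):
        super().__init__()
        # Image branch: a standard ResNet-18 backbone (assumption).
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()                      # expose 512-d features
        self.image_cnn = backbone

        # Graph branch: a small CNN over the adjacency matrix,
        # treated as a one-channel "image".
        self.graph_cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),       # 32-d graph features
        )

        # Joint classifier over the concatenated branch features.
        self.classifier = nn.Linear(512 + 32, num_styles)

    def forward(self, image: torch.Tensor, dig: torch.Tensor) -> torch.Tensor:
        # image: (B, 3, 224, 224); dig: (B, 1, A, A) attribute adjacency
        img_feat = self.image_cnn(image)                 # (B, 512)
        graph_feat = self.graph_cnn(dig)                 # (B, 32)
        return self.classifier(torch.cat([img_feat, graph_feat], dim=1))

# Smoke test with random tensors: batch of 2, 50 attributes, 10 styles.
model = JointStyleModel(num_styles=10)
logits = model(torch.randn(2, 3, 224, 224), torch.rand(2, 1, 50, 50))
print(logits.shape)                                      # torch.Size([2, 10])
```

Concatenation is the simplest fusion scheme; the point of the second branch is that the DIG supplies attribute-level, global semantics that pixel- or region-level features lack, which the abstract credits for the roughly 1-2% metric gains.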
ISSN: 0924-669X (print); 1573-7497 (electronic)
DOI: 10.1007/s10489-020-01950-7