Semi-supervised transformable architecture search for feature distillation

Bibliographic Details
Published in: Pattern Analysis and Applications (PAA), 2023-05, Vol. 26 (2), pp. 669-677
Main Authors: Zhang, Man; Zhou, Yong; Liu, Bing; Zhao, Jiaqi; Yao, Rui; Shao, Zhiwen; Zhu, Hancheng; Chen, Hao
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Abstract: The proposed method aims to perform image classification tasks efficiently and accurately. Unlike traditional CNN-based image classification methods, it is far less sensitive to the number of labels and the depth of the network. Although a deeper network can improve model accuracy, the training process is usually time-consuming and laborious. We show how to use only a small number of labels, design a more flexible network architecture, and combine it with a feature distillation method to improve model efficiency while maintaining high accuracy. Specifically, we encapsulate different network structures as independent individuals, which makes their use more flexible. Building on knowledge distillation, we extract channel features and establish a feature distillation connection from the teacher network to the student network. Comparisons with other popular related methods on commonly used datasets demonstrate the effectiveness of the method. The code can be found at https://github.com/ZhangXinba/Semi_FD.
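
To make the channel-level distillation step concrete, below is a minimal sketch of a teacher-to-student feature distillation loss, assuming a PyTorch setup. It is an illustration only, not the authors' implementation (see the GitHub repository above); the class name ChannelDistillLoss and the pooling-plus-MSE formulation are assumptions made for this sketch.

    # A minimal sketch of channel-level feature distillation (hypothetical,
    # not the paper's code; see https://github.com/ZhangXinba/Semi_FD).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ChannelDistillLoss(nn.Module):
        """Match per-channel statistics of a student feature map to a teacher's.

        Spatial dimensions are pooled away so only channel responses are
        compared; a 1x1 conv adapts the student when channel counts differ.
        """
        def __init__(self, student_channels: int, teacher_channels: int):
            super().__init__()
            self.adapt = (
                nn.Conv2d(student_channels, teacher_channels, kernel_size=1)
                if student_channels != teacher_channels else nn.Identity()
            )

        def forward(self, f_student: torch.Tensor, f_teacher: torch.Tensor) -> torch.Tensor:
            f_student = self.adapt(f_student)
            # Global average pooling -> one descriptor per channel, shape (N, C).
            s = F.adaptive_avg_pool2d(f_student, 1).flatten(1)
            t = F.adaptive_avg_pool2d(f_teacher, 1).flatten(1)
            # The teacher is treated as fixed, so its features are detached.
            return F.mse_loss(s, t.detach())

    # Usage: add this term to the supervised loss computed on the few labels.
    loss_fn = ChannelDistillLoss(student_channels=64, teacher_channels=128)
    f_s = torch.randn(8, 64, 32, 32)    # example student feature map
    f_t = torch.randn(8, 128, 32, 32)   # example teacher feature map
    print(loss_fn(f_s, f_t).item())

In such a setup the distillation term would typically be weighted against the supervised classification loss, letting unlabeled images contribute a training signal through the teacher's channel features.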
ISSN: 1433-7541 (print); 1433-755X (electronic)
DOI: 10.1007/s10044-022-01122-y