Knowledge distillation method, device and equipment of mask auto-encoder and storage medium


Detailed Description

Bibliographic Details
Main Authors: LIN LONG, CHU ZONGBO, DU ZEXU, LIU WEIWEI, ZHANG GUOLIANG, ZHANG YI
Format: Patent
Language: Chinese; English
Description
Abstract: The invention discloses a knowledge distillation method, device, equipment and storage medium for a mask auto-encoder. The method comprises the steps of: building a teacher model and a student model of the mask auto-encoder, where both are vision Transformer models and the teacher model is larger in scale than the student model; pre-training the teacher model; performing knowledge-distillation pre-training on the student model based on the pre-trained teacher model, so that the student model learns the data generalization ability of the pre-trained teacher model and obtains image features with stronger representation ability; and fine-tuning the pre-trained student model on downstream tasks. The resulting student model can be deployed on the power edge side, where computing resources are scarce; model parameters are reduced without loss of model precision, and real-time inference speed is increased.
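The abstract's core step, distillation pre-training in which the student regresses the frozen teacher's features on masked inputs, can be sketched in miniature. The patent uses vision Transformers; in this illustrative toy, small linear encoders stand in for the teacher and student encoders, and all dimensions, names, and the feature-MSE distillation loss are assumptions for demonstration, not the patent's actual method.

```python
# Toy sketch of feature-distillation pre-training for a masked auto-encoder.
# Linear encoders stand in for the ViT teacher/student; the student (plus a
# projection head) is trained to match the frozen teacher's features.
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_TEACHER, D_STUDENT = 16, 8, 4      # teacher is larger than the student

# "Pre-trained" teacher: a frozen projection stands in for a trained ViT encoder.
W_teacher = rng.normal(size=(D_IN, D_TEACHER))

# Smaller student encoder, plus a head projecting its features into the
# teacher's feature space so the two can be compared.
W_student = rng.normal(size=(D_IN, D_STUDENT)) * 0.1
W_proj = rng.normal(size=(D_STUDENT, D_TEACHER)) * 0.1

def mask_patches(x, ratio=0.75):
    """Zero out a random subset of input dims (stand-in for MAE patch masking)."""
    return x * (rng.random(x.shape) > ratio)

x = rng.normal(size=(32, D_IN))
xm = mask_patches(x)          # one fixed mask for this toy; MAE re-samples per batch
t = xm @ W_teacher            # frozen teacher features (no gradient)

def distill_step(lr=1.0):
    """One gradient step on MSE(projected student features, teacher features)."""
    global W_student, W_proj
    s = xm @ W_student
    p = s @ W_proj
    err = p - t
    loss = np.mean(err ** 2)
    g_p = 2.0 * err / err.size           # dLoss/dp
    g_proj = s.T @ g_p                   # gradients computed before any update
    g_student = xm.T @ (g_p @ W_proj.T)
    W_proj -= lr * g_proj
    W_student -= lr * g_student
    return loss

losses = [distill_step() for _ in range(500)]
print(f"distillation loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The falling loss shows the student absorbing the teacher's feature map despite its smaller capacity; in the patented pipeline this distilled student would then be fine-tuned on a downstream task and deployed at the power edge.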