Knowledge Distillation with Contrastive Inter-Class Relationship

Due to their high computational cost, deep neural networks (DNNs) have seen limited use in real-time tasks. A possible solution is to compress the model so that the demand for computational resources is reduced. A popular method for this is knowledge distillation (KD)....
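For context, the sketch below shows the generic knowledge-distillation objective (softened teacher/student logits matched by KL divergence plus a cross-entropy term), which is what the abstract's "KD" refers to in general; it does not reproduce the paper's contrastive inter-class method, and the parameter names (temperature T, weight alpha) are illustrative assumptions rather than the authors' notation.

```python
# Minimal sketch of the classic KD loss (soft-target distillation), assuming a
# PyTorch setting. This is the generic baseline the abstract alludes to, not the
# contrastive inter-class variant proposed in the paper.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both distributions with temperature T and match them via KL divergence.
    soft_student = F.log_softmax(student_logits / T, dim=1)
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    distill = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    # Weighted combination of the distillation and supervised terms.
    return alpha * distill + (1.0 - alpha) * ce
```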


Bibliographic Details
Published in: Journal of Physics: Conference Series, 2021-02, Vol. 1756 (1), p. 12001
Main authors: Yang, Chaoyi; Zeng, Jijun; Zhang, Jinbo
Format: Article
Language: English
Subjects:
Online access: Full text