Knowledge Distillation with Contrastive Inter-Class Relationship
Due to their high computational cost, the application of deep neural networks (DNNs) to real-time tasks has been limited. A possible solution is to compress the model so that the demand for computational resources is decreased. A popular method for this is knowledge distillation (KD)....
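For context, the sketch below shows the standard knowledge-distillation objective (Hinton et al., 2015), in which a small student network is trained to match the teacher's softened class distribution alongside the usual supervised loss. This is a generic illustration only; the temperature `T` and mixing weight `alpha` are assumed hyperparameters, and the paper's contrastive inter-class relationship term is not reproduced here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: KL divergence between temperature-scaled
    # student and teacher distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is comparable to the hard loss
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```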
Published in: Journal of Physics: Conference Series, 2021-02, Vol. 1756 (1), p. 12001
Format: Article
Language: English
Online access: Full text