Advancements in Knowledge Distillation: Towards New Horizons of Intelligent Systems

The book provides timely coverage of the paradigm of knowledge distillation, an efficient approach to model compression. Knowledge distillation is positioned in the general setting of transfer learning, in which a lightweight student model is learned from a large teacher model. The book covers a vari...
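For orientation, the teacher-student setup described above is commonly realized with the classic soft-target distillation loss of Hinton et al. (2015). The following minimal PyTorch sketch is illustrative only and is not taken from the book; the temperature T and mixing weight alpha are assumed hyperparameters.

    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        # Soft targets: student matches the teacher's temperature-softened distribution.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: standard cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        # Weighted combination of soft and hard objectives.
        return alpha * soft + (1 - alpha) * hard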

Bibliographic Details
Main Authors: Pedrycz, Witold; Chen, Shyi-Ming
Format: Book
Language: English
Subjects:
Online Access: Full text