Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Main Authors: | , , , , , , |
---|---|
Format: | Article |
Language: | eng |
Subjects: | |
Online Access: | Order full text |
Abstract: | Knowledge Distillation (KD), a learning paradigm in which a larger
teacher network guides a smaller student network, transfers dark knowledge from
the teacher to the student via logits or intermediate features, with the aim of
producing a well-performing lightweight model. Notably, many subsequent
feature-based KD methods have outperformed the earliest logit-based KD method
and iteratively produced numerous state-of-the-art distillation methods.
Nevertheless, recent work has uncovered the potential of the logit-based
approach, bringing the simple logit-based form of KD back into the limelight.
Features or logits? The two realize KD from entirely distinct perspectives, so
choosing between them is not straightforward. This paper provides a unified
perspective of feature alignment in order to obtain a better understanding of
their fundamental distinction. Inheriting the design philosophy and insights of
feature-based and logit-based methods, we introduce a block-wise logit
distillation framework that applies implicit logit-based feature alignment by
gradually replacing the teacher's blocks, using the resulting hybrid networks
as intermediate stepping-stone models that bridge the gap between the student
and the teacher. Our method obtains results comparable or superior to those of
state-of-the-art distillation methods. This paper demonstrates the great
potential of combining logits and features, and we hope it will inspire future
research to revisit KD from a higher vantage point. |
DOI: | 10.48550/arxiv.2411.01547 |
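
The block-wise, stepping-stone idea described in the abstract can be sketched in a few lines. Below is a minimal PyTorch-style illustration, not the authors' implementation: the module layout (`.blocks` / `.head`), the temperature, the equal loss weights, and the assumption that student and teacher blocks are shape-compatible at every split point are all placeholders for exposition.

```python
# Minimal sketch of logit-based KD plus block-wise "stepping-stone" hybrids.
# Assumptions (not from the paper's code): student/teacher expose a .blocks
# ModuleList and a .head classifier, blocks are shape-compatible at every
# split point, the teacher is frozen, and all loss terms get equal weight.
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Classic logit distillation: KL divergence between softened logits."""
    p_t = F.softmax(teacher_logits / T, dim=1)
    log_p_s = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)


def forward_net(blocks, head, x):
    """Run an input through a sequence of blocks and a classifier head."""
    h = x
    for block in blocks:
        h = block(h)
    return head(h)


def hybrid_logits(student, teacher, x, k):
    """Stepping-stone model: the first k blocks come from the student, the
    remaining blocks and the head from the teacher, so its logits sit between
    the pure student and the pure teacher."""
    h = x
    for block in student.blocks[:k]:
        h = block(h)
    for block in teacher.blocks[k:]:
        h = block(h)
    return teacher.head(h)


def blockwise_distillation_loss(student, teacher, x, y, T=4.0):
    """Task loss + vanilla logit KD + one logit-KD term per stepping-stone."""
    with torch.no_grad():
        t_logits = forward_net(teacher.blocks, teacher.head, x)

    s_logits = forward_net(student.blocks, student.head, x)
    loss = F.cross_entropy(s_logits, y) + kd_loss(s_logits, t_logits, T)

    # Each hybrid swaps one more teacher block for a student block, so the
    # student's intermediate features are aligned implicitly through logits.
    for k in range(1, len(student.blocks)):
        loss = loss + kd_loss(hybrid_logits(student, teacher, x, k), t_logits, T)
    return loss
```

In practice a small adapter layer would typically be needed where student features feed into teacher blocks; the sketch omits it for brevity.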