Knowledge Distillation in Fourier Frequency Domain for Dense Prediction

Bibliographic Details
Published in: IEEE Signal Processing Letters, 2025, Vol. 32, pp. 296-300
Authors: Shi, Min; Zheng, Chengkun; Yi, Qingming; Weng, Jian; Luo, Aiwen
Format: Article
Language: English
Abstract: Knowledge distillation has been widely used to enhance student network performance for dense prediction tasks. Most previous knowledge distillation methods focus on valuable regions of the feature map in the spatial domain, ignoring the semantic information in the frequency domain. This work explores effective information representation of feature maps in the frequency domain and proposes a novel distillation method in the Fourier domain. This approach enhances the student's amplitude representation and transmits both original feature knowledge and global pixel relations. Experiments on object detection and semantic segmentation tasks, covering both homogeneous and heterogeneous distillation, demonstrate significant improvements for the student network. For instance, the ResNet50-RepPoints detector and ResNet18-PspNet segmenter achieve 4.2% AP and 5.01% mIoU improvements on the COCO2017 and CityScapes datasets, respectively.
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3515795
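
The letter does not publish code in this record, but the idea described in the abstract (matching teacher and student feature maps in the Fourier domain, with emphasis on the amplitude spectrum) can be sketched as a simple PyTorch distillation loss. The sketch below is an illustration under stated assumptions, not the authors' implementation: the L1 distance, the phase term, the weighting factor phase_weight, and the assumption that student and teacher features already share the same shape (e.g. after a 1x1 projection on the student branch) are all choices made here for clarity.

    import torch
    import torch.nn.functional as F

    def fourier_distillation_loss(student_feat, teacher_feat, phase_weight=0.5):
        # Hypothetical sketch of a Fourier-domain feature-distillation loss.
        # Inputs are (N, C, H, W) feature maps; channel counts are assumed to
        # already match (e.g. via a 1x1 projection on the student branch).
        s_freq = torch.fft.fft2(student_feat, norm="ortho")  # 2-D FFT over H, W
        t_freq = torch.fft.fft2(teacher_feat, norm="ortho")

        # Amplitude carries the global energy distribution; phase carries structure.
        amp_loss = F.l1_loss(torch.abs(s_freq), torch.abs(t_freq))
        pha_loss = F.l1_loss(torch.angle(s_freq), torch.angle(t_freq))

        # Emphasize amplitude alignment, as the abstract suggests; the phase term
        # and its weight are assumptions, not the paper's recipe.
        return amp_loss + phase_weight * pha_loss

    # Example usage with random stand-ins for teacher/student feature maps.
    if __name__ == "__main__":
        student = torch.randn(2, 256, 32, 32)
        teacher = torch.randn(2, 256, 32, 32)
        print(fourier_distillation_loss(student, teacher))

In practice such a loss term would be added to the task loss (detection or segmentation) during student training; how the paper balances the frequency-domain term against the task loss is not stated in this record.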