Lightweight Deep CNN for Natural Image Matting via Similarity-Preserving Knowledge Distillation
Published in: IEEE Signal Processing Letters, 2020, Vol. 27, pp. 2139-2143
Main authors:
Format: Article
Language: English
Abstract: Recently, alpha matting has witnessed remarkable growth driven by wide and deep convolutional neural networks. However, previous deep learning-based alpha matting methods are too computationally expensive to be used in real environments, including mobile devices. In this letter, a lightweight natural image matting network with similarity-preserving knowledge distillation is developed. The similarity-preserving knowledge distillation makes the pairwise similarities of a compact student network match those of a teacher network. Measuring the pairwise similarity over spatial, channel, and batch units enables the teacher's knowledge to be transferred to the student. Based on the similarity-preserving knowledge distillation, we not only design a student network that is lighter and smaller than the teacher but also achieve performance superior to that of the same student network trained without knowledge distillation. In addition, the proposed algorithm can be seamlessly applied to various deep image matting algorithms. Therefore, our algorithm is effective for mobile applications (e.g., human portrait matting), which are in growing demand. The effectiveness of the proposed algorithm is verified on two public benchmark datasets.
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2020.3039952
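The abstract above describes matching teacher/student pairwise similarities measured over spatial, channel, and batch units. The letter gives the exact formulation; the code below is only a minimal PyTorch-style sketch of how such a similarity-preserving distillation term could look, assuming the teacher and student feature maps have been brought to the same shape (e.g., via a 1x1 projection on the student side). The function names, shape handling, and equal weighting of the three terms are illustrative assumptions, not the authors' published implementation.

```python
import torch
import torch.nn.functional as F


def _gram(x):
    """Row-normalized pairwise-similarity (Gram) matrix along the last two dims."""
    g = x @ x.transpose(-2, -1)
    return F.normalize(g, p=2, dim=-1)


def sp_matting_distill_loss(feat_t, feat_s):
    """Sketch of a similarity-preserving distillation loss over batch,
    channel, and spatial units for one teacher/student feature-map pair.
    Assumes both maps share the same (B, C, H, W) shape (an assumption
    of this sketch, not stated by the letter)."""
    feat_t = feat_t.detach()  # gradients should flow only into the student
    b, c, h, w = feat_t.shape

    # Batch unit: similarity between samples (B x B Gram matrices).
    loss_batch = F.mse_loss(_gram(feat_s.reshape(b, -1)),
                            _gram(feat_t.reshape(b, -1)))

    # Channel unit: per-sample similarity between channels (C x C).
    loss_chan = F.mse_loss(_gram(feat_s.reshape(b, c, -1)),
                           _gram(feat_t.reshape(b, c, -1)))

    # Spatial unit: per-sample similarity between positions (HW x HW).
    loss_spat = F.mse_loss(_gram(feat_s.reshape(b, c, -1).transpose(1, 2)),
                           _gram(feat_t.reshape(b, c, -1).transpose(1, 2)))

    return loss_batch + loss_chan + loss_spat
```

In training, a term like this would typically be added to the usual alpha-prediction loss with a weighting factor and computed at one or more matched teacher/student layers; the choice of layers and weights here is likewise an assumption rather than something taken from the letter.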