OICSR: Out-In-Channel Sparsity Regularization for Compact Deep Neural Networks
Format: Article
Language: English
Online access: Order full text
Abstract: Channel pruning can significantly accelerate and compress deep neural networks. Many channel pruning works utilize structured sparsity regularization to zero out all the weights in some channels and automatically obtain a structure-sparse network during training. However, these methods apply structured sparsity regularization to each layer separately, omitting the correlations between consecutive layers. In this paper, we first combine one out-channel in the current layer and the corresponding in-channel in the next layer into a regularization group, namely an out-in-channel. Our proposed Out-In-Channel Sparsity Regularization (OICSR) considers correlations between successive layers to better retain the predictive power of the compact network. Training with OICSR thoroughly transfers discriminative features into a fraction of out-in-channels. Correspondingly, OICSR measures channel importance based on statistics computed from two consecutive layers rather than a single layer. Finally, a global greedy pruning algorithm is designed to remove redundant out-in-channels iteratively. Our method is comprehensively evaluated with various CNN architectures, including CifarNet, AlexNet, ResNet, DenseNet and PreActSeNet, on the CIFAR-10, CIFAR-100 and ImageNet-1K datasets. Notably, on ImageNet-1K, we reduce the FLOPs of ResNet-50 by 37.2% while outperforming the original model by 0.22% in top-1 accuracy.
DOI: 10.48550/arxiv.1905.11664
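
The abstract does not give the regularizer's formula, but its description of the out-in-channel group suggests a group-lasso form. Below is a minimal PyTorch sketch under that assumption: group i concatenates out-channel i of the current layer's weight with in-channel i of the next layer's weight, and the penalty sums the groups' L2 norms. The function name `oicsr_penalty` and the exact tensor layout are illustrative choices, not taken from the paper.

```python
import torch

def oicsr_penalty(w_cur: torch.Tensor, w_next: torch.Tensor) -> torch.Tensor:
    """Group-lasso penalty over out-in-channel groups (assumed form).

    w_cur:  current conv layer weight, shape (C_out, C_in, kH, kW)
    w_next: next conv layer weight,    shape (C_next, C_out, kH, kW)
    Group i concatenates out-channel i of w_cur with in-channel i of w_next,
    so the penalty drives whole out-in-channels toward zero together.
    """
    c_out = w_cur.shape[0]
    out_part = w_cur.reshape(c_out, -1)                  # rows = out-channels of layer l
    in_part = w_next.transpose(0, 1).reshape(c_out, -1)  # rows = in-channels of layer l+1
    groups = torch.cat([out_part, in_part], dim=1)       # one row per out-in-channel group
    return groups.norm(p=2, dim=1).sum()                 # sum of per-group L2 norms

# Usage: add the penalty for every pair of consecutive layers to the task loss,
# e.g. loss = task_loss + lam * sum(oicsr_penalty(a, b) for a, b in layer_pairs).
```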
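
Similarly, the abstract states that channel importance is measured from statistics of two consecutive layers and that redundant out-in-channels are removed by a global, greedy, iterative procedure. The sketch below assumes the importance score is the group L2 norm; it zeroes channels rather than physically removing them and omits the fine-tuning between pruning steps that such pipelines typically use. All names here (`channel_importance`, `global_greedy_prune`, `layer_pairs`) are illustrative, not from the paper.

```python
import torch

def channel_importance(w_cur: torch.Tensor, w_next: torch.Tensor) -> torch.Tensor:
    """Score each out-in-channel by the L2 norm of its joint group,
    i.e. statistics from two consecutive layers, not a single layer."""
    c_out = w_cur.shape[0]
    groups = torch.cat(
        [w_cur.reshape(c_out, -1), w_next.transpose(0, 1).reshape(c_out, -1)], dim=1
    )
    return groups.norm(p=2, dim=1)  # shape (C_out,)

def global_greedy_prune(layer_pairs, num_to_remove):
    """Iteratively zero out the globally least-important out-in-channel.

    layer_pairs: list of (w_cur, w_next) weight tensors of consecutive layers.
    Zeroing stands in for physical removal; fine-tuning between steps is omitted.
    """
    pruned = set()
    for _ in range(num_to_remove):
        best = None  # (pair index, channel index, score)
        for p, (w_cur, w_next) in enumerate(layer_pairs):
            scores = channel_importance(w_cur, w_next)
            for c in range(scores.numel()):
                if (p, c) in pruned:  # skip channels already zeroed
                    continue
                if best is None or scores[c] < best[2]:
                    best = (p, c, scores[c].item())
        if best is None:  # nothing left to prune
            break
        p, c, _ = best
        w_cur, w_next = layer_pairs[p]
        with torch.no_grad():
            w_cur[c].zero_()      # out-channel c of the current layer
            w_next[:, c].zero_()  # matching in-channel c of the next layer
        pruned.add((p, c))
    return sorted(pruned)
```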