SBPF: Sensitiveness Based Pruning Framework For Convolutional Neural Network On Image Classification
Format: Article
Language: English
Abstract: Pruning techniques are widely used to compress convolutional neural networks (CNNs) for image classification. However, most pruning methods require a well pre-trained model to provide supporting parameters, such as the ℓ1-norm, BatchNorm values, and gradient information, which may lead to inconsistent filter evaluation if the parameters of the pre-trained model are not well optimized. We therefore propose a sensitiveness based method that evaluates the importance of each layer from the perspective of inference accuracy by adding extra damage to the original model. Because accuracy is determined by the distribution of parameters across all layers rather than by any individual parameter, the sensitiveness based evaluation is robust to parameter updates; that is, an imperfectly trained model and a fully trained model yield a similar importance ranking of the convolutional layers. For VGG-16 on CIFAR-10, even when the original model is trained for only 50 epochs, we obtain the same evaluation of layer importance as when the model is fully trained. Filters are then removed from each layer in proportion to its quantified sensitiveness. Our sensitiveness based pruning framework is verified to be efficient on VGG-16, a customized Conv-4, and ResNet-18, with CIFAR-10, MNIST, and CIFAR-100, respectively.
DOI: 10.48550/arxiv.2208.04588
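To make the idea in the abstract concrete, the following is a minimal PyTorch sketch of a sensitiveness-style evaluation: each convolutional layer is "damaged" in turn, the resulting drop in inference accuracy is recorded, and layers whose damage barely hurts accuracy receive larger pruning ratios. The function names, the choice of Gaussian noise scaled by the layer's weight standard deviation, the noise scale, and the drop-to-ratio mapping are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch of sensitiveness-based layer evaluation and proportional pruning ratios.
import torch
import torch.nn as nn


@torch.no_grad()
def accuracy(model, loader, device="cpu"):
    """Top-1 accuracy of `model` on `loader`."""
    model.eval()
    correct = total = 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        correct += (model(x).argmax(dim=1) == y).sum().item()
        total += y.numel()
    return correct / max(total, 1)


@torch.no_grad()
def layer_sensitiveness(model, loader, noise_scale=0.5, device="cpu"):
    """Accuracy drop per conv layer when its weights are damaged with noise."""
    base_acc = accuracy(model, loader, device)
    drops = {}
    for name, module in model.named_modules():
        if not isinstance(module, nn.Conv2d):
            continue
        saved = module.weight.detach().clone()
        # "Damage": add Gaussian noise scaled to this layer's own weight std.
        module.weight.add_(noise_scale * saved.std() * torch.randn_like(saved))
        drops[name] = base_acc - accuracy(model, loader, device)
        module.weight.copy_(saved)  # restore the original weights
    return drops


def pruning_ratios(drops, max_ratio=0.6):
    """Assumed mapping: less sensitive layers (small drop) get larger ratios."""
    worst = max(drops.values()) or 1.0
    return {name: max_ratio * (1.0 - drop / worst) for name, drop in drops.items()}
```

Filters within each layer would then be removed according to these per-layer ratios; the abstract does not specify the within-layer selection rule, so that part is left open here.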