AutoDFP: Automatic Data-Free Pruning via Channel Similarity Reconstruction

Structured pruning methods are developed to bridge the gap between the massive scale of neural networks and the limited hardware resources. Most current structured pruning methods rely on training datasets to fine-tune the compressed model, resulting in high computational burdens and being inapplicable for scenarios with stringent requirements on privacy and security. As an alternative, some data-free methods have been proposed; however, these methods often require hand-crafted parameter tuning and can only achieve inflexible reconstruction. In this paper, we propose the Automatic Data-Free Pruning (AutoDFP) method, which achieves automatic pruning and reconstruction without fine-tuning. Our approach is based on the assumption that the loss of information can be partially compensated by retaining focused information from similar channels. Specifically, we formulate data-free pruning as an optimization problem, which can be effectively addressed through reinforcement learning. AutoDFP assesses the similarity of channels in each layer and provides this information to the reinforcement learning agent, guiding the pruning and reconstruction of the network. We evaluate AutoDFP with multiple networks on multiple datasets, achieving impressive compression results. For instance, on the CIFAR-10 dataset, AutoDFP achieves a 2.87% smaller accuracy drop than the recently proposed data-free pruning method DFPC, with fewer FLOPs, on VGG-16. Furthermore, on the ImageNet dataset, AutoDFP achieves 43.17% higher accuracy than the SOTA method at the same 80% preserved ratio on MobileNet-V1.

Detailed Description

Bibliographic Details
Published in: arXiv.org 2024-03
Main authors: Li, Siqi; Chen, Jun; Xiang, Jingyang; Zhu, Chengrui; Liu, Yong
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator Li, Siqi
Chen, Jun
Xiang, Jingyang
Zhu, Chengrui
Liu, Yong
description Structured pruning methods are developed to bridge the gap between the massive scale of neural networks and the limited hardware resources. Most current structured pruning methods rely on training datasets to fine-tune the compressed model, resulting in high computational burdens and being inapplicable for scenarios with stringent requirements on privacy and security. As an alternative, some data-free methods have been proposed; however, these methods often require hand-crafted parameter tuning and can only achieve inflexible reconstruction. In this paper, we propose the Automatic Data-Free Pruning (AutoDFP) method, which achieves automatic pruning and reconstruction without fine-tuning. Our approach is based on the assumption that the loss of information can be partially compensated by retaining focused information from similar channels. Specifically, we formulate data-free pruning as an optimization problem, which can be effectively addressed through reinforcement learning. AutoDFP assesses the similarity of channels in each layer and provides this information to the reinforcement learning agent, guiding the pruning and reconstruction of the network. We evaluate AutoDFP with multiple networks on multiple datasets, achieving impressive compression results. For instance, on the CIFAR-10 dataset, AutoDFP achieves a 2.87% smaller accuracy drop than the recently proposed data-free pruning method DFPC, with fewer FLOPs, on VGG-16. Furthermore, on the ImageNet dataset, AutoDFP achieves 43.17% higher accuracy than the SOTA method at the same 80% preserved ratio on MobileNet-V1.
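The channel-similarity idea described in the abstract can be illustrated with a minimal sketch: measure pairwise similarity between a convolution layer's output channels, remove the channel that is most redundant, and fold its weights into its nearest neighbour as a crude data-free compensation. This is only an illustration of the general idea, not the authors' actual AutoDFP algorithm; the cosine-similarity metric, the function names, and the simple weight-folding reconstruction are all assumptions for this sketch (AutoDFP additionally uses a reinforcement learning agent, which is omitted here).

```python
import numpy as np

def channel_cosine_similarity(weight):
    """Pairwise cosine similarity between output channels of a conv weight.

    weight: array of shape (out_channels, in_channels, kH, kW).
    Returns an (out_channels, out_channels) similarity matrix.
    """
    flat = weight.reshape(weight.shape[0], -1)
    norms = np.linalg.norm(flat, axis=1, keepdims=True)
    unit = flat / np.clip(norms, 1e-12, None)
    return unit @ unit.T

def prune_most_similar_channel(weight):
    """Remove the channel whose nearest neighbour is most similar, and
    fold (add) its weights into that neighbour as a crude data-free
    compensation for the lost information."""
    sim = channel_cosine_similarity(weight)
    np.fill_diagonal(sim, -np.inf)          # ignore self-similarity
    i, j = np.unravel_index(np.argmax(sim), sim.shape)
    keep, drop = (i, j) if i < j else (j, i)
    merged = weight.copy()
    merged[keep] += merged[drop]            # compensate for the removed channel
    return np.delete(merged, drop, axis=0), drop
```

Repeating this step (or choosing how many channels to drop per layer, as AutoDFP's agent does) yields a pruned layer whose retained channels partially carry the information of the removed ones.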
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2024-03
issn 2331-8422
language eng
recordid cdi_proquest_journals_2956943397
source Free E-Journals
subjects Accuracy
Channels
Datasets
Neural networks
Pruning
Reconstruction
Similarity
title AutoDFP: Automatic Data-Free Pruning via Channel Similarity Reconstruction
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-04T02%3A29%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=AutoDFP:%20Automatic%20Data-Free%20Pruning%20via%20Channel%20Similarity%20Reconstruction&rft.jtitle=arXiv.org&rft.au=Li,%20Siqi&rft.date=2024-03-13&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2956943397%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2956943397&rft_id=info:pmid/&rfr_iscdi=true