Filter Pruning by Switching to Neighboring CNNs with Good Attributes

Detailed Description

Saved in:
Bibliographic details
Published in: arXiv.org 2022-02
Main authors: He, Yang; Liu, Ping; Zhu, Linchao; Yang, Yi
Format: Article
Language: eng
Subjects:
Online access: Full text
container_title arXiv.org
creator He, Yang
Liu, Ping
Zhu, Linchao
Yang, Yi
description Filter pruning is an effective way to reduce the computational cost of neural networks. Existing methods show that allowing previously pruned filters to be updated restores model capacity and achieves better performance. However, during the iterative pruning process, the pruning criterion remains fixed even as the network weights are updated to new values. In addition, filter importance is typically evaluated using only the magnitude of each filter. Yet filters in a neural network do not act in isolation; they influence one another. As a result, the magnitude of a filter, which reflects only that individual filter, is not enough to judge its importance. To address these problems, we propose Meta-attribute-based Filter Pruning (MFP). First, to extend existing magnitude-based pruning criteria, we introduce a new set of criteria that consider the geometric distance between filters. Second, to explicitly assess the current state of the network, we adaptively select the most suitable pruning criterion via a meta-attribute, a property of the neural network at its current state. Experiments on two image classification benchmarks validate our method. For ResNet-50 on ILSVRC-2012, we reduce more than 50% of FLOPs with only a 0.44% top-5 accuracy loss.
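To make the contrast in the abstract concrete, here is a toy NumPy sketch (not the authors' implementation; function names and the tiny example are illustrative only) of the two kinds of criteria it mentions: a magnitude-only score, which looks at each filter in isolation, and a geometric-distance score, which considers how close a filter is to the other filters in the same layer.

```python
import numpy as np

def magnitude_scores(filters):
    """L2 norm of each filter: a magnitude-only importance measure.

    filters: array of shape (n_filters, ...) -- each row/slice is one filter.
    Filters with small norms look unimportant under this criterion.
    """
    flat = filters.reshape(len(filters), -1)
    return np.linalg.norm(flat, axis=1)

def distance_scores(filters):
    """Sum of Euclidean distances from each filter to all the others.

    A filter lying near the rest of the set (small total distance) is
    largely redundant -- its role can be covered by its neighbors -- so a
    geometric-distance criterion marks it as a pruning candidate even if
    its magnitude is large.
    """
    flat = filters.reshape(len(filters), -1)
    diffs = flat[:, None, :] - flat[None, :, :]   # pairwise differences
    return np.sqrt((diffs ** 2).sum(-1)).sum(axis=1)

# Toy layer with three flattened filters; filters 1 and 2 are duplicates.
filters = np.array([[0.0, 0.0],
                    [3.0, 4.0],
                    [3.0, 4.0]])
print(magnitude_scores(filters))  # filter 0 has the smallest norm
print(distance_scores(filters))   # filters 1 and 2 have the smallest totals
```

The two criteria disagree on this toy layer: magnitude would prune the zero filter, while the distance criterion flags one of the duplicated filters as redundant. MFP's contribution, per the abstract, is to choose among such criteria adaptively via a meta-attribute of the network's current state rather than fixing one criterion for the whole iterative process.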
doi_str_mv 10.48550/arxiv.1904.03961
format Article
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2022-02
issn 2331-8422
language eng
recordid cdi_arxiv_primary_1904_03961
source arXiv.org; Free E-Journals
subjects Artificial neural networks
Computer Science - Computer Vision and Pattern Recognition
Criteria
Image classification
Neural networks
Pruning
Switching theory
title Filter Pruning by Switching to Neighboring CNNs with Good Attributes