Efficient Re-parameterization Operations Search for Easy-to-Deploy Network Based on Directional Evolutionary Strategy

Traditional NAS methods improve performance by sacrificing the deployability of the architecture, and re-parameterization (Rep) technology is expected to solve this problem. However, most current Rep methods rely on prior knowledge to select the re-parameterization operations, which limits architecture performance to the chosen operation types and to that prior knowledge. At the same time, some re-parameterization operations hinder the optimization of the network. To break these restrictions, in this work an improved re-parameterization search space is designed that includes more types of re-parameterization operations; concretely, this search space can further enhance the performance of convolutional networks. An automatic re-parameterization enhancement strategy based on neural architecture search (NAS) is designed to explore this search space effectively and to find an excellent re-parameterization architecture. We then solve the optimization problem caused by using some re-parameterization operations to enhance ResNet-style networks. Besides, we visualize the output features of the architecture to analyze why the re-parameterization architecture takes the form it does. We achieve better results on public datasets: under the same training conditions as ResNet, we improve the accuracy of ResNet-50 by 1.82% on ImageNet-1k.
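
For readers unfamiliar with structural re-parameterization, the property behind the "easy-to-deploy" claim is that a multi-branch training-time block can be collapsed into a single convolution before deployment, so the searched architecture adds no inference cost. The sketch below is not the authors' code: it is a minimal NumPy illustration of that standard merging step for an assumed branch set (a 3x3 convolution, a 1x1 convolution, and an identity path); the helper conv2d and all variable names are introduced purely for illustration, and the operations in the paper's search space may differ.

```python
import numpy as np

def conv2d(x, w, b):
    """Direct 'same'-padded cross-correlation (what DL frameworks call convolution).
    x: (C_in, H, W); w: (C_out, C_in, k, k); b: (C_out,)."""
    c_out, c_in, k, _ = w.shape
    assert x.shape[0] == c_in
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    _, H, W = x.shape
    y = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(H):
            for j in range(W):
                y[o, i, j] = np.sum(w[o] * xp[:, i:i + k, j:j + k]) + b[o]
    return y

rng = np.random.default_rng(0)
C, H, W = 4, 8, 8                      # equal in/out channels so an identity branch is legal
x = rng.standard_normal((C, H, W))

# Training-time branches of one re-parameterizable block (hypothetical branch set).
w3, b3 = rng.standard_normal((C, C, 3, 3)), rng.standard_normal(C)   # 3x3 conv branch
w1, b1 = rng.standard_normal((C, C, 1, 1)), rng.standard_normal(C)   # 1x1 conv branch
wid, bid = np.zeros((C, C, 3, 3)), np.zeros(C)                       # identity branch written as a conv
for c in range(C):
    wid[c, c, 1, 1] = 1.0

y_train = conv2d(x, w3, b3) + conv2d(x, w1, b1) + conv2d(x, wid, bid)

# Deploy-time merge: pad the 1x1 kernel to 3x3, then sum kernels and biases.
w1_pad = np.zeros_like(w3)
w1_pad[:, :, 1:2, 1:2] = w1
w_dep, b_dep = w3 + w1_pad + wid, b3 + b1 + bid
y_deploy = conv2d(x, w_dep, b_dep)

print(np.allclose(y_train, y_deploy))  # True: one 3x3 conv reproduces the whole block
```

Because convolution is linear in its kernel and bias, the summed branch outputs equal one convolution whose kernel and bias are the shape-aligned sums of the branch parameters; batch-normalization layers, which commonly appear in re-parameterization branches, can likewise be folded into the preceding kernel and bias before this merge.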

Bibliographic Details
Published in: Neural Processing Letters, 2023-12, Vol. 55 (7), p. 8903-8926
Main authors: Yu, Xinyi; Wang, Xiaowei; Rong, Jintao; Zhang, Mingyang; Ou, Linlin
Format: Article
Language: English
Subjects: Accuracy; Artificial Intelligence; Complex Systems; Computational Intelligence; Computer Science; Datasets; Genetic algorithms; Optimization; Parameterization; Performance enhancement; Searching
DOI: 10.1007/s11063-023-11184-6
ISSN: 1370-4621
EISSN: 1573-773X
Publisher: Springer US, New York
Online access: Full text