Arch-Graph: Acyclic Architecture Relation Predictor for Task-Transferable Neural Architecture Search


Detailed description

Saved in:
Bibliographic details
Main authors: Huang, Minbin, Huang, Zhijian, Li, Changlin, Chen, Xin, Xu, Hang, Li, Zhenguo, Liang, Xiaodan
Format: Article
Language: eng
Subjects:
Online access: Order full text
creator Huang, Minbin
Huang, Zhijian
Li, Changlin
Chen, Xin
Xu, Hang
Li, Zhenguo
Liang, Xiaodan
description Neural Architecture Search (NAS) aims to find efficient models for multiple tasks. Beyond seeking solutions for a single task, there is surging interest in transferring network design knowledge across multiple tasks. In this line of research, effectively modeling task correlations is vital yet highly neglected. Therefore, we propose Arch-Graph, a transferable NAS method that predicts task-specific optimal architectures with respect to given task embeddings. It leverages correlations across multiple tasks by using their embeddings as a part of the predictor's input for fast adaptation. We also formulate NAS as an architecture relation graph prediction problem, with the relational graph constructed by treating candidate architectures as nodes and their pairwise relations as edges. To enforce basic properties such as acyclicity in the relational graph, we add constraints to the optimization process, converting NAS into the problem of finding a Maximal Weighted Acyclic Subgraph (MWAS). Our algorithm then strives to eliminate cycles and establishes edges in the graph only when the rank results can be trusted. Through MWAS, Arch-Graph can effectively rank candidate models for each task with only a small budget to finetune the predictor. With extensive experiments on TransNAS-Bench-101, we show Arch-Graph's transferability and high sample efficiency across numerous tasks, beating many NAS methods designed for both single-task and multi-task search. It finds architectures in the top 0.16% and 0.29% on average on two search spaces under a budget of only 50 models.
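The MWAS idea in the abstract (keep a pairwise "A outranks B" edge only if it does not close a cycle in the relation graph) can be sketched as a greedy approximation. This is a hypothetical illustration, not the paper's actual algorithm: the function name `greedy_mwas`, the edge list, and the confidence weights are all invented for the example.

```python
from collections import defaultdict

def greedy_mwas(num_nodes, weighted_edges):
    """Greedy acyclic-subgraph sketch: consider directed edges in
    descending weight (confidence) order and keep an edge u->v only
    if no path v->u already exists, so the result stays acyclic."""
    adj = defaultdict(list)

    def reachable(src, dst):
        # Iterative DFS: is there already a path src -> dst?
        stack, seen = [src], {src}
        while stack:
            node = stack.pop()
            if node == dst:
                return True
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return False

    kept = []
    for u, v, w in sorted(weighted_edges, key=lambda e: -e[2]):
        if not reachable(v, u):  # adding u->v is safe iff no path v->u
            adj[u].append(v)
            kept.append((u, v, w))
    return kept

# Edges (u, v, weight): u->v means "architecture u outranks v" with
# the predictor's confidence as the weight. Edge (2, 0) would close
# the cycle 0->1->2->0, so the greedy pass drops it.
edges = [(0, 1, 0.9), (1, 2, 0.8), (2, 0, 0.7), (0, 2, 0.6)]
acyclic = greedy_mwas(3, edges)
```

Processing edges by descending confidence means that when a cycle must be broken, the least-trusted ranking edge is the one sacrificed, which matches the abstract's goal of only establishing edges whose rank results can be trusted.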
doi_str_mv 10.48550/arxiv.2204.05941
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2204.05941
language eng
recordid cdi_arxiv_primary_2204_05941
source arXiv.org
subjects Computer Science - Computer Vision and Pattern Recognition
Computer Science - Learning
title Arch-Graph: Acyclic Architecture Relation Predictor for Task-Transferable Neural Architecture Search