Toward gradient bandit-based selection of candidate architectures in AutoGAN
Neural architecture search (NAS) offers a new approach to designing the architecture of generative adversarial networks (GANs). The existing state-of-the-art algorithm, AutoGAN, discovers GAN architectures with reinforcement-learning (RL)-based NAS, but it does not take the performance differences between candidate architectures into consideration. In this paper, a new algorithm, ImprovedAutoGAN, is proposed on the basis of AutoGAN. We found that the choice of candidate architectures can affect the performance of the final network. Whereas AutoGAN selects candidate architectures uniformly at random during the architecture search, our method uses a gradient bandit algorithm to increase the probability of selecting networks with better performance. A temperature coefficient is also introduced into the algorithm to prevent the search from getting trapped in a local optimum. The GANs are searched in the same search space as AutoGAN, and the discovered GAN reaches a Fréchet inception distance (FID) of 11.60 on CIFAR-10, the best result among current RL-based NAS methods. Experiments also show that the discovered GAN has satisfying transferability.
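The abstract describes replacing AutoGAN's uniform random choice among candidate architectures with a gradient bandit rule whose softmax selection uses a temperature coefficient. The sketch below illustrates that general idea only; the class name, step size, reward definition, and update schedule are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

# Minimal sketch of gradient-bandit selection over candidate architectures,
# assuming the reward is some validation score of the candidate GAN; the
# paper's exact reward signal and update schedule are not reproduced here.
class GradientBanditSelector:
    def __init__(self, num_candidates, step_size=0.1, temperature=1.0):
        self.h = np.zeros(num_candidates)   # preference per candidate architecture
        self.step_size = step_size          # learning rate for preference updates
        self.temperature = temperature      # larger values keep selection closer to uniform
        self.baseline = 0.0                 # running mean of observed rewards
        self.count = 0

    def probabilities(self):
        # Softmax over preferences with a temperature coefficient; a higher
        # temperature flattens the distribution and discourages premature
        # convergence to a locally optimal candidate.
        z = self.h / self.temperature
        z = z - z.max()                     # numerical stability
        p = np.exp(z)
        return p / p.sum()

    def select(self, rng=np.random):
        # Sample a candidate index in proportion to its current probability.
        return int(rng.choice(len(self.h), p=self.probabilities()))

    def update(self, chosen, reward):
        # Standard gradient-bandit preference update with a reward baseline:
        # candidates that beat the running average gain selection probability.
        self.count += 1
        self.baseline += (reward - self.baseline) / self.count
        pi = self.probabilities()
        one_hot = np.zeros_like(self.h)
        one_hot[chosen] = 1.0
        self.h += self.step_size * (reward - self.baseline) * (one_hot - pi)

# Hypothetical usage: evaluate_candidate(idx) would train and score the chosen
# candidate GAN architecture and return that score as the reward.
# selector = GradientBanditSelector(num_candidates=10, temperature=2.0)
# for _ in range(search_steps):
#     idx = selector.select()
#     selector.update(idx, evaluate_candidate(idx))
```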
Saved in:
Published in: | Soft computing (Berlin, Germany), 2021-03, Vol.25 (6), p.4367-4378 |
---|---|
Main authors: | Fan, Yi; Zhou, Guoqiang; Shen, Jun; Dai, Guilan |
Format: | Article |
Language: | eng |
Subjects: | Algorithms; Artificial Intelligence; Computational Intelligence; Control; Engineering; Generative adversarial networks; Machine learning; Mathematical Logic and Foundations; Mechatronics; Methodologies and Application; Methods; Neural networks; Robotics; Searching |
Online access: | Full text |
container_end_page | 4378 |
---|---|
container_issue | 6 |
container_start_page | 4367 |
container_title | Soft computing (Berlin, Germany) |
container_volume | 25 |
creator | Fan, Yi; Zhou, Guoqiang; Shen, Jun; Dai, Guilan |
description | Neural architecture search (NAS) offers a new approach to designing the architecture of generative adversarial networks (GANs). The existing state-of-the-art algorithm, AutoGAN, discovers GAN architectures with reinforcement-learning (RL)-based NAS, but it does not take the performance differences between candidate architectures into consideration. In this paper, a new algorithm, ImprovedAutoGAN, is proposed on the basis of AutoGAN. We found that the choice of candidate architectures can affect the performance of the final network. Whereas AutoGAN selects candidate architectures uniformly at random during the architecture search, our method uses a gradient bandit algorithm to increase the probability of selecting networks with better performance. A temperature coefficient is also introduced into the algorithm to prevent the search from getting trapped in a local optimum. The GANs are searched in the same search space as AutoGAN, and the discovered GAN reaches a Fréchet inception distance (FID) of 11.60 on CIFAR-10, the best result among current RL-based NAS methods. Experiments also show that the discovered GAN has satisfying transferability. |
doi_str_mv | 10.1007/s00500-020-05446-x |
format | Article |
fulltext | fulltext |
identifier | ISSN: 1432-7643; EISSN: 1433-7479 |
ispartof | Soft computing (Berlin, Germany), 2021-03, Vol.25 (6), p.4367-4378 |
issn | 1432-7643; 1433-7479 |
language | eng |
recordid | cdi_proquest_journals_3121499721 |
source | ProQuest Central UK/Ireland; SpringerLink Journals - AutoHoldings; ProQuest Central |
subjects | Algorithms; Artificial Intelligence; Computational Intelligence; Control; Engineering; Generative adversarial networks; Machine learning; Mathematical Logic and Foundations; Mechatronics; Methodologies and Application; Methods; Neural networks; Robotics; Searching |
title | Toward gradient bandit-based selection of candidate architectures in AutoGAN |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-17T01%3A17%3A18IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Toward%20gradient%20bandit-based%20selection%20of%20candidate%20architectures%20in%20AutoGAN&rft.jtitle=Soft%20computing%20(Berlin,%20Germany)&rft.au=Fan,%20Yi&rft.date=2021-03-01&rft.volume=25&rft.issue=6&rft.spage=4367&rft.epage=4378&rft.pages=4367-4378&rft.issn=1432-7643&rft.eissn=1433-7479&rft_id=info:doi/10.1007/s00500-020-05446-x&rft_dat=%3Cproquest_cross%3E3121499721%3C/proquest_cross%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=3121499721&rft_id=info:pmid/&rfr_iscdi=true |