Bag of Baselines for Multi-objective Joint Neural Architecture Search and Hyperparameter Optimization

Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts by automatically finding the architecture of the deep neural network to use and tuning the hyperparameters of the training pipeline. While both NAS and HPO have been studied extensively in recent years, NAS methods typically assume fixed hyperparameters and vice versa; there is little work on joint NAS + HPO. Furthermore, NAS has recently often been framed as a multi-objective optimization problem, in order to take, e.g., resource requirements into account. In this paper, we propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives. We hope that these methods will serve as simple baselines for future research on multi-objective joint NAS + HPO. To facilitate this, all our code is available at https://github.com/automl/multi-obj-baselines.
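To make the problem setting concrete, below is a minimal, self-contained Python sketch of the simplest conceivable baseline: random search over a joint architecture + hyperparameter space, returning a Pareto front over two objectives (validation error and parameter count). This is an illustration only, not the authors' implementation; the search space, the toy evaluate() stand-in, and all names are hypothetical.

    import random

    # Hypothetical joint search space: architectural choices and training
    # hyperparameters are drawn together rather than fixing one side.
    SEARCH_SPACE = {
        "n_layers":      [2, 4, 8, 16],       # architecture
        "n_channels":    [16, 32, 64, 128],   # architecture
        "learning_rate": [1e-4, 1e-3, 1e-2],  # training pipeline
        "batch_size":    [32, 64, 128],       # training pipeline
    }

    def sample_config(rng):
        # One joint NAS + HPO configuration.
        return {name: rng.choice(values) for name, values in SEARCH_SPACE.items()}

    def evaluate(config):
        # Toy stand-in for a real training run; returns the two objectives
        # (validation error, parameter count), both to be minimized.
        n_params = config["n_layers"] * config["n_channels"] ** 2
        error = 1.0 / (1.0 + 1e-5 * n_params) + abs(config["learning_rate"] - 1e-3)
        return (error, n_params)

    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one.
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(results):
        # Keep the configurations not dominated by any other evaluated one.
        return [(cfg, obj) for cfg, obj in results
                if not any(dominates(other, obj) for _, other in results)]

    rng = random.Random(0)
    results = [(cfg, evaluate(cfg)) for cfg in (sample_config(rng) for _ in range(50))]
    for cfg, (error, n_params) in pareto_front(results):
        print(f"error={error:.3f}  params={n_params:7d}  {cfg}")

Any of the paper's baselines can be read as a smarter replacement for sample_config above; the defining features of the setting are the joint search space and the fact that a Pareto front, rather than a single best configuration, is returned.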

Bibliographic Details
Published in: arXiv.org, 2021-05
Main authors: Guerrero-Viu, Julia; Hauns, Sven; Izquierdo, Sergio; Miotto, Guilherme; Schrodi, Simon; Biedenkapp, Andre; Elsken, Thomas; Deng, Difan; Lindauer, Marius; Hutter, Frank
Format: Article
Language: English
Subjects: Artificial neural networks; Machine learning; Multiple objective analysis; Neural networks; Optimization
Online access: Full text
Publisher: Cornell University Library, arXiv.org (Ithaca)
Published: 2021-05-03
Rights: http://creativecommons.org/licenses/by/4.0/
EISSN: 2331-8422
Source: Free E-Journals