Auto-Meta: Automated Gradient Based Meta Learner Search

Fully automating machine learning pipelines is one of the key challenges of current artificial intelligence research, since practical machine learning often requires costly and time-consuming human-powered processes such as model design, algorithm development, and hyperparameter tuning. In this paper, we verify that automated architecture search synergizes with gradient-based meta-learning. We adopt progressive neural architecture search to find optimal architectures for meta-learners. The gradient-based meta-learner whose architecture was found automatically achieved state-of-the-art results on the 5-shot 5-way Mini-ImageNet classification problem with 74.65% accuracy, an 11.54% improvement over the result obtained by the first gradient-based meta-learner, MAML. To the best of our knowledge, this work is the first successful neural architecture search implementation in the context of meta-learning.
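The gradient-based meta-learning the abstract builds on (MAML-style) optimizes a shared initialization so that a few inner gradient steps adapt it well to a new task. A minimal illustrative sketch, not the paper's implementation: the 1-D regression task family, all function names, and the finite-difference meta-gradient (MAML proper differentiates through the inner step) are assumptions for illustration only.

```python
import random

def loss_grad(w, xs, ys):
    # Gradient of the mean squared error 0.5*(w*x - y)^2, averaged over
    # the batch, with respect to the scalar parameter w.
    return sum((w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

def inner_adapt(w0, xs, ys, alpha=0.1):
    # Inner loop: one task-specific gradient step from the shared init w0.
    return w0 - alpha * loss_grad(w0, xs, ys)

def meta_train(steps=2000, alpha=0.1, beta=0.05, seed=0):
    # Outer loop: move the shared init w0 so that ONE inner step performs
    # well on freshly sampled tasks (here, 1-D regressions y = a*x).
    # The meta-gradient is estimated by finite differences for brevity.
    rng = random.Random(seed)
    w0 = 5.0  # deliberately poor starting initialization
    for _ in range(steps):
        a = rng.uniform(-2.0, 2.0)                  # sample a task (slope a)
        xs = [rng.uniform(-1.0, 1.0) for _ in range(10)]
        ys = [a * x for x in xs]

        def post_adapt_loss(w):
            wa = inner_adapt(w, xs, ys, alpha)
            return 0.5 * sum((wa * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

        eps = 1e-4
        g = (post_adapt_loss(w0 + eps) - post_adapt_loss(w0 - eps)) / (2 * eps)
        w0 -= beta * g                              # meta-update of the init
    return w0
```

Because the sampled slopes are symmetric around zero, the learned initialization drifts toward 0, from which a single inner step adapts reasonably to any sampled task. The paper's contribution is searching over the *architecture* of such a meta-learner rather than this scalar toy.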

Detailed Description

Saved in:
Bibliographic Details
Main Authors: Kim, Jaehong; Lee, Sangyeul; Kim, Sungwan; Cha, Moonsu; Lee, Jung Kwon; Choi, Youngduck; Choi, Yongseok; Cho, Dong-Yeon; Kim, Jiwon
Format: Article
Language: English
Subjects:
Online Access: Order full text
creator Kim, Jaehong
Lee, Sangyeul
Kim, Sungwan
Cha, Moonsu
Lee, Jung Kwon
Choi, Youngduck
Choi, Yongseok
Cho, Dong-Yeon
Kim, Jiwon
description Fully automating machine learning pipelines is one of the key challenges of current artificial intelligence research, since practical machine learning often requires costly and time-consuming human-powered processes such as model design, algorithm development, and hyperparameter tuning. In this paper, we verify that automated architecture search synergizes with gradient-based meta-learning. We adopt the progressive neural architecture search \cite{liu:pnas_google:DBLP:journals/corr/abs-1712-00559} to find optimal architectures for meta-learners. The gradient-based meta-learner whose architecture was found automatically achieved state-of-the-art results on the 5-shot 5-way Mini-ImageNet classification problem with $74.65\%$ accuracy, which is an $11.54\%$ improvement over the result obtained by the first gradient-based meta-learner, MAML \cite{finn:maml:DBLP:conf/icml/FinnAL17}. To the best of our knowledge, this work is the first successful neural architecture search implementation in the context of meta-learning.
doi_str_mv 10.48550/arxiv.1806.06927
format Article
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.1806.06927
ispartof
issn
language eng
recordid cdi_arxiv_primary_1806_06927
source arXiv.org
subjects Computer Science - Artificial Intelligence
Computer Science - Computer Vision and Pattern Recognition
Computer Science - Learning
Statistics - Machine Learning
title Auto-Meta: Automated Gradient Based Meta Learner Search
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-16T10%3A24%3A29IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Auto-Meta:%20Automated%20Gradient%20Based%20Meta%20Learner%20Search&rft.au=Kim,%20Jaehong&rft.date=2018-06-11&rft_id=info:doi/10.48550/arxiv.1806.06927&rft_dat=%3Carxiv_GOX%3E1806_06927%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true