The Black-Box Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation

In this paper, we study the standard formulation of an optimization problem in which the gradient is not available for computation. Such a problem can be classified as a "black-box" optimization problem, since the oracle returns only the value of the objective function at the requested point, possibly corrupted by some stochastic noise. Assuming convexity and higher-order smoothness of the objective function, this paper provides a zero-order accelerated stochastic gradient descent (ZO-AccSGD) method for solving this problem, which exploits the higher-order smoothness information via kernel approximation. As our main theoretical result, we show that the proposed ZO-AccSGD algorithm improves on the convergence results of state-of-the-art (SOTA) algorithms, namely the estimate of iteration complexity. In addition, our theoretical analysis provides an estimate of the maximum allowable noise level at which the desired accuracy can still be achieved. We validate our theoretical results both on a model function and on functions of interest in the field of machine learning. We also provide a discussion explaining the results obtained and the superiority of the proposed algorithm over SOTA algorithms for solving the original problem.
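The method described in the abstract combines two ingredients: a kernel-smoothed two-point gradient estimator that exploits higher-order smoothness, and a Nesterov-type accelerated update driven by that estimator. The following Python sketch is a minimal illustration of this idea, not the authors' exact ZO-AccSGD: the function names, the beta = 2 kernel K(r) = 3r, and the step-size schedule are illustrative assumptions; the paper's algorithm uses higher-degree (Legendre-type) kernels and carefully tuned parameters.

```python
import numpy as np

def kernel_zo_gradient(f, x, h, rng, K=lambda r: 3.0 * r):
    """Kernel-smoothed two-point gradient estimate from function values only.

    Returns (d / (2h)) * (f(x + h*r*e) - f(x - h*r*e)) * K(r) * e, where
    e is uniform on the unit sphere and r is uniform on [-1, 1]. The
    default kernel K(r) = 3r satisfies E[r * K(r)] = 1 and corresponds to
    smoothness order beta = 2; higher orders would use higher-degree
    (Legendre-type) kernels. f may be a noisy zero-order oracle.
    """
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)          # uniform random direction on the sphere
    r = rng.uniform(-1.0, 1.0)
    return d / (2.0 * h) * (f(x + h * r * e) - f(x - h * r * e)) * K(r) * e

def zo_acc_sgd(f, x0, n_iters=1000, h=1e-2, L=1.0, seed=0):
    """Nesterov-style accelerated loop driven by the kernel estimator.

    The coupling weight and step sizes below follow a generic accelerated
    schedule, not the tuned parameters from the paper.
    """
    rng = np.random.default_rng(seed)
    x, z = x0.astype(float).copy(), x0.astype(float).copy()
    for t in range(n_iters):
        tau = 2.0 / (t + 2)                  # extrapolation weight
        y = tau * z + (1.0 - tau) * x        # query point
        g = kernel_zo_gradient(f, y, h, rng)
        x = y - g / L                        # "gradient" step
        z = z - (t + 2) / (2.0 * L) * g      # momentum / averaging step
    return x
```

As a quick smoke test under these assumptions, minimizing a simple quadratic from a fixed start, e.g. `zo_acc_sgd(lambda v: float(v @ v), np.ones(10))`, should drive the iterates toward the origin.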

Bibliographic Details
Main Authors: Lobanov, Aleksandr; Bashirov, Nail; Gasnikov, Alexander
Format: Article
Language: English
Subjects: Mathematics - Optimization and Control
DOI: 10.48550/arxiv.2310.02371
Date: 2023-10-03
Source: arXiv.org
Online Access: https://arxiv.org/abs/2310.02371