Efficient Minimax Optimal Global Optimization of Lipschitz Continuous Multivariate Functions

In this work, we propose an efficient minimax optimal global optimization algorithm for multivariate Lipschitz continuous functions. To evaluate the performance of our approach, we utilize the average regret instead of the traditional simple regret, which, as we show, is not suitable for use in the...
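For context, the two regret notions mentioned in the abstract can be written out as follows (a standard formulation, not quoted from the paper): for queries $x_1,\dots,x_T$ of an objective $f$ with maximizer $x^*$,

```latex
r_T^{\mathrm{avg}} = \frac{1}{T}\sum_{t=1}^{T}\bigl(f(x^*) - f(x_t)\bigr),
\qquad
r_T^{\mathrm{simple}} = \min_{1 \le t \le T}\bigl(f(x^*) - f(x_t)\bigr).
```

Since the minimum of the nonnegative per-query gaps is at most their mean, $r_T^{\mathrm{simple}} \le r_T^{\mathrm{avg}}$, which is why a bound on the average regret directly implies one on the simple regret, as the abstract notes.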

Detailed description


Saved in:
Bibliographic details
Main authors: Gokcesu, Kaan, Gokcesu, Hakan
Format: Article
Language: eng
Subjects:
Online access: Order full text
creator Gokcesu, Kaan
Gokcesu, Hakan
description In this work, we propose an efficient minimax optimal global optimization algorithm for multivariate Lipschitz continuous functions. To evaluate the performance of our approach, we utilize the average regret instead of the traditional simple regret, which, as we show, is not suitable for use in the multivariate non-convex optimization because of the inherent hardness of the problem itself. Since we study the average regret of the algorithm, our results directly imply a bound for the simple regret as well. Instead of constructing lower bounding proxy functions, our method utilizes a predetermined query creation rule, which makes it computationally superior to the Piyavskii-Shubert variants. We show that our algorithm achieves an average regret bound of $O(L\sqrt{n}T^{-\frac{1}{n}})$ for the optimization of an $n$-dimensional $L$-Lipschitz continuous objective in a time horizon $T$, which we show to be minimax optimal.
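The description above contrasts the paper's predetermined query creation rule with Piyavskii-Shubert variants, which must maintain and maximize lower-bounding proxy functions. As an illustrative sketch only (the names `grid_queries` and `average_regret` and the uniform-grid rule are assumptions for illustration, not the authors' actual algorithm), the simplest predetermined query schedule is a fixed uniform grid over $[0,1]^n$, evaluated together with the average-regret measure the paper studies:

```python
import itertools

def grid_queries(n, T):
    """Illustrative predetermined query rule: ~T points on a uniform
    grid in [0,1]^n, fixed before any function evaluations.  Unlike
    Piyavskii-Shubert methods, no lower-bounding proxy is maintained."""
    k = max(1, round(T ** (1.0 / n)))  # points per coordinate axis
    axes = [[(i + 0.5) / k for i in range(k)] for _ in range(n)]
    return list(itertools.product(*axes))

def average_regret(f, f_max, queries):
    """Average regret over the horizon: mean of f_max - f(x_t)."""
    return sum(f_max - f(x) for x in queries) / len(queries)

# Example: a 2-D, L-Lipschitz objective whose maximum value 0 is
# attained at (0.5, 0.5).
L = 2.0
f = lambda x: -L * max(abs(x[0] - 0.5), abs(x[1] - 0.5))
qs = grid_queries(n=2, T=100)
print(len(qs), average_regret(f, 0.0, qs))
```

With $k \approx T^{1/n}$ points per axis, every point of the cube lies within Euclidean distance $\sqrt{n}/(2k)$ of some grid query, so the per-query gap for an $L$-Lipschitz objective is at most $L\sqrt{n}\,T^{-1/n}/2$, matching the $T^{-1/n}$ dimension scaling in the stated bound; the paper's contribution is an efficient adaptive-horizon rule achieving this minimax rate.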
doi_str_mv 10.48550/arxiv.2206.02383
format Article
fullrecord: PNX article record (source: arXiv, Open Access Repository); creation/publication date 2022-06-06; rights: http://arxiv.org/licenses/nonexclusive-distrib/1.0; open access: free_for_read. (Title, authors, abstract, DOI, and subjects in the raw record duplicate the fields listed elsewhere in this entry.)
fulltext fulltext_linktorsrc
identifier DOI: 10.48550/arxiv.2206.02383
language eng
recordid cdi_arxiv_primary_2206_02383
source arXiv.org
subjects Computer Science - Computational Complexity
Computer Science - Learning
Mathematics - Optimization and Control
Statistics - Machine Learning
title Efficient Minimax Optimal Global Optimization of Lipschitz Continuous Multivariate Functions
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-11T18%3A38%3A39IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-arxiv_GOX&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Efficient%20Minimax%20Optimal%20Global%20Optimization%20of%20Lipschitz%20Continuous%20Multivariate%20Functions&rft.au=Gokcesu,%20Kaan&rft.date=2022-06-06&rft_id=info:doi/10.48550/arxiv.2206.02383&rft_dat=%3Carxiv_GOX%3E2206_02383%3C/arxiv_GOX%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_id=info:pmid/&rfr_iscdi=true