The Black-Box Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation
Format: Article
Language: English
Abstract: In this paper, we study the standard formulation of an optimization problem when the computation of the gradient is not available. Such a problem can be classified as a "black-box" optimization problem, since the oracle returns only the value of the objective function at the requested point, possibly with some stochastic noise. Assuming convexity and a higher order of smoothness of the objective function, this paper provides a zero-order accelerated stochastic gradient descent (ZO-AccSGD) method for solving this problem, which exploits the higher-order smoothness information via kernel approximation. As theoretical results, we show that the proposed ZO-AccSGD algorithm improves on the convergence results of state-of-the-art (SOTA) algorithms, namely the estimate of the iteration complexity. In addition, our theoretical analysis provides an estimate of the maximum allowable noise level at which the desired accuracy can still be achieved. Validation of our theoretical results is demonstrated both on a model function and on functions of interest in the field of machine learning. We also provide a discussion in which we explain the results obtained and the superiority of the proposed algorithm over SOTA algorithms for solving the original problem.
DOI: 10.48550/arxiv.2310.02371
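To make the abstract's two ingredients concrete, below is a minimal sketch (not the authors' exact ZO-AccSGD) of a kernel-weighted two-point gradient estimator built from function values only, plugged into a Nesterov-style accelerated loop. The kernel K(r) = 3r, the step size, the momentum value, and the noisy quadratic oracle are all illustrative assumptions, not taken from the paper; higher-degree kernels would be used to exploit higher-order smoothness.

```python
import numpy as np


def zo_kernel_gradient(f, x, h, rng):
    """Kernel-weighted two-point zero-order gradient estimate.

    Samples a uniform direction e on the unit sphere and r ~ U[-1, 1];
    K(r) = 3r is the simplest kernel with E[K(r)] = 0 and E[r K(r)] = 1
    under this sampling (higher-degree kernels exploit higher-order smoothness).
    """
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)                 # uniform direction on the unit sphere
    r = rng.uniform(-1.0, 1.0)
    K = 3.0 * r                            # kernel weight
    return d / (2.0 * h) * (f(x + h * r * e) - f(x - h * r * e)) * K * e


def zo_acc_sgd(f, x0, lr=0.01, momentum=0.9, h=1e-2, n_iter=2000, seed=0):
    """Illustrative accelerated loop driven by the zero-order estimator above."""
    rng = np.random.default_rng(seed)
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        y = x + momentum * (x - x_prev)        # Nesterov-style look-ahead point
        g = zo_kernel_gradient(f, y, h, rng)   # two oracle calls per iteration
        x_prev, x = x, y - lr * g
    return x


if __name__ == "__main__":
    # Hypothetical stochastic zero-order oracle: a noisy ill-conditioned quadratic.
    noise_rng = np.random.default_rng(1)
    A = np.diag([1.0, 10.0])
    f = lambda x: 0.5 * x @ A @ x + 1e-3 * noise_rng.normal()
    print(zo_acc_sgd(f, x0=[5.0, -3.0]))       # should approach the minimizer (0, 0)
```

Under these assumptions, the estimator is unbiased for linear objectives (E[e e^T] = I/d cancels the d factor), and the oracle is queried only for function values, matching the black-box setting described in the abstract.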