Noisy Optimization: Fast Convergence Rates with Comparison-Based Algorithms

Bibliographic Details
Published in: arXiv.org 2016-04
Main authors: Cauwet, Marie-Liesse; Teytaud, Olivier
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Summary: Derivative Free Optimization is known to be an efficient and robust method for tackling black-box optimization problems. When it comes to noisy functions, classical comparison-based algorithms are slower than gradient-based algorithms. For quadratic functions, Evolutionary Algorithms without large mutations have a simple regret at best \(O(1/\sqrt{N})\), where \(N\) is the number of function evaluations, whereas stochastic gradient descent can reach (tightly) a simple regret of \(O(1/N)\). It has been conjectured that gradient approximation by finite differences (hence, not a comparison-based method) is necessary for reaching such an \(O(1/N)\) rate. We answer this conjecture in the negative, providing a comparison-based algorithm as good as gradient methods, i.e. reaching \(O(1/N)\), under the condition, however, that the noise is Gaussian. Experimental results confirm the \(O(1/N)\) simple regret, i.e., a squared rate compared to many published results at \(O(1/\sqrt{N})\).
ISSN: 2331-8422
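
Illustrative sketch (not part of the bibliographic record, and not the authors' algorithm): a minimal Python example, under assumed baselines, of how simple regret can be measured on a noisy quadratic as a function of the evaluation budget \(N\) referred to in the summary. The finite-difference SGD and the naive (1+1)-ES below, as well as the dimension, step sizes, mutation rule, and budgets, are assumptions for illustration only; the paper's contribution is a different comparison-based algorithm reaching \(O(1/N)\) under Gaussian noise.

```python
# Illustrative sketch, not the paper's algorithm: estimate simple regret on a
# noisy quadratic for two standard baselines, a finite-difference SGD and a
# naive comparison-based (1+1)-ES. All constants here are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
DIM = 5
NOISE_STD = 1.0

def noisy_eval(x):
    """Noisy quadratic: true value ||x||^2 plus additive Gaussian noise."""
    return float(x @ x + NOISE_STD * rng.normal())

def sgd_finite_diff(budget):
    """SGD with central finite-difference gradient estimates and ~1/t steps."""
    x = np.ones(DIM)
    h, evals, t = 0.1, 0, 0
    while evals + 2 * DIM <= budget:
        t += 1
        grad = np.zeros(DIM)
        for i in range(DIM):
            e = np.zeros(DIM)
            e[i] = h
            grad[i] = (noisy_eval(x + e) - noisy_eval(x - e)) / (2 * h)
            evals += 2
        x = x - grad / (2.0 * t)  # step ~ 1/(curvature * t) for this quadratic
    return float(x @ x)  # simple regret = true loss of the recommendation

def one_plus_one_es(budget):
    """Comparison-based (1+1)-ES: only the order of noisy values is used."""
    x = np.ones(DIM)
    sigma, evals = 0.5, 0
    while evals + 2 <= budget:
        y = x + sigma * rng.normal(size=DIM)
        if noisy_eval(y) < noisy_eval(x):  # one noisy comparison per iteration
            x, sigma = y, sigma * 1.1
        else:
            sigma *= 0.98
        evals += 2
    return float(x @ x)

if __name__ == "__main__":
    for budget in (1_000, 10_000, 100_000):
        sr_sgd = np.mean([sgd_finite_diff(budget) for _ in range(10)])
        sr_es = np.mean([one_plus_one_es(budget) for _ in range(10)])
        print(f"N={budget:6d}  SGD: {sr_sgd:.4f}  (1+1)-ES: {sr_es:.4f}")
```

One would expect the SGD column to shrink roughly as \(1/N\) while the naive comparison-based baseline stalls near the noise level; closing that gap without leaving the comparison-based setting is the contribution described in the summary above.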