Decomposition in derivative-free optimization
Saved in:
Published in: | Journal of Global Optimization, 2021-10, Vol. 81 (2), p. 269-292 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | This paper proposes a novel decomposition framework for derivative-free optimization (DFO) algorithms. Our framework significantly extends the scope of current DFO solvers to larger-scale problems. We show that the proposed framework closely relates to the superiorization methodology that is traditionally used for improving the efficiency of feasibility-seeking algorithms for constrained optimization problems in a derivative-based setting. We analyze the convergence behavior of the framework in the context of global search algorithms. A practical implementation is developed and exemplified with the global model-based solver Stable Noisy Optimization by Branch and Fit (SNOBFIT) [36]. To investigate the decomposition framework's performance, we conduct extensive computational studies on a collection of over 300 test problems of varying dimensions and complexity. We observe significant improvements in the quality of solutions for a large fraction of the test problems. Regardless of problem convexity and smoothness, decomposition leads to over 50% improvement in the objective function after 2500 function evaluations for over 90% of our test problems with more than 75 variables. |
---|---|
ISSN: | 0925-5001; 1573-2916 |
DOI: | 10.1007/s10898-021-01051-w |
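
The abstract does not spell out the decomposition scheme itself. As a rough illustration of the general idea it describes (applying a derivative-free solver to lower-dimensional subproblems of a large problem), the sketch below cycles over blocks of variables and minimizes each block while the remaining variables stay fixed. The function names (`decomposed_dfo`, `f_block`), the random block-partitioning rule, and the use of Nelder-Mead as a stand-in inner solver for SNOBFIT are illustrative assumptions, not the paper's actual framework.

```python
# Minimal sketch of block-wise decomposition for derivative-free optimization.
# Assumption: Nelder-Mead is used here as a generic DFO inner solver in place
# of SNOBFIT; the paper's decomposition and superiorization details differ.
import numpy as np
from scipy.optimize import minimize


def decomposed_dfo(f, x0, n_blocks=4, outer_iters=10, inner_evals=100, seed=0):
    """Cycle over blocks of variables, minimizing each block with a
    derivative-free inner solver while the other variables stay fixed."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_iters):
        # Randomly partition the variable indices into n_blocks subspaces.
        for block in np.array_split(rng.permutation(x.size), n_blocks):
            def f_block(z, block=block):
                y = x.copy()
                y[block] = z  # overwrite only this block's variables
                return f(y)

            res = minimize(f_block, x[block], method="Nelder-Mead",
                           options={"maxfev": inner_evals})
            x[block] = res.x
    return x, f(x)


# Toy usage: a 100-variable quadratic with minimum at x = 1.
if __name__ == "__main__":
    quad = lambda x: float(np.sum((x - 1.0) ** 2))
    x_best, f_best = decomposed_dfo(quad, x0=np.zeros(100))
    print(round(f_best, 4))
```

Each outer pass costs roughly `n_blocks * inner_evals` function evaluations, so the inner budget controls the trade-off between per-block progress and how often the partition is refreshed; this is only one plausible way to arrange the loop.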