Mesh adaptive direct search with second directional derivative-based Hessian update
Published in: Computational Optimization and Applications, 2015-12, Vol. 62 (3), pp. 693-715
Main Authors: , ,
Format: Article
Language: English
Online Access: Full text
Abstract: The subject of this paper is inequality-constrained black-box optimization with mesh adaptive direct search (MADS). The MADS search step can include additional strategies for accelerating convergence and improving the accuracy of the solution. The strategy proposed in this paper involves building a quadratic model of the objective function and linear models of the constraints. The quadratic model is built by means of a second directional derivative-based Hessian update, while the linear terms are obtained by linear regression. The resulting quadratic programming (QP) problem is solved with a dedicated solver, and the original functions are evaluated at the QP solution. The proposed search strategy is computationally less expensive than the quadratically constrained QP strategy in the state-of-the-art MADS implementation (NOMAD). The proposed MADS variant (QPMADS) and NOMAD are compared on four sets of test problems; QPMADS outperforms NOMAD on all four of them for all but the smallest computational budgets. (A hedged sketch of the directional-derivative Hessian update follows the record below.)
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-015-9753-5
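
The key ingredient described in the abstract is a quadratic model whose Hessian is corrected using measured second directional derivatives. The snippet below is a minimal sketch, assuming a central second-difference curvature estimate and a symmetric rank-one correction that enforces the measured curvature along a sampled direction; the function names, the step size `h`, and the specific rank-one form are illustrative assumptions, not the authors' exact update (which is specified in the paper).

```python
import numpy as np

def directional_curvature(f, x, d, h=1e-3):
    # Central second difference: three collinear evaluations estimate d^T H d.
    return (f(x + h * d) - 2.0 * f(x) + f(x - h * d)) / h**2

def sr1_curvature_update(H, d, sigma):
    # Symmetric rank-one correction so the updated model Hessian reproduces
    # the measured curvature sigma along direction d (d^T H_new d == sigma).
    dd = float(d @ d)
    curv_err = sigma - float(d @ (H @ d))
    return H + (curv_err / dd**2) * np.outer(d, d)

# Toy usage on f(x) = x1^2 + 3*x2^2, whose true Hessian is diag(2, 6).
f = lambda x: x[0]**2 + 3.0 * x[1]**2
x0 = np.array([1.0, -0.5])
H = np.eye(2)                                  # initial model Hessian
for d in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
    sigma = directional_curvature(f, x0, d)    # ~2 along e1, ~6 along e2
    H = sr1_curvature_update(H, d, sigma)
print(H)                                       # close to [[2, 0], [0, 6]]
```

In a QPMADS-style search step, a model Hessian maintained this way would be combined with linearly regressed constraint models and handed to a QP solver; only the curvature update itself is illustrated here.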