Improvement of the Nelder-Mead method using Direct Inversion in Iterative Subspace

Bibliographic Details
Published in: Optimization and Engineering 2022-06, Vol. 23 (2), p. 1033-1055
Authors: Kitaoka, Haru; Amano, Ken-ichi; Nishi, Naoya; Sakka, Tetsuo
Format: Article
Language: English
Description
Abstract: The Nelder-Mead (NM) method is a popular derivative-free optimization algorithm owing to its fast convergence and robustness. However, the method often fails to converge, or requires a long runtime, for large-scale optimization. In the present study, the NM method has been improved using direct inversion in the iterative subspace (DIIS). DIIS is a technique that accelerates an optimization method by extrapolating a better intermediate solution from a linear combination of known ones. We compared runtimes of the new method (NM-DIIS) and the conventional NM method using unimodal test functions of various dimensions. The NM-DIIS method showed better results than the original NM method on average when the dimension of the objective function is high, and the long tails of the runtime distributions of the NM method disappeared when DIIS was applied. DIIS has also been implemented in the quasi-gradient method, an improved version of the NM method developed by Pham et al. [IEEE Trans. Ind. Informatics, 7 (2011) 592]. The combined method also performed well, especially on an upwardly convex test function. The present study proposes a practical optimization strategy and demonstrates the versatility of DIIS.
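As a generic illustration of the extrapolation step described in the abstract (a minimal sketch of standard DIIS, not the authors' NM-DIIS implementation), DIIS stores recent iterates and their residual vectors, solves a small constrained least-squares problem for mixing coefficients that sum to one, and returns the corresponding linear combination of the iterates. The choice of residual vector is problem-specific; the successive-difference residual used in the test below is a placeholder assumption.

```python
import numpy as np

def diis_extrapolate(xs, errs):
    """Generic DIIS extrapolation (sketch).

    xs   : list of stored iterates (1-D NumPy arrays)
    errs : list of residual vectors, one per iterate

    Finds coefficients c minimizing ||sum_i c_i * errs[i]|| subject to
    sum_i c_i = 1 (via a bordered linear system with a Lagrange
    multiplier), then returns x* = sum_i c_i * xs[i].
    """
    m = len(xs)
    # Bordered overlap matrix: B[i, j] = <e_i, e_j>, last row/column
    # enforce the constraint sum(c) = 1.
    B = np.zeros((m + 1, m + 1))
    for i, ei in enumerate(errs):
        for j, ej in enumerate(errs):
            B[i, j] = np.dot(ei, ej)
    B[m, :m] = 1.0
    B[:m, m] = 1.0
    rhs = np.zeros(m + 1)
    rhs[m] = 1.0
    c = np.linalg.solve(B, rhs)[:m]  # drop the Lagrange multiplier
    return sum(ci * xi for ci, xi in zip(c, xs))
```

With two iterates whose residuals are equal and opposite, the solver returns equal weights of 0.5 each, so the extrapolated point is the midpoint of the two iterates.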
ISSN:1389-4420
1573-2924
DOI:10.1007/s11081-021-09620-4