Rates of superlinear convergence for classical quasi-Newton methods


Bibliographic Details
Published in: arXiv.org 2021-06
Main authors: Rodomanov, Anton; Nesterov, Yurii
Format: Article
Language: English
Online access: Full text
Description
Abstract: We study the local convergence of classical quasi-Newton methods for nonlinear optimization. Although it was well established a long time ago that these methods converge superlinearly in the asymptotic regime, the corresponding rates of convergence have remained unknown. In this paper, we address this problem. We obtain the first explicit non-asymptotic rates of superlinear convergence for the standard quasi-Newton methods based on the updating formulas from the convex Broyden class. In particular, for the well-known DFP and BFGS methods, we obtain rates of the form \((\frac{n L^2}{\mu^2 k})^{k/2}\) and \((\frac{n L}{\mu k})^{k/2}\) respectively, where \(k\) is the iteration counter, \(n\) is the dimension of the problem, \(\mu\) is the strong convexity parameter, and \(L\) is the Lipschitz constant of the gradient.
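The two bounds in the abstract can be compared numerically. The sketch below, which is purely illustrative and uses hypothetical problem parameters \(n\), \(L\), and \(\mu\) (not taken from the paper), evaluates both rate expressions to show that the bounds become meaningful (i.e., drop below 1) only once \(k\) exceeds the condition-number-dependent threshold inside the parentheses, and that the DFP bound requires a larger \(k\) than the BFGS bound when \(L/\mu > 1\):

```python
import math

def bfgs_rate(n, L, mu, k):
    # BFGS superlinear rate bound from the abstract: (n*L / (mu*k))**(k/2)
    return (n * L / (mu * k)) ** (k / 2)

def dfp_rate(n, L, mu, k):
    # DFP superlinear rate bound from the abstract: (n*L**2 / (mu**2 * k))**(k/2)
    return (n * L ** 2 / (mu ** 2 * k)) ** (k / 2)

# Hypothetical parameters for illustration only (assumptions, not from the paper):
# dimension n = 10, gradient Lipschitz constant L = 1, strong convexity mu = 0.1.
n, L, mu = 10, 1.0, 0.1
for k in (150, 200, 2000):
    print(f"k={k}: BFGS bound={bfgs_rate(n, L, mu, k):.3e}, "
          f"DFP bound={dfp_rate(n, L, mu, k):.3e}")
```

For these values, the BFGS bound falls below 1 once \(k > nL/\mu = 100\), while the DFP bound needs \(k > nL^2/\mu^2 = 1000\), reflecting the worse dependence on the condition number \(L/\mu\) in the DFP rate.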
ISSN:2331-8422
DOI:10.48550/arxiv.2003.09174