Numerical experience with a class of self-scaling quasi-Newton algorithms

Bibliographic Details
Published in: Journal of optimization theory and applications, 1998-03, Vol. 96 (3), p. 533-553
Author: AL-BAALI, M
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Self-scaling quasi-Newton methods for unconstrained optimization update the Hessian approximation by a formula that depends on two parameters (say, τ and θ) such that τ = 1, θ = 0, and θ = 1 yield the unscaled Broyden family, the BFGS update, and the DFP update, respectively. In previous work, conditions were obtained on these parameters that imply global and superlinear convergence for self-scaling methods on convex objective functions. This paper discusses the practical performance of several new algorithms designed to satisfy these conditions.
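
For illustration, the following Python sketch shows one step of the standard self-scaling Broyden-family Hessian update that the abstract parameterizes by τ and θ. The function name, signature, and curvature safeguard are assumptions made for this sketch and are not taken from the paper.

import numpy as np

def self_scaling_broyden_update(B, s, y, tau=1.0, theta=0.0):
    # One self-scaling Broyden-family update of the Hessian approximation B
    # (illustrative sketch, not the paper's code).
    # tau = 1 gives the unscaled Broyden family; theta = 0 gives BFGS and
    # theta = 1 gives DFP, matching the convention in the abstract.
    # s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k).
    Bs = B @ s
    sBs = float(s @ Bs)   # model curvature along the step s
    ys = float(y @ s)     # should be positive for a well-defined update
    if ys <= 0.0 or sBs <= 0.0:
        return B          # assumed safeguard: skip update if curvature fails
    w = np.sqrt(sBs) * (y / ys - Bs / sBs)
    return tau * (B - np.outer(Bs, Bs) / sBs + theta * np.outer(w, w)) \
        + np.outer(y, y) / ys

With tau = 1 and theta = 0 this reduces to the usual BFGS update; the algorithms studied in the paper instead choose (τ, θ) so that the convergence conditions mentioned in the abstract are satisfied.
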
ISSN: 0022-3239; 1573-2878
DOI: 10.1023/A:1022608410710