A conjugate gradient sampling method for nonsmooth optimization

Bibliographic Details
Published in: 4OR 2020-03, Vol. 18 (1), p. 73-90
Main authors: Mahdavi-Amiri, N., Shaeiri, M.
Format: Article
Language: English
Online access: Full text
Description
Summary: We present an algorithm for minimizing locally Lipschitz functions that are continuously differentiable on an open dense subset of R^n. The function may be nonsmooth and/or nonconvex. The method combines a gradient sampling technique with a conjugate gradient scheme. To compute search directions, we use a sequence of positive definite approximate Hessians based on conjugate gradient matrices. The algorithm employs a restart procedure to improve upon poor search directions and to ensure that the approximate Hessians remain bounded. Global convergence of the algorithm is established. An implementation of the algorithm is run on a collection of well-known test problems. Comparative numerical results, based on the Dolan–Moré performance profiles, show that the algorithm outperforms some recent well-known nonsmooth optimization algorithms.
ISSN: 1619-4500, 1614-2411
DOI: 10.1007/s10288-019-00404-2