A Limited-Memory Quasi-Newton Algorithm for Bound-Constrained Nonsmooth Optimization
Main Authors:
Format: Article
Language: English
Subjects:
Abstract: We consider the problem of minimizing a continuous function that may be nonsmooth and nonconvex, subject to bound constraints. We propose an algorithm that uses the L-BFGS quasi-Newton approximation of the problem's curvature together with a variant of the weak Wolfe line search. The key ingredient of the method is an active-set selection strategy that defines the subspace in which search directions are computed. To overcome the inherent shortsightedness of the gradient for a nonsmooth function, we propose two strategies. The first relies on an approximation of the $\epsilon$-minimum norm subgradient, and the second uses an iterative corrective loop that augments the active set based on the resulting search directions. We describe a Python implementation of the proposed algorithm and present numerical results on a set of standard test problems to illustrate the efficacy of our approach.
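The abstract names three building blocks: an active-set guess derived from the bounds, L-BFGS curvature applied in the subspace of free variables, and a weak Wolfe line search suitable for nonsmooth functions. The sketch below is not the authors' implementation; it only illustrates generic NumPy versions of such pieces. The function names, tolerances, and the bisection/expansion scheme (of the kind used by Lewis and Overton for nonsmooth L-BFGS) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the paper's algorithm): an active-set guess from the
# bounds, an L-BFGS two-loop recursion, and a weak Wolfe line search.
import numpy as np


def guess_active_set(x, g, lb, ub, tol=1e-10):
    """Mark variables sitting (nearly) on a bound whose gradient component
    pushes them further into that bound; these would be held fixed."""
    at_lower = (x - lb <= tol) & (g > 0.0)
    at_upper = (ub - x <= tol) & (g < 0.0)
    return at_lower | at_upper


def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns d = -H g from the stored
    curvature pairs (s_k, y_k); falls back to -g with an empty memory."""
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / np.dot(y, s)
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    if s_list:
        s, y = s_list[-1], y_list[-1]
        q *= np.dot(s, y) / np.dot(y, y)  # initial scaling H0 = gamma * I
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / np.dot(y, s)
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q


def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.5, max_iter=50):
    """Bisection/expansion search for a step t satisfying the weak Wolfe
    conditions; it only needs function values and directional derivatives
    at trial points, which is why it is popular for nonsmooth objectives."""
    f0, g0d = f(x), np.dot(grad(x), d)
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * g0d:          # Armijo fails: shrink
            hi = t
        elif np.dot(grad(x + t * d), d) < c2 * g0d:    # curvature fails: grow
            lo = t
        else:
            return t
        t = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return t
```

In a full bound-constrained method one would additionally zero the direction components belonging to the guessed active set and project trial points back onto the box $[l, u]$; how the active set is refined (via the $\epsilon$-minimum norm subgradient approximation or the corrective loop) is the paper's contribution and is not reproduced here.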
DOI: 10.48550/arxiv.1612.07350