Analyzing the speed of convergence in nonsmooth optimization via the Goldstein subdifferential with application to descent methods
Format: Article
Language: English
Abstract: The Goldstein $\varepsilon$-subdifferential is a relaxed version of the Clarke subdifferential which has recently appeared in several algorithms for nonsmooth optimization. With it comes the notion of $(\varepsilon,\delta)$-critical points, which are points at which the minimum-norm element of the $\varepsilon$-subdifferential has norm at most $\delta$. To obtain points that are critical in the classical sense, $\varepsilon$ and $\delta$ must vanish. In this article, we analyze the speed at which the distance of $(\varepsilon,\delta)$-critical points to the minimum vanishes as $\varepsilon$ and $\delta$ tend to zero. Afterwards, we apply our results to gradient sampling methods and perform numerical experiments. Throughout the article, we place special emphasis on supporting the theoretical results with simple examples that visualize them.
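
For context, a sketch of the standard definitions behind these notions (the usual textbook form for a locally Lipschitz $f \colon \mathbb{R}^n \to \mathbb{R}$ with Clarke subdifferential $\partial f$, not quoted from the paper): the Goldstein $\varepsilon$-subdifferential at $x$ is the convex hull

$$\partial_\varepsilon f(x) = \operatorname{conv}\Bigl(\,\bigcup_{y \in \overline{B}_\varepsilon(x)} \partial f(y)\Bigr),$$

and $x$ is $(\varepsilon,\delta)$-critical if $\min\{\|\xi\| : \xi \in \partial_\varepsilon f(x)\} \leq \delta$. For $\varepsilon = \delta = 0$ this reduces to the classical Clarke criticality condition $0 \in \partial f(x)$. As a one-dimensional illustration of the rate question, take $f(x) = |x|$ with $\delta < 1$: any point with $|x| > \varepsilon$ has $\partial_\varepsilon f(x) = \{\operatorname{sign}(x)\}$ and so cannot be $(\varepsilon,\delta)$-critical, meaning every $(\varepsilon,\delta)$-critical point satisfies $|x| \leq \varepsilon$ and the distance to the minimizer vanishes linearly in $\varepsilon$.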
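The abstract refers to gradient sampling methods; as a rough illustration of the generic technique (not the authors' specific algorithm), here is a minimal sketch of one gradient sampling step in Python/NumPy. The sample size `m`, the Armijo constant `0.5`, and the use of SciPy's SLSQP solver for the minimum-norm subproblem are all assumptions of this sketch; `grad` is assumed to return a gradient of `f`, which exists almost everywhere for locally Lipschitz `f`.

```python
import numpy as np
from scipy.optimize import minimize


def min_norm_element(G):
    """Minimum-norm point in the convex hull of the rows of G, found by
    solving  min ||lam @ G||^2  s.t.  lam >= 0, sum(lam) = 1  with SLSQP."""
    m = G.shape[0]
    res = minimize(
        lambda lam: np.dot(lam @ G, lam @ G),
        np.full(m, 1.0 / m),
        method="SLSQP",
        bounds=[(0.0, 1.0)] * m,
        constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
    )
    return res.x @ G


def gradient_sampling_step(f, grad, x, eps, delta, m=30, rng=None):
    """One generic gradient sampling step: sample gradients on the closed
    eps-ball around x, use their minimum-norm convex combination as a
    surrogate for the minimum-norm element of the Goldstein
    eps-subdifferential, then either report (eps, delta)-criticality or
    take a backtracking descent step."""
    rng = np.random.default_rng(rng)
    n = x.size
    # Sample m points uniformly from the eps-ball around x (plus x itself).
    U = rng.standard_normal((m, n))
    U /= np.linalg.norm(U, axis=1, keepdims=True)
    radii = eps * rng.random(m) ** (1.0 / n)
    pts = np.vstack([x, x + radii[:, None] * U])
    g = min_norm_element(np.array([grad(p) for p in pts]))
    if np.linalg.norm(g) <= delta:
        return x, True  # x is (eps, delta)-critical (up to sampling error)
    d = -g / np.linalg.norm(g)  # normalized descent direction
    # Simple Armijo backtracking along d.
    t = eps
    while f(x + t * d) > f(x) - 0.5 * t * np.linalg.norm(g) and t > 1e-12:
        t *= 0.5
    return x + t * d, False
```

Iterating this step, and shrinking $\varepsilon$ and $\delta$ whenever a point is declared $(\varepsilon,\delta)$-critical, drives the iterates toward Clarke-critical points; the rates analyzed in the paper quantify how fast the distance to the minimum decays in terms of $\varepsilon$ and $\delta$.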
DOI: 10.48550/arxiv.2410.01382