On the Hardness of Meaningful Local Guarantees in Nonsmooth Nonconvex Optimization
Saved in:
Main authors: , ,
Format: Article
Language: eng
Subjects:
Online access: Order full text
Abstract: We study the oracle complexity of nonsmooth nonconvex optimization, with the algorithm assumed to have access only to local function information. It has been shown by Davis, Drusvyatskiy, and Jiang (2023) that for nonsmooth Lipschitz functions satisfying certain regularity and strictness conditions, perturbed gradient descent converges to local minimizers asymptotically. Motivated by this result and by other recent algorithmic advances in nonconvex nonsmooth optimization concerning Goldstein stationarity, we consider the question of obtaining a non-asymptotic rate of convergence to local minima for this problem class.

We provide the following negative answer to this question: local algorithms acting on regular Lipschitz functions cannot, in the worst case, provide meaningful local guarantees in terms of function value in sub-exponential time, even when all near-stationary points are global minima. This sharply contrasts with the smooth setting, in which it is well known that standard gradient methods can do so at a dimension-independent rate. Our result complements the rich body of work in the theoretical computer science literature that provides hardness results conditional on conjectures such as $\mathsf{P}\neq\mathsf{NP}$ or cryptographic assumptions, in that ours holds independently of any such assumptions.
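The perturbed gradient descent scheme referenced in the abstract can be illustrated with a minimal sketch. Note this is a generic version on a simple smooth nonconvex test function; the objective, step size, and noise level below are illustrative choices, not the scheme or problem class analyzed by Davis, Drusvyatskiy, and Jiang (2023).

```python
import numpy as np

def perturbed_gradient_descent(grad, x0, step=0.01, noise=1e-3, iters=5000, seed=0):
    # Each iteration follows the gradient plus a small Gaussian perturbation,
    # which lets the iterates escape unstable stationary points (e.g. saddles).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * (grad(x) + noise * rng.standard_normal(x.shape))
    return x

# Illustrative nonconvex objective: f(x) = (x1^2 - 1)^2 + x2^2,
# with global minima at (+1, 0) and (-1, 0) and a stationary point
# at x1 = 0 that plain gradient descent started there would never leave.
grad = lambda x: np.array([4.0 * x[0] * (x[0] ** 2 - 1.0), 2.0 * x[1]])

x = perturbed_gradient_descent(grad, [0.0, 0.5])
# The perturbation pushes the iterate off x1 = 0 toward one of the minimizers.
```

The noise makes the x1 = 0 axis an unstable set: any perturbation is amplified by the descent dynamics, so the iterate drifts to one of the two minimizers rather than stalling at a non-minimizing stationary point.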
DOI: 10.48550/arxiv.2409.10323