A hybrid proximal generalized conditional gradient method and application to total variation parameter learning
Main authors: | , , |
---|---|
Format: | Article |
Language: | English |
Keywords: | |
Online access: | Order full text |
Abstract: | In this paper we present a new method for solving optimization problems
involving the sum of two proper, convex, lower semicontinuous functions, one of
which has Lipschitz continuous gradient. The proposed method has a hybrid
nature that combines the usual forward-backward and the generalized conditional
gradient method. We establish a convergence rate of $o(k^{-1/3})$ under mild
assumptions with a specific step-size rule and show an application to a total
variation parameter learning problem, which demonstrates its benefits in the
context of nonsmooth convex optimization. |
---|---|
DOI: | 10.48550/arxiv.2211.00997 |
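The abstract names two building blocks that the hybrid method combines: the forward-backward (proximal gradient) iteration and the generalized conditional gradient step. The sketch below illustrates these two components on a toy lasso problem, min 0.5||Ax - b||^2 + lam||x||_1. It is a minimal illustration of the ingredients, not the authors' hybrid algorithm or step-size rule; the box radius `R` in the conditional gradient subproblem is an assumption added to keep the linear subproblem bounded.

```python
import numpy as np

def soft_threshold(v, t):
    # prox of t*||.||_1 (soft-thresholding), the "backward" step for g = lam*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, gamma, iters=500):
    # forward-backward iteration for min 0.5*||Ax - b||^2 + lam*||x||_1:
    # a gradient step on the smooth term, then a prox step on the nonsmooth term
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                            # forward (gradient) step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward (prox) step
    return x

def gcg_step(x, grad, lam, theta, R=10.0):
    # one generalized conditional gradient step: s minimizes <grad, s> + lam*||s||_1.
    # Over the whole space this subproblem is unbounded whenever |grad_i| > lam,
    # so we restrict it to the box [-R, R]^n (an assumption for illustration);
    # there the minimizer is -R*sign(grad_i) where |grad_i| > lam, and 0 otherwise.
    s = np.where(np.abs(grad) > lam, -R * np.sign(grad), 0.0)
    return x + theta * (s - x)                              # convex combination update
```

With `A = I` the lasso solution is simply `soft_threshold(b, lam)`, which gives a quick sanity check: `forward_backward(np.eye(3), np.array([3.0, 0.2, -2.0]), 1.0, 0.5)` converges to `[2, 0, -1]`.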