Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization

Bibliographic Details
Published in: Journal of Optimization Theory and Applications, 2009-03, Vol. 140 (3), p. 513-535
Main authors: Tseng, P., Yun, S.
Format: Article
Language: English
Online access: Full text
Description
Abstract: We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence to first-order stationarity for this method and, under a local error bound assumption, a linear rate of convergence. If f is convex with Lipschitz continuous gradient, then the method terminates in O(n^2/ε) iterations with an ε-optimal solution. If P is separable, then the Gauss-Southwell-q rule is implementable in O(n) operations when m = 1 and in O(n^2) operations when m > 1. In the special case of support vector machine training, for which f is convex quadratic, P is separable, and m = 1, this complexity bound is comparable to the best known bound for decomposition methods. If f is convex, then, by gradually reducing the weight on P to zero, the method can be adapted to solve the bilevel problem of minimizing P over the set of minima of f + δ_X, where X denotes the closure of the feasible set. This has application in the least 1-norm solution of maximum-likelihood estimation.
ISSN: 0022-3239, 1573-2878
DOI: 10.1007/s10957-008-9458-3
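
As a rough illustration of the kind of method the abstract describes, below is a minimal Python sketch (not the authors' algorithm) of block-coordinate descent for the m = 1 case, min f(x) + lam*||x||_1 subject to a^T x = b, with f a convex quadratic. The quadratic data Q and p, the constraint vector a, the weight lam, and the random two-coordinate block selection are illustrative assumptions; the paper's Gauss-Southwell-q block rule, quadratic model with line search, and convergence safeguards are all omitted.

import numpy as np
from scipy.optimize import minimize_scalar

def f(x, Q, p):
    # Smooth part: a convex quadratic (an illustrative choice, not from the paper).
    return 0.5 * x @ Q @ x - p @ x

def pairwise_bcgd(Q, p, a, b, lam=0.1, iters=500, seed=0):
    # Sketch: min f(x) + lam*||x||_1  s.t.  a^T x = b  (single equality constraint, m = 1).
    n = len(p)
    rng = np.random.default_rng(seed)
    obj = lambda z: f(z, Q, p) + lam * np.abs(z).sum()
    # Feasible starting point: satisfy the equality constraint with one coordinate.
    x = np.zeros(n)
    i0 = int(np.argmax(np.abs(a)))
    x[i0] = b / a[i0]
    for _ in range(iters):
        i, j = rng.choice(n, size=2, replace=False)   # random pair, NOT the Gauss-Southwell-q rule
        # Direction u with a^T u = 0, so every point x + t*u stays feasible.
        u = np.zeros(n)
        u[i], u[j] = a[j], -a[i]
        if not np.any(u):
            continue
        # One-dimensional convex subproblem in the step size t, solved numerically.
        t = minimize_scalar(lambda t: obj(x + t * u)).x
        x = x + t * u
    return x, obj(x)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 10
    M = rng.standard_normal((n, n))
    Q = M.T @ M + np.eye(n)          # positive definite, so f is strongly convex
    p = rng.standard_normal(n)
    a = rng.standard_normal(n)
    x, val = pairwise_bcgd(Q, p, a, b=1.0)
    print("constraint residual a^T x - b:", a @ x - 1.0)
    print("objective value:", val)

Two-coordinate blocks are used here because, with a single equality constraint, a pair of coordinates is the smallest block that still admits a nonzero feasible direction; this is also why the SVM decomposition methods mentioned in the abstract update variables in pairs.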