Block-Coordinate Gradient Descent Method for Linearly Constrained Nonsmooth Separable Optimization

Authors
P. Tseng
S. Yun
Affiliations
[1] University of Washington, Department of Mathematics
[2] National University of Singapore, Department of Mathematics
Keywords
Nonsmooth optimization; Linear constraints; Support vector machines; Bilevel optimization; ℓ1-regularization; Coordinate gradient descent; Global convergence; Linear convergence rate; Complexity bound
Abstract
We consider the problem of minimizing the weighted sum of a smooth function f and a convex function P of n real variables, subject to m linear equality constraints. We propose a block-coordinate gradient descent method for solving this problem, with the coordinate block chosen by a Gauss-Southwell-q rule based on sufficient predicted descent. We establish global convergence of this method to first-order stationarity and, under a local error bound assumption, a linear rate of convergence. If f is convex with Lipschitz continuous gradient, then the method terminates in O(n²/ε) iterations with an ε-optimal solution. If P is separable, then the Gauss-Southwell-q rule is implementable in O(n) operations when m = 1 and in O(n²) operations when m > 1. In the special case of support vector machine (SVM) training, for which f is convex quadratic, P is separable, and m = 1, this complexity bound is comparable to the best known bound for decomposition methods. If f is convex, then, by gradually reducing the weight on P to zero, the method can be adapted to solve the bilevel problem of minimizing P over the set of minima of f + δ_X, where X denotes the closure of the feasible set and δ_X its indicator function. This has application in finding the least ℓ1-norm solution of maximum-likelihood estimation.
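For concreteness, the LaTeX sketch below spells out the optimization model and the Gauss-Southwell-q working-set rule the abstract refers to. The symbols c, A, b, H, J, and ν are editorial notation chosen to match the authors' related coordinate gradient descent work, not quoted from this paper.

% Problem: weighted sum of smooth f and convex P under m linear equality constraints
\min_{x \in \mathbb{R}^n} \; F(x) := f(x) + c\,P(x)
\quad \text{s.t.} \quad Ax = b, \qquad A \in \mathbb{R}^{m \times n}, \; c > 0.

% Block direction subproblem at iterate x for coordinate block J, where H is a
% positive definite scaling matrix (e.g., a diagonal approximation of the Hessian);
% q_H(x; J) is the predicted descent achievable by moving only the coordinates in J
% while preserving feasibility (Ad = 0):
q_H(x;\,J) := \min_{d \in \mathbb{R}^n}
\Bigl\{ \nabla f(x)^{\top} d + \tfrac{1}{2}\, d^{\top} H d
        + c\,P(x+d) - c\,P(x) \;:\; Ad = 0,\ d_j = 0 \ \forall\, j \notin J \Bigr\}.

% Gauss-Southwell-q rule: accept any block J whose predicted descent is at least a
% fixed fraction ν ∈ (0, 1] of the best descent over all n coordinates:
q_H(x;\,J) \le \nu\, q_H\bigl(x;\,\{1,\dots,n\}\bigr).

When P is separable and m = 1, this subproblem decouples coordinate-wise up to the single linear constraint, which is what makes the rule implementable in O(n) operations, as stated in the abstract.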