A coordinate gradient descent method for nonsmooth separable minimization

Cited by: 21
Authors
Paul Tseng
Sangwoon Yun
Affiliations
[1] Department of Mathematics, University of Washington
Source
Mathematical Programming | 2009, Vol. 117
Keywords
Error bound; Global convergence; Linear convergence rate; Nonsmooth optimization; Coordinate descent
MSC codes: 49M27; 49M37; 65K05; 90C06; 90C25; 90C26; 90C30; 90C55
Abstract
We consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as special cases bound-constrained optimization and smooth optimization with ℓ1-regularization. We propose a (block) coordinate gradient descent method for solving this class of nonsmooth separable problems. We establish global convergence and, under a local Lipschitzian error bound assumption, linear convergence for this method. The local Lipschitzian error bound holds under assumptions analogous to those for constrained smooth optimization, e.g., when the convex function is polyhedral and the smooth function is (nonconvex) quadratic or is the composition of a strongly convex function with a linear mapping. We report numerical experience with solving ℓ1-regularized versions of unconstrained optimization problems from Moré et al. (ACM Trans. Math. Softw. 7, 17–41, 1981) and from the CUTEr set (Gould and Orban, ACM Trans. Math. Softw. 29, 373–394, 2003). We also report a comparison with L-BFGS-B and MINOS applied to a reformulation of the ℓ1-regularized problem as a bound-constrained optimization problem.
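To make the problem class concrete, here is a minimal Python sketch of a cyclic coordinate gradient descent iteration for the ℓ1-regularized special case, min_x f(x) + λ‖x‖₁. This is an illustrative simplification, not the paper's exact method: the paper's CGD step minimizes a quadratic model defined by a matrix H over a chosen coordinate block and uses an Armijo stepsize rule, whereas the sketch below assumes a fixed stepsize 1/L with L a Lipschitz constant of ∇f; all function names and parameter choices are assumptions for illustration.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*|.| (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_l1(grad_f, x0, lam, L, n_iters=200):
    """Cyclic coordinate gradient descent sketch for
    min_x f(x) + lam*||x||_1. Uses a fixed stepsize 1/L
    (L: Lipschitz constant of grad f) in place of the paper's
    quadratic model with matrix H and Armijo line search."""
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        for j in range(x.size):          # cyclic (Gauss-Seidel) coordinate rule
            g_j = grad_f(x)[j]           # j-th partial derivative of f at x
            # gradient step on coordinate j, then prox of lam*|.|
            x[j] = soft_threshold(x[j] - g_j / L, lam / L)
    return x

# Usage: ell1-regularized least squares, f(x) = 0.5*||Ax - b||^2
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
grad_f = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A, 2) ** 2            # ||A||_2^2 bounds the Lipschitz constant of grad f
x_hat = cgd_l1(grad_f, np.zeros(5), lam=0.1, L=L)
```

For least squares, the spectral norm bound ‖A‖₂² gives a valid uniform stepsize; the paper's per-block quadratic model and line search would typically allow larger, adaptive steps.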
Pages: 387–423
Page count: 36