Linear Convergence of Prox-SVRG Method for Separable Non-smooth Convex Optimization Problems under Bounded Metric Subregularity

Cited by: 1
Authors
Zhang, Jin [1 ]
Zhu, Xide [2 ]
Affiliations
[1] Southern Univ Sci & Technol, Natl Ctr Appl Math Shenzhen, Dept Math, Shenzhen, Peoples R China
[2] Shanghai Univ, Sch Management, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Linear convergence; Bounded metric subregularity; Calmness; Proximal stochastic variance-reduced gradient; Randomized block-coordinate proximal gradient; Coordinate descent methods; Regression; Regularity; Selection; Parallel
DOI
10.1007/s10957-021-01978-w
CLC Classification Codes
C93 [Management Science]; O22 [Operations Research]
Subject Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
With the help of bounded metric subregularity, a condition weaker than strong convexity, we show the linear convergence of the proximal stochastic variance-reduced gradient (Prox-SVRG) method for solving a class of separable non-smooth convex optimization problems in which the smooth term is a composition of a strongly convex function and a linear function. We introduce an equivalent characterization of bounded metric subregularity in terms of the calmness of a perturbed linear system. This characterization allows us to provide a verifiable sufficient condition ensuring the linear convergence of the Prox-SVRG and randomized block-coordinate proximal gradient methods. Furthermore, we verify that this sufficient condition holds automatically when the non-smooth term is the generalized sparse group Lasso regularizer.
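As a rough illustration of the problem class and method the abstract refers to, the following is a minimal NumPy sketch of the Prox-SVRG iteration on a least-squares instance: the smooth term (1/2n)||Ax - b||^2 is a composition of a strongly convex function with a linear map, and an l1 penalty stands in for the separable non-smooth term (a special case of the generalized sparse group Lasso regularizer). All names and parameter choices here (prox_svrg, eta, the inner-loop length m) are illustrative assumptions, not the paper's specific scheme or settings.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (the non-smooth term's prox map)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam, eta=0.01, n_epochs=30, m=None, seed=0):
    """Illustrative Prox-SVRG sketch for min_x (1/2n)||Ax - b||^2 + lam*||x||_1.

    The smooth term is the strongly convex function z -> (1/2n)||z - b||^2
    composed with the linear map x -> Ax, matching the problem class in the
    abstract; eta, m, and the averaging rule are assumed, not from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = m or 2 * n                       # inner-loop length (illustrative)
    x_tilde = np.zeros(d)                # snapshot point
    for _ in range(n_epochs):
        mu = A.T @ (A @ x_tilde - b) / n          # full gradient at snapshot
        x = x_tilde.copy()
        running_sum = np.zeros(d)
        for _ in range(m):
            i = rng.integers(n)
            a_i = A[i]
            # variance-reduced gradient: grad f_i(x) - grad f_i(x_tilde) + mu
            v = (a_i @ x - a_i @ x_tilde) * a_i + mu
            x = soft_threshold(x - eta * v, eta * lam)   # proximal step
            running_sum += x
        x_tilde = running_sum / m        # new snapshot: average inner iterate
    return x_tilde
```

Resetting the snapshot to the average of the inner iterates is one standard Prox-SVRG option; taking the last inner iterate is a common variant. Under the bounded metric subregularity condition studied in the paper, iterates of this kind converge linearly in expectation without requiring strong convexity of the overall objective.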
Pages: 564-597 (34 pages)