Inexact Variable Metric Stochastic Block-Coordinate Descent for Regularized Optimization

Cited by: 1
Authors:
Lee, Ching-pei [1 ,2 ]
Wright, Stephen J. [3 ,4 ]
Affiliations:
[1] Natl Univ Singapore, Dept Math, Singapore, Singapore
[2] Natl Univ Singapore, Inst Math Sci, Singapore, Singapore
[3] Univ Wisconsin, Comp Sci Dept, Madison, WI USA
[4] Univ Wisconsin, Wisconsin Inst Discovery, Madison, WI USA
Funding:
U.S. National Science Foundation
Keywords:
Variable metric; Stochastic coordinate descent; Regularized optimization; Inexact method; Arbitrary sampling; REGRESSION; COMPLEXITY; SHRINKAGE; SELECTION
DOI:
10.1007/s10957-020-01639-4
CLC classification:
C93 [Management]; O22 [Operations Research]
Subject classification codes:
070105; 12; 1201; 1202; 120202
Abstract:
Block-coordinate descent is a popular framework for large-scale regularized optimization problems with block-separable structure. Existing methods have several limitations. They often assume that subproblems can be solved exactly at each iteration, which in practical terms usually restricts the quadratic term in the subproblem to be diagonal, thus losing most of the benefits of higher-order derivative information. Moreover, in contrast to the smooth case, non-uniform sampling of the blocks has not yet been shown to improve the convergence rate bounds for regularized problems. This work proposes an inexact randomized block-coordinate descent method based on a regularized quadratic subproblem, in which the quadratic term can vary from iteration to iteration: a "variable metric." We provide a detailed convergence analysis for both convex and non-convex problems. Our analysis generalizes, to the regularized case, Nesterov's proposal for improving convergence of block-coordinate descent by sampling proportional to the blockwise Lipschitz constants. We improve the convergence rate in the convex case by weakening the dependency on the initial objective value. Empirical results also show that significant benefits accrue from the use of a variable metric.
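The abstract describes, at a high level, the following per-iteration scheme: sample a block i (possibly non-uniformly, e.g. with probability proportional to the blockwise Lipschitz constant L_i), build a regularized quadratic subproblem min_d grad_i f(x)^T d + (1/2) d^T H_i d + psi_i(x_i + d) whose metric H_i may change from iteration to iteration, solve it inexactly, and step x_i <- x_i + d. The Python sketch below is a minimal illustration of this scheme on a LASSO instance (f(x) = 0.5*||Ax - b||^2, psi = lam*||x||_1, which is separable over any partition of the coordinates), not the authors' algorithm: the damped Hessian block used as the metric, the fixed number of inner proximal-gradient steps standing in for the paper's inexactness criterion, and all names (inexact_vm_bcd, soft_threshold) and parameter values are assumptions made purely for illustration.

import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def inexact_vm_bcd(A, b, lam, blocks, n_iters=500, inner_iters=5, seed=0):
    # Hypothetical sketch of inexact variable-metric stochastic BCD for
    #   min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # Each outer iteration samples one block (probability proportional to its
    # blockwise Lipschitz constant), uses the damped Hessian block
    # H_i = A_i^T A_i + eps*I as the variable metric, and solves the
    # regularized subproblem inexactly with a few proximal-gradient steps.
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    # Blockwise Lipschitz constants: largest eigenvalue of each Hessian block.
    L = np.array([np.linalg.eigvalsh(A[:, blk].T @ A[:, blk])[-1] for blk in blocks])
    probs = L / L.sum()              # non-uniform, Lipschitz-proportional sampling
    r = A @ x - b                    # maintained residual, r = Ax - b
    for _ in range(n_iters):
        i = rng.choice(len(blocks), p=probs)
        blk = blocks[i]
        Ai = A[:, blk]
        g = Ai.T @ r                 # blockwise gradient of f
        H = Ai.T @ Ai + 1e-8 * np.eye(len(blk))   # variable metric (damped Hessian block)
        # Inexact subproblem solve: a few prox-gradient steps on
        #   min_d g^T d + 0.5*d^T H d + lam*||x_blk + d||_1.
        step = 1.0 / np.linalg.eigvalsh(H)[-1]
        d = np.zeros(len(blk))
        for _ in range(inner_iters):
            z = d - step * (g + H @ d)
            d = soft_threshold(x[blk] + z, step * lam) - x[blk]
        x[blk] += d
        r += Ai @ d                  # update the residual incrementally
    return x

# Usage on a small synthetic problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 40))
b = rng.standard_normal(100)
blocks = [np.arange(j, j + 8) for j in range(0, 40, 8)]  # five blocks of 8 coordinates
x = inexact_vm_bcd(A, b, lam=0.1, blocks=blocks)
print("objective:", 0.5 * np.linalg.norm(A @ x - b)**2 + 0.1 * np.abs(x).sum())

Sampling proportionally to L_i mirrors the Lipschitz-proportional sampling the abstract attributes to Nesterov's proposal; the inner loop makes the solve inexact in the most elementary way, whereas the paper analyzes a principled inexactness condition.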
Journal: Journal of Optimization Theory and Applications, 2020, Vol. 185
Pages: 151-187 (37 pages)