A coordinate gradient descent method for ℓ1-regularized convex minimization

Cited by: 68
Authors
Yun, Sangwoon [1 ]
Toh, Kim-Chuan [1 ,2 ]
Affiliations
[1] Singapore MIT Alliance, Singapore 117576, Singapore
[2] Natl Univ Singapore, Dept Math, Singapore 117543, Singapore
Keywords
Coordinate gradient descent; Q-linear convergence; ℓ1-regularization; Compressed sensing; Image deconvolution; Linear least squares; Logistic regression; Convex optimization; Algorithm; Reconstruction; Regression; Robust
DOI
10.1007/s10589-009-9251-8
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
In applications such as signal processing and statistics, many problems involve finding sparse solutions to under-determined linear systems of equations. These problems can be formulated as structured nonsmooth optimization problems, i.e., minimizing an ℓ1-regularized linear least squares objective. In this paper, we propose a block coordinate gradient descent method (abbreviated as CGD) to solve the more general ℓ1-regularized convex minimization problem, i.e., minimizing an ℓ1-regularized convex smooth function. We establish a Q-linear convergence rate for our method when the coordinate block is chosen by a Gauss-Southwell-type rule to ensure sufficient descent. We propose efficient implementations of the CGD method and report numerical results for solving large-scale ℓ1-regularized linear least squares problems arising in compressed sensing and image deconvolution, as well as large-scale ℓ1-regularized logistic regression problems for feature selection in data classification. Comparison with several state-of-the-art algorithms specifically designed for solving large-scale ℓ1-regularized linear least squares or logistic regression problems suggests that an efficiently implemented CGD method may outperform these algorithms, despite the fact that the CGD method is not specifically designed to solve these special classes of problems.
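The abstract describes coordinate-wise updates for the ℓ1-regularized least squares problem min 0.5‖Ax−b‖² + λ‖x‖₁, with the working coordinate chosen by a Gauss-Southwell-type rule. A minimal single-coordinate sketch of that idea (not the paper's block implementation; the function name `cgd_lasso` and all parameter defaults are illustrative assumptions) could look like:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * |.| applied elementwise.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def cgd_lasso(A, b, lam, max_iter=1000, tol=1e-8):
    """Sketch of coordinate descent for min 0.5*||Ax - b||^2 + lam*||x||_1.

    At each step the coordinate with the largest candidate update is
    selected (a Gauss-Southwell-type choice, in the spirit of the paper's
    rule for sufficient descent). Assumes every column of A is nonzero.
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = np.sum(A**2, axis=0)       # ||A_j||^2 for each column j
    r = b - A @ x                        # residual b - Ax
    for _ in range(max_iter):
        # Exact minimizer of the objective in each coordinate, others fixed.
        rho = A.T @ r + col_sq * x
        x_new = soft_threshold(rho, lam) / col_sq
        j = int(np.argmax(np.abs(x_new - x)))   # Gauss-Southwell-type pick
        if abs(x_new[j] - x[j]) < tol:
            break
        r -= A[:, j] * (x_new[j] - x[j])        # incremental residual update
        x[j] = x_new[j]
    return x
```

With A equal to the identity the minimizer is simply the soft-thresholding of b, which gives a quick sanity check: `cgd_lasso(np.eye(3), np.array([3.0, 0.5, -2.0]), 1.0)` returns approximately `[2, 0, -1]`.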
Pages: 273-307 (35 pages)
Related papers (50 total)
  • [41] An optimal gradient method for smooth strongly convex minimization. Taylor, Adrien; Drori, Yoel. Mathematical Programming, 2023, 199(1-2): 557-594
  • [42] A second-order gradient method for convex minimization. Oviedo, Harry. Boletín de la Sociedad Matemática Mexicana, 2021, 27(03)
  • [43] Convergence of the gradient projection method for generalized convex minimization. Wang, C.Y.; Xiu, N.H. Computational Optimization and Applications, 2000, 16(02): 111-120
  • [44] Generalizing the optimized gradient method for smooth convex minimization. Kim, Donghwan; Fessler, Jeffrey A. SIAM Journal on Optimization, 2018, 28(02): 1920-1950
  • [46] Distributed Coordinate Descent for L1-regularized Logistic Regression. Trofimov, Ilya; Genkin, Alexander. Analysis of Images, Social Networks and Texts, AIST 2015, 2015, 542: 243-254
  • [48] A cyclic block coordinate descent method with generalized gradient projections. Bonettini, Silvia; Prato, Marco; Rebegoldi, Simone. Applied Mathematics and Computation, 2016, 286: 288-300
  • [49] Adaptive Stochastic Gradient Descent Method for Convex and Non-Convex Optimization. Chen, Ruijuan; Tang, Xiaoquan; Li, Xiuting. Fractal and Fractional, 2022, 6(12)
  • [50] On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization. Li, Xingguo; Zhao, Tuo; Arora, Raman; Liu, Han; Hong, Mingyi. Journal of Machine Learning Research, 2018, 18