Majorization minimization by coordinate descent for concave penalized generalized linear models

Cited: 13
Authors
Jiang, Dingfeng [1 ]
Huang, Jian [2 ,3 ]
Affiliations
[1] AbbVie Inc, Exploratory Stat Data & Stat Sci, N Chicago, IL 60064 USA
[2] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA 52242 USA
[3] Univ Iowa, Dept Biostat, Iowa City, IA USA
Keywords
Logistic regression; p >> n models; Smoothly clipped absolute deviation penalty; Minimax concave penalty; Variable selection; Algorithms; Likelihood; Regression
DOI
10.1007/s11222-013-9407-3
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline code
081202
Abstract
Recent studies have demonstrated the theoretical attractiveness of a class of concave penalties for variable selection, including the smoothly clipped absolute deviation (SCAD) and minimax concave (MCP) penalties. Computing the concave penalized solutions in high-dimensional models, however, is a difficult task. We propose a majorization minimization by coordinate descent (MMCD) algorithm for computing concave penalized solutions in generalized linear models. In contrast to existing algorithms that apply a local quadratic or local linear approximation to the penalty function, the MMCD algorithm majorizes the negative log-likelihood by a quadratic loss but uses no approximation to the penalty. This strategy avoids computing a scaling factor at each update of the solutions, which improves the efficiency of coordinate descent. Under certain regularity conditions, we establish the convergence of the MMCD algorithm. We implement the algorithm for penalized logistic regression with the SCAD and MCP penalties. Simulation studies and a data example demonstrate that the MMCD is sufficiently fast for penalized logistic regression in high-dimensional settings where the number of covariates greatly exceeds the sample size.
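The idea sketched in the abstract can be illustrated in Python. The code below is an illustrative reconstruction under stated assumptions, not the authors' implementation: for logistic regression, p(1 - p) <= 1/4 bounds the Hessian, so with standardized covariates (x_j'x_j / n = 1) a fixed curvature v = 1/4 majorizes the average negative log-likelihood, and each coordinate update solves the majorized problem against the exact MCP penalty in closed form (no scaling factor recomputed per update). A strictly convex coordinate step requires gamma * v > 1, i.e. gamma > 4 here, so gamma = 5 is used as a default; the function names are my own.

```python
import numpy as np

def mcp_threshold(z, v, lam, gamma):
    """Closed-form minimizer of (v/2)*(b - z)**2 + MCP(b; lam, gamma).
    Valid when gamma * v > 1, so the coordinate problem is strictly convex."""
    if abs(z) > gamma * lam:
        return z  # MCP is flat beyond gamma*lam: pure quadratic minimum
    return np.sign(z) * max(v * abs(z) - lam, 0.0) / (v - 1.0 / gamma)

def mmcd_logistic_mcp(X, y, lam, gamma=5.0, max_iter=500, tol=1e-7):
    """Sketch of an MMCD-style solver for MCP-penalized logistic regression.
    Assumes columns of X are standardized so that x_j . x_j / n == 1; then
    p*(1 - p) <= 1/4 lets the fixed curvature v = 1/4 majorize the average
    negative log-likelihood, so no per-coordinate scaling factor is needed."""
    n, p = X.shape
    v = 0.25                      # uniform majorizer curvature
    b0, beta = 0.0, np.zeros(p)
    eta = np.zeros(n)             # linear predictor b0 + X @ beta
    for _ in range(max_iter):
        beta_old, b0_old = beta.copy(), b0
        # intercept: unpenalized update under the same majorizer
        prob = 1.0 / (1.0 + np.exp(-eta))
        step = np.mean(y - prob) / v
        b0 += step
        eta += step
        for j in range(p):
            prob = 1.0 / (1.0 + np.exp(-eta))
            z = beta[j] + X[:, j] @ (y - prob) / (n * v)
            bj = mcp_threshold(z, v, lam, gamma)
            if bj != beta[j]:
                eta += X[:, j] * (bj - beta[j])
                beta[j] = bj
        if max(abs(b0 - b0_old), np.max(np.abs(beta - beta_old))) < tol:
            break
    return b0, beta
```

Because the penalty itself is never approximated, a heavy penalty drives coefficients exactly to zero through the threshold rule, while the fixed curvature keeps each coordinate pass cheap even when p >> n.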
Pages: 871-883 (13 pages)