Majorization minimization by coordinate descent for concave penalized generalized linear models

Cited: 14
Authors
Jiang, Dingfeng [1 ]
Huang, Jian [2 ,3 ]
Affiliations
[1] AbbVie Inc, Exploratory Stat Data & Stat Sci, N Chicago, IL 60064 USA
[2] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA 52242 USA
[3] Univ Iowa, Dept Biostat, Iowa City, IA USA
Keywords
Logistic regression; p >> n models; Smoothly clipped absolute deviation penalty; Minimax concave penalty; Variable selection; Algorithms; Likelihood; Regression
DOI
10.1007/s11222-013-9407-3
CLC number
TP301 (Theory, methods)
Subject classification number
081202
Abstract
Recent studies have demonstrated the theoretical attractiveness of a class of concave penalties for variable selection, including the smoothly clipped absolute deviation (SCAD) and minimax concave (MCP) penalties. Computing the concave penalized solutions in high-dimensional models, however, is a difficult task. We propose a majorization minimization by coordinate descent (MMCD) algorithm for computing concave penalized solutions in generalized linear models. In contrast to existing algorithms that use a local quadratic or local linear approximation to the penalty function, the MMCD majorizes the negative log-likelihood by a quadratic loss but applies no approximation to the penalty. This strategy avoids computing a scaling factor in each update of the solutions, which improves the efficiency of coordinate descent. Under certain regularity conditions, we establish the theoretical convergence of the MMCD. We implement the algorithm for penalized logistic regression with the SCAD and MCP penalties. Simulation studies and a data example demonstrate that the MMCD is sufficiently fast for penalized logistic regression in high-dimensional settings where the number of covariates is much larger than the sample size.
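The idea in the abstract can be sketched for logistic regression with the MCP penalty: because p(1-p) <= 1/4, the negative log-likelihood is majorized coordinate-wise by a quadratic with fixed curvature v = 1/4 (for standardized covariates), and each coordinate update then solves the resulting univariate quadratic-plus-MCP problem exactly, with no approximation of the penalty. The sketch below is illustrative, not the authors' implementation; the function names, the choice gamma = 8, and the convergence tolerances are assumptions.

```python
import numpy as np

def mcp_threshold(z, v, lam, gamma):
    """Exact minimizer of (v/2)*(b - z)**2 + MCP(b; lam, gamma), where
    MCP(b) = lam*|b| - b**2/(2*gamma) for |b| <= gamma*lam, else gamma*lam**2/2.
    Requires gamma > 1/v so the coordinate objective is convex."""
    if abs(z) > gamma * lam:
        return z  # penalty is flat here; the quadratic alone decides
    s = np.sign(z) * max(abs(v * z) - lam, 0.0)  # soft-threshold v*z at lam
    return s / (v - 1.0 / gamma)

def mmcd_logistic_mcp(X, y, lam, gamma=8.0, max_iter=500, tol=1e-6):
    """MMCD sketch for MCP-penalized logistic regression.
    Assumes columns of X are standardized (mean 0, x_j'x_j / n = 1), so the
    logistic Hessian diagonal is majorized by the constant v = 1/4."""
    n, p = X.shape
    v = 0.25                       # uniform bound on p_i * (1 - p_i)
    assert gamma > 1.0 / v, "need gamma > 4 when v = 1/4"
    beta0, beta = 0.0, np.zeros(p)
    eta = np.zeros(n)              # linear predictor, updated incrementally
    for _ in range(max_iter):
        beta_old, beta0_old = beta.copy(), beta0
        # unpenalized intercept: one bounded-curvature (majorized) step
        pr = 1.0 / (1.0 + np.exp(-eta))
        d0 = np.mean(y - pr) / v
        beta0 += d0
        eta += d0
        # cycle through coordinates; penalty handled exactly, no LQA/LLA
        for j in range(p):
            pr = 1.0 / (1.0 + np.exp(-eta))
            z = beta[j] + (X[:, j] @ (y - pr)) / (n * v)
            b_new = mcp_threshold(z, v, lam, gamma)
            if b_new != beta[j]:
                eta += (b_new - beta[j]) * X[:, j]
                beta[j] = b_new
        if max(abs(beta0 - beta0_old), np.max(np.abs(beta - beta_old))) < tol:
            break
    return beta0, beta
```

Because the curvature v is a fixed constant rather than a data-dependent Hessian term, no per-coordinate scaling factor needs to be recomputed inside the loop, which is the efficiency gain the abstract refers to.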
Pages: 871-883 (13 pages)
Related papers (50 in total)
  • [1] Jiang, Dingfeng; Huang, Jian. Majorization minimization by coordinate descent for concave penalized generalized linear models. Statistics and Computing, 2014, 24: 871-883
  • [2] Wang, Yanxin; Zhu, Li. Coordinate majorization descent algorithm for nonconvex penalized regression. Journal of Statistical Computation and Simulation, 2021, 91(13): 2684-2698
  • [3] Yang, Yi; Zou, Hui. A coordinate majorization descent algorithm for l1 penalized learning. Journal of Statistical Computation and Simulation, 2014, 84(1): 84-95
  • [4] Trofimov, I.; Genkin, A. Distributed coordinate descent for generalized linear models with regularization. Pattern Recognition and Image Analysis, 2017, 27(2): 349-364
  • [5] Friedman, Jerome; Hastie, Trevor; Tibshirani, Rob. Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software, 2010, 33(1): 1-22
  • [6] Abatzoglou, T.; O'Donnell, B. Minimization by coordinate descent. Journal of Optimization Theory and Applications, 1982, 36(2): 163-174
  • [7] Naderi, Sobhan; He, Kun; Aghajani, Reza; Sclaroff, Stan; Felzenszwalb, Pedro. Generalized majorization-minimization. International Conference on Machine Learning, 2019, Vol. 97
  • [8] Wu, Tong Tong; Lange, Kenneth. Coordinate descent algorithms for lasso penalized regression. Annals of Applied Statistics, 2008, 2(1): 224-244
  • [9] Schifano, Elizabeth D.; Strawderman, Robert L.; Wells, Martin T. Majorization-minimization algorithms for nonsmoothly penalized objective functions. Electronic Journal of Statistics, 2010, 4: 1258-1299
  • [10] Zhang, Chunming; Zhang, Zhengjun; Chai, Yi. Penalized Bregman divergence estimation via coordinate descent. JIRSS - Journal of the Iranian Statistical Society, 2011, 10(2): 125-140