Robust Coordinate Descent Algorithm Robust Solution Path for High-dimensional Sparse Regression Modeling

Cited by: 4
Authors
Park, H. [1 ]
Konishi, S. [2 ]
Affiliations
[1] Univ Tokyo, Inst Med Sci, Ctr Human Genome, Minato Ku, Tokyo 1128551, Japan
[2] Chuo Univ, Dept Math, Fac Sci & Engn, Bunkyo Ku, Tokyo, Japan
Keywords
Coordinate descent algorithm; Dimension reduction; High-dimensional data; L-1-type regularization; Robust regression modeling; LEAST ANGLE REGRESSION; LARGE DATA SETS; PENALIZED REGRESSION; VARIABLE SELECTION; REGULARIZATION; LASSO;
DOI
10.1080/03610918.2013.854910
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
L1-type regularization provides a useful tool for variable selection in high-dimensional regression modeling, and various algorithms have been proposed to solve the corresponding optimization problems. In particular, the coordinate descent algorithm has been shown to be effective for sparse regression modeling. Although the algorithm performs remarkably well in solving optimization problems with L1-type regularization, it is sensitive to outliers, since each update is based on the inner product of a predictor variable with partial residuals obtained in a non-robust manner. To overcome this drawback, we propose a robust coordinate descent algorithm, focusing in particular on high-dimensional regression modeling based on the principal component space. We show that the proposed robust algorithm converges to the minimum value of its objective function. Monte Carlo experiments and a real data analysis are conducted to examine the efficiency of the proposed robust algorithm. We observe that our robust coordinate descent algorithm performs effectively for high-dimensional regression modeling even in the presence of outliers.
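The abstract's central point, that the plain coordinate descent update is driven by inner products between predictor columns and partial residuals and is therefore distorted by outliers, can be illustrated with a short sketch. The Python code below is an illustrative sketch only, not the authors' algorithm (which additionally works in a principal component space): `lasso_coordinate_descent` implements the standard soft-thresholding update, and `robust_lasso_coordinate_descent` shows one common way to downweight outlying residuals via Huber-type weights. All function names and the specific weighting scheme are assumptions made for illustration.

```python
import numpy as np


def soft_threshold(z, gamma):
    """Soft-thresholding operator used in L1-type coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)


def lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Plain (non-robust) coordinate descent for the lasso.

    Each coefficient update uses the inner product of the j-th predictor
    column with the current partial residuals, so a few outlying
    residuals can distort every update. Assumes centered columns of X.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residuals with the j-th predictor's contribution removed.
            r_j = y - X @ beta + X[:, j] * beta[j]
            z_j = X[:, j] @ r_j / n
            beta[j] = soft_threshold(z_j, lam) / (X[:, j] @ X[:, j] / n)
    return beta


def huber_weights(r, c=1.345):
    """Huber-type weights: residuals beyond c robust-scale units are downweighted."""
    scale = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12  # MAD scale estimate
    u = np.abs(r) / scale
    return np.minimum(1.0, c / np.maximum(u, 1e-12))


def robust_lasso_coordinate_descent(X, y, lam, n_iter=100):
    """Hypothetical robust variant: weighted inner products with partial residuals."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        # Downweight observations whose current residuals are outlying.
        w = huber_weights(y - X @ beta)
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]
            z_j = (w * X[:, j]) @ r_j / w.sum()
            denom = (w * X[:, j]) @ X[:, j] / w.sum()
            beta[j] = soft_threshold(z_j, lam) / denom
    return beta
```

The intent of the weighted update is that a handful of corrupted responses receive small weights and thus contribute little to the inner products, whereas in the plain update they enter with full weight.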
Pages: 115-129 (15 pages)
Related Papers (50 records in total)
  • [21] High-dimensional robust regression with Lq-loss functions
    Wang, Yibo
    Karunamuni, Rohana J.
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2022, 176
  • [23] Sparse High-Dimensional Isotonic Regression
    Gamarnik, David
    Gaudio, Julia
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [24] Robust and sparse k-means clustering for high-dimensional data
    Brodinova, Sarka
    Filzmoser, Peter
    Ortner, Thomas
    Breiteneder, Christian
    Rohm, Maia
    [J]. ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2019, 13 (04) : 905 - 932
  • [25] Robust and sparse learning of varying coefficient models with high-dimensional features
    Xiong, Wei
    Tian, Maozai
    Tang, Manlai
    Pan, Han
    [J]. JOURNAL OF APPLIED STATISTICS, 2023, 50 (16) : 3312 - 3336
  • [26] Robust high-dimensional screening
    Kim, Aleksandra
    Mutel, Christopher
    Froemelt, Andreas
    [J]. ENVIRONMENTAL MODELLING & SOFTWARE, 2022, 148
  • [27] Robust sparse precision matrix estimation for high-dimensional compositional data
    Liang, Wanfeng
    Wu, Yue
    Ma, Xiaoyan
    [J]. STATISTICS & PROBABILITY LETTERS, 2022, 184
  • [29] A stochastic variance-reduced coordinate descent algorithm for learning sparse Bayesian network from discrete high-dimensional data
    Shajoonnezhad, Nazanin
    Nikanjam, Amin
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (03) : 947 - 958
  • [30] Path Thresholding: Asymptotically Tuning-Free High-Dimensional Sparse Regression
    Vats, Divyanshu
    Baraniuk, Richard G.
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33, 2014, 33 : 948 - 957