Independently Interpretable Lasso for Generalized Linear Models

Cited by: 1
Authors
Takada, Masaaki [1 ]
Suzuki, Taiji [2 ,3 ,4 ]
Fujisawa, Hironori [1 ,4 ,5 ]
Affiliations
[1] SOKENDAI, Grad Univ Adv Studies, Tokyo 1908562, Japan
[2] Univ Tokyo, Tokyo 1050033, Japan
[3] Japan Sci & Technol Agcy, PRESTO, Kawaguchi, Saitama 3320012, Japan
[4] RIKEN, Ctr Adv Integrated Intelligence Res, Tokyo 1030027, Japan
[5] Inst Stat Math, Tokyo 1908562, Japan
Keywords
VARIABLE SELECTION; BREAST-CANCER; REGRESSION; REGULARIZATION; PREDICTION; SPARSITY; RECOVERY; TUMOR
DOI
10.1162/neco_a_01279
CLC classification
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Sparse regularization such as ℓ1 regularization is a powerful and widely used strategy for high-dimensional learning problems, and its effectiveness has been supported both practically and theoretically by several studies. However, one of the biggest issues in sparse regularization is that its performance is quite sensitive to correlations between features. Under weak regularization, ordinary ℓ1 regularization selects variables that are correlated with one another, which degrades not only the estimation error but also interpretability. In this letter, we propose a new regularization method, the independently interpretable lasso (IILasso), for generalized linear models. The proposed regularizer suppresses the selection of correlated variables, so that each active variable affects the response independently in the model. Hence, regression coefficients can be interpreted intuitively, and performance is also improved by avoiding overfitting. We analyze the theoretical properties of the IILasso and show that the proposed method is advantageous for sign recovery and achieves an almost minimax optimal convergence rate. Synthetic and real data analyses also indicate the effectiveness of the IILasso.
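To make the mechanism concrete, below is a minimal coordinate-descent sketch of an IILasso-style estimator for the squared-loss (linear model) special case. It assumes the penalty takes the commonly cited form lam * (||b||_1 + (alpha/2) * |b|^T R |b|) with R[j, k] = |x_j^T x_k| / n for j != k and zero on the diagonal; this choice of R, the solver, and all names below are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def soft_threshold(z, t):
    # Soft-thresholding operator: S(z, t) = sign(z) * max(|z| - t, 0).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def iilasso_cd(X, y, lam, alpha, n_iter=200, tol=1e-8):
    # Coordinate descent for the assumed objective
    #   (1/2n) ||y - X b||^2 + lam * (||b||_1 + (alpha/2) |b|^T R |b|),
    # with R[j, k] = |x_j^T x_k| / n for j != k and R[j, j] = 0.
    # Columns of X are assumed standardized: mean 0 and x_j^T x_j / n = 1.
    n, p = X.shape
    R = np.abs(X.T @ X) / n
    np.fill_diagonal(R, 0.0)
    b = np.zeros(p)
    resid = y.astype(float).copy()      # residual y - X b (b = 0 initially)
    for _ in range(n_iter):
        b_prev = b.copy()
        for j in range(p):
            resid += X[:, j] * b[j]     # partial residual excluding feature j
            z = X[:, j] @ resid / n
            # Correlation with already-active variables inflates the
            # threshold, suppressing joint selection of correlated features.
            t = lam * (1.0 + alpha * (R[j] @ np.abs(b)))
            b[j] = soft_threshold(z, t)
            resid -= X[:, j] * b[j]
        if np.max(np.abs(b - b_prev)) < tol:
            break
    return b

# Example: two highly correlated features; the IILasso-style penalty
# tends to keep only one of them active.
rng = np.random.default_rng(0)
x1 = rng.standard_normal(100)
x2 = x1 + 0.05 * rng.standard_normal(100)
X = np.column_stack([x1, x2, rng.standard_normal(100)])
X = (X - X.mean(0)) / X.std(0)
y = X[:, 0] + 0.1 * rng.standard_normal(100)
print(iilasso_cd(X, y, lam=0.05, alpha=10.0))

The point visible in the update rule is that the soft-threshold level for coefficient j grows with alpha * (R[j] @ |b|), that is, with how strongly feature j correlates with variables that are already active, which is what discourages correlated variables from entering the model together.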
Pages: 1168-1221
Number of pages: 54