Independently Interpretable Lasso for Generalized Linear Models

Cited by: 1
Authors:
Takada, Masaaki [1 ]
Suzuki, Taiji [2 ,3 ,4 ]
Fujisawa, Hironori [1 ,4 ,5 ]
Affiliations:
[1] SOKENDAI, Grad Univ Adv Studies, Tokyo 1908562, Japan
[2] Univ Tokyo, Tokyo 1050033, Japan
[3] Japan Sci & Technol Agcy, PRESTO, Kawaguchi, Saitama 3320012, Japan
[4] RIKEN, Ctr Adv Integrated Intelligence Res, Tokyo 1030027, Japan
[5] Inst Stat Math, Tokyo 1908562, Japan
Keywords:
Variable selection; breast cancer; regression; regularization; prediction; sparsity; recovery; tumor
DOI: 10.1162/neco_a_01279
Chinese Library Classification (CLC): TP18 [Theory of artificial intelligence]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Sparse regularization, such as $\ell_1$ regularization, is a powerful and widely used strategy for high-dimensional learning problems, and its effectiveness has been supported both practically and theoretically by several studies. However, one of the biggest issues with sparse regularization is that its performance is quite sensitive to correlations between features. Under weak regularization, ordinary $\ell_1$ regularization selects variables that are correlated with one another, which degrades not only the estimation error but also interpretability. In this letter, we propose a new regularization method, the independently interpretable lasso (IILasso), for generalized linear models. The proposed regularizer discourages the selection of correlated variables, so that each active variable affects the response independently in the model. Hence, regression coefficients can be interpreted intuitively, and performance is also improved by avoiding overfitting. We analyze the theoretical properties of the IILasso and show that the proposed method is advantageous for sign recovery and achieves an almost minimax optimal convergence rate. Synthetic and real data analyses also indicate the effectiveness of the IILasso.
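As a reading aid (not part of the original record): based on the abstract's description, the IILasso objective for a generalized linear model can be sketched as

    $\hat{\beta} = \arg\min_{\beta} \; L(\beta) + \lambda \Big( \|\beta\|_1 + \frac{\alpha}{2}\, |\beta|^\top R\, |\beta| \Big),$

where $L(\beta)$ is the GLM loss (negative log-likelihood), $|\beta|$ denotes the elementwise absolute value, $\alpha \ge 0$ is a weight, and $R$ is a nonnegative symmetric matrix with $R_{ii} = 0$ and $R_{ij}$ large when features $i$ and $j$ are strongly correlated. The quadratic term penalizes pairs of simultaneously active correlated variables, which is how the regularizer discourages the selection of correlated variables; the symbols $\alpha$ and $R$ and the exact construction of $R$ are assumed notation here, and the precise form is defined in the paper itself.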
Pages: 1168-1221 (54 pages)