Approximate Post-Selective Inference for Regression with the Group LASSO

Cited by: 0
Authors
Panigrahi, Snigdha [1 ]
MacDonald, Peter W. [1 ]
Kessler, Daniel [2 ]
Affiliations
[1] Univ Michigan, Dept Stat, Ann Arbor, MI 48109 USA
[2] Univ Michigan, Dept Psychiat, Ann Arbor, MI 48109 USA
Funding
US National Science Foundation;
Keywords
Conditional inference; Group sparsity; Group LASSO; Laplace approximation; Selective inference
DOI
Not available
CLC number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
After selection with the Group LASSO (or generalized variants such as the overlapping, sparse, or standardized Group LASSO), inference for the selected parameters is unreliable in the absence of adjustments for selection bias. In the penalized Gaussian regression setup, existing approaches provide adjustments for selection events that can be expressed as linear inequalities in the data variables. Such a representation, however, fails to hold for selection with the Group LASSO and substantially obstructs the scope of subsequent post-selective inference. Key questions of inferential interest (for example, inference for the effects of selected variables on the outcome) remain unanswered. In the present paper, we develop a consistent, post-selective, Bayesian method to address the existing gaps by deriving a likelihood adjustment factor and an approximation thereof that eliminates bias from the selection of groups. Experiments on simulated data and data from the Human Connectome Project demonstrate that our method recovers the effects of parameters within the selected groups while paying only a small price for bias adjustment.
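For context, a minimal sketch of the selection stage the abstract refers to, stated with the standard Group LASSO objective for penalized Gaussian regression; the notation (design X, response y, group partition G, group weights w_g, penalty level lambda) is assumed here rather than taken from the paper:

\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta \in \mathbb{R}^{p}} \; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^{2} \;+\; \lambda \sum_{g \in \mathcal{G}} w_g\, \lVert \beta_g \rVert_2,
\qquad
\widehat{E} \;=\; \bigl\{\, g \in \mathcal{G} : \hat{\beta}_g \neq 0 \,\bigr\}.

Post-selective inference conditions on the selected set of groups \widehat{E}. As the abstract notes, this selection event cannot be written as linear inequalities in the data, which is the gap the paper's likelihood adjustment factor and its approximation (a Laplace approximation, per the keywords) are designed to close.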
Pages: 49
Related articles
50 articles in total
  • [31] Selective inference with unknown variance via the square-root lasso
    Tian, Xiaoying
    Loftus, Joshua R.
    Taylor, Jonathan E.
    BIOMETRIKA, 2018, 105 (04) : 755 - 768
  • [32] More Powerful Conditional Selective Inference for Generalized Lasso by Parametric Programming
    Le Duy, Vo Nguyen
    Takeuchi, Ichiro
JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [33] Sparse group lasso for multiclass functional logistic regression models
    Matsui, Hidetoshi
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2019, 48 (06) : 1784 - 1797
  • [34] Parametric Programming Approach for More Powerful and General Lasso Selective Inference
Le Duy, Vo Nguyen
    Takeuchi, Ichiro
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [36] Adaptively weighted group Lasso for semiparametric quantile regression models
    Honda, Toshio
    Ing, Ching-Kang
    Wu, Wei-Ying
    BERNOULLI, 2019, 25 (4B) : 3311 - 3338
  • [37] BAYESIAN GROUP LASSO TOBIT REGRESSION WITH AN UPPER CONSTRAINT AT ZERO
    Al-rubaye, Ali Abdulmohsin Abdulraeem
    Alhseeni, Ameer Musa Imran
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2021, 17 : 1373 - 1380
  • [38] Modeling Carbon Emissions of Post-Selective Logging in the Production Forests of Ulu Jelai, Pahang, Malaysia
    Saad, Siti Nor Maizah
    Jaafar, Wan Shafrina Wan Mohd
    Omar, Hamdan
    Maulud, Khairul Nizam Abdul
    Kamarulzaman, Aisyah Marliza Muhmad
    Adrah, Esmaeel
    Ghazali, Norzalyta Mohd
    Mohan, Midhun
    REMOTE SENSING, 2023, 15 (04)
  • [39] Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure
    Li, Yanming
    Nan, Bin
    Zhu, Ji
    BIOMETRICS, 2015, 71 (02) : 354 - 363
  • [40] Evaluating methods for Lasso selective inference in biomedical research: a comparative simulation study
    Kammer, Michael
    Dunkler, Daniela
    Michiels, Stefan
    Heinze, Georg
    BMC MEDICAL RESEARCH METHODOLOGY, 2022, 22 (01)