Bayesian subset selection and variable importance for interpretable prediction and classification

Cited by: 0
Authors
Kowal, Daniel R. [1]
Affiliation
[1] Rice Univ, Dept Stat, Houston, TX 77005 USA
Funding
US National Institutes of Health;
Keywords
education; linear regression; logistic regression; model selection; penalized regression; REGRESSION; MODELS; CHOICE; LASSO;
DOI
Not available
Chinese Library Classification (CLC) number
TP [automation and computer technology];
Discipline classification code
0812;
Abstract
Subset selection is a valuable tool for interpretable learning, scientific discovery, and data compression. However, classical subset selection is often avoided due to selection instability, lack of regularization, and difficulties with post-selection inference. We address these challenges from a Bayesian perspective. Given any Bayesian predictive model M, we extract a family of near-optimal subsets of variables for linear prediction or classification. This strategy deemphasizes the role of a single "best" subset and instead advances the broader perspective that often many subsets are highly competitive. The acceptable family of subsets offers a new pathway for model interpretation and is neatly summarized by key members such as the smallest acceptable subset, along with new (co-) variable importance metrics based on whether variables (co-) appear in all, some, or no acceptable subsets. More broadly, we apply Bayesian decision analysis to derive the optimal linear coefficients for any subset of variables. These coefficients inherit both regularization and predictive uncertainty quantification via M. For both simulated and real data, the proposed approach exhibits better prediction, interval estimation, and variable selection than competing Bayesian and frequentist selection methods. These tools are applied to a large education dataset with highly correlated covariates. Our analysis provides unique insights into the combination of environmental, socioeconomic, and demographic factors that predict educational outcomes, and identifies over 200 distinct subsets of variables that offer near-optimal out-of-sample predictive accuracy.
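To make the decision-analytic idea in the abstract concrete, the following Python sketch is offered as an illustration only, not the paper's implementation: it assumes a "fit-the-fit" projection in which a posterior predictive mean from a Bayesian model M is regressed onto each candidate subset, a Gaussian ridge-based stand-in for M's posterior, and a fixed 5% loss margin for acceptability. All names and tuning values below are illustrative assumptions.

# Minimal sketch (assumptions noted above): acceptable-family subset search
# via projection of a posterior predictive mean onto candidate subsets.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Simulated data: p equicorrelated covariates, sparse linear signal.
n, p = 200, 6
L = np.linalg.cholesky(0.5 * np.eye(p) + 0.5)   # equicorrelation 0.5
X = rng.standard_normal((n, p)) @ L.T
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0])
y = X @ beta_true + rng.standard_normal(n)

# Stand-in for the Bayesian model M: a Gaussian approximation around a
# ridge estimate (assumption; any posterior for the regression would do).
lam = 1.0
Sigma = np.linalg.inv(X.T @ X + lam * np.eye(p))
beta_post_mean = Sigma @ X.T @ y
y_hat = X @ beta_post_mean                      # posterior predictive mean

def optimal_coefs(S):
    """Optimal linear coefficients for subset S under squared-error loss:
    project the posterior predictive mean onto the span of X[:, S]."""
    XS = X[:, list(S)]
    return np.linalg.lstsq(XS, y_hat, rcond=None)[0]

# Score every subset by empirical predictive loss, then keep the
# 'acceptable family': subsets within 5% of the best loss (illustrative
# margin; the paper calibrates acceptability via predictive uncertainty).
loss = {}
for k in range(1, p + 1):
    for S in combinations(range(p), k):
        loss[S] = np.mean((y - X[:, list(S)] @ optimal_coefs(S)) ** 2)
best = min(loss.values())
acceptable = [S for S, v in loss.items() if v <= 1.05 * best]
smallest = min(acceptable, key=len)
print(f"{len(acceptable)} acceptable subsets; smallest acceptable: {smallest}")

# Variable importance: how often each variable appears in acceptable subsets.
importance = [np.mean([j in S for S in acceptable]) for j in range(p)]
print("inclusion frequency across acceptable subsets:", np.round(importance, 2))

Running this sketch typically recovers the three true signal variables as the smallest acceptable subset while several larger subsets remain acceptable, which mirrors the abstract's point that many subsets can be highly competitive.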
Pages: 38