The Application of Bayesian Penalty Regression in Sparse Regularization

Cited by: 0
Authors
Chen, Weijie [1]
Affiliations
[1] Northwest Univ, Xian 710127, Shaanxi, Peoples R China
Keywords
regularization; shrinkage priors; Gibbs sampler; hierarchical models; convex penalty regression; non-convex penalty regression; VARIABLE SELECTION; HORSESHOE ESTIMATOR; SHRINKAGE; PRIORS
DOI
10.1109/ICBASE51474.2020.00042
CLC number
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Regression models built on high-dimensional sparse data are prone to overfitting, and regularization is a classic and effective remedy that includes ridge regression, Lasso regression, and the elastic net. In the Bayesian framework, the penalty term is derived from a specific shrinkage prior, and the posterior is simulated with hierarchical models and the Gibbs sampler. In this paper, we build a regression model on the concrete slump test dataset and apply Bayesian convex penalty regression and non-convex penalty regression to screen the variables in the model. We compare the results with the classic regularization methods by evaluating the models on their predictions, and finally explain the advantages of the Bayesian models over ordinary regularization.
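To illustrate the correspondence the abstract describes between shrinkage priors and penalty terms, the Bayesian Lasso is a standard example: a Laplace (double-exponential) prior on the coefficients produces the L1 penalty at the posterior mode, and its scale-mixture-of-normals representation yields a hierarchical model with simple Gibbs updates. The sketch below follows the well-known Park and Casella (2008) Gibbs sampler; it is not taken from this paper, and the function name, fixed penalty parameter lam, and synthetic data are illustrative assumptions only.

# Minimal sketch of a Bayesian Lasso Gibbs sampler (Park & Casella, 2008).
# Assumptions: centered response, fixed penalty parameter lam, synthetic data;
# the paper's own priors, models, and dataset handling may differ.
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta = np.zeros(p)
    sigma2, tau2 = 1.0, np.ones(p)
    draws = np.empty((n_iter, p))
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}), with A = X'X + diag(1/tau2)
        A = XtX + np.diag(1.0 / tau2)
        A_inv = np.linalg.inv(A)
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # 1/tau2_j | rest ~ InverseGaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / beta**2)
        tau2 = 1.0 / rng.wald(mu, lam**2)
        # sigma2 | rest ~ InverseGamma((n-1)/2 + p/2, ||y - X beta||^2/2 + beta' D^{-1} beta / 2)
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        rate = resid @ resid / 2 + np.sum(beta**2 / tau2) / 2
        sigma2 = 1.0 / rng.gamma(shape, 1.0 / rate)
        draws[it] = beta
    return draws

# Illustrative usage on synthetic data with a sparse truth (not the concrete slump dataset):
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 8))
beta_true = np.array([2.0, 0, 0, -1.5, 0, 0, 0, 0])
y = X @ beta_true + rng.standard_normal(100)
samples = bayesian_lasso_gibbs(X, y, lam=1.0)
print(samples[1000:].mean(axis=0).round(2))  # posterior means of the noise coefficients shrink toward zero

Posterior summaries of the coefficient draws (means or credible intervals) are what a variable-screening step of this kind would inspect; the shrinkage of the irrelevant coefficients toward zero mirrors the penalty-based selection discussed in the abstract.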
Pages: 158-165
Page count: 8
Related papers
50 records in total
  • [1] Sparse Logistic Regression: Comparison of Regularization and Bayesian Implementations
    Zanon, Mattia
    Zambonin, Giuliano
    Susto, Gian Antonio
    McLoone, Sean
    ALGORITHMS, 2020, 13 (06)
  • [2] Sparse regularization with lq penalty term
    Grasmair, Markus
    Haltmeier, Markus
    Scherzer, Otmar
    INVERSE PROBLEMS, 2008, 24 (05)
  • [3] Regularization Parameter Selection for a Bayesian Group Sparse Multi-Task Regression Model with Application to Imaging Genomics
    Nathoo, Farouk S.
    Greenlaw, Keelin
    Lesperance, Mary
    2016 6TH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION IN NEUROIMAGING (PRNI), 2016, : 9 - 12
  • [4] Gene selection in cancer classification using sparse logistic regression with Bayesian regularization
    Cawley, Gavin C.
    Talbot, Nicola L. C.
    BIOINFORMATICS, 2006, 22 (19) : 2348 - 2355
  • [5] Comment: Regularization via Bayesian Penalty Mixing
    George, Edward I.
    Rockova, Veronika
    TECHNOMETRICS, 2020, 62 (04) : 438 - 442
  • [6] Ordinal Regression with Sparse Bayesian
    Chang, Xiao
    Zheng, Qinghua
    Lin, Peng
    EMERGING INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS: WITH ASPECTS OF ARTIFICIAL INTELLIGENCE, 2009, 5755 : 591 - 599
  • [7] Enhanced Sparse Regularization Based on Logarithm Penalty and Its Application to Gearbox Compound Fault Diagnosis
    Liao, Yi
    Huang, Weiguo
    Shen, Changqing
    Zhu, Zhongkui
    Xuan, Jianping
    Mao, Lingfeng
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2021, 70
  • [8] SPARSE REDUCED RANK REGRESSION WITH NONCONVEX REGULARIZATION
    Zhao, Ziping
    Palomar, Daniel P.
    2018 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2018, : 811 - 815
  • [9] BAYESIAN LINEAR REGRESSION WITH SPARSE PRIORS
    Castillo, Ismael
    Schmidt-Hieber, Johannes
    Van der Vaart, Aad
    ANNALS OF STATISTICS, 2015, 43 (05): : 1986 - 2018
  • [10] Incremental sparse Bayesian ordinal regression
    Li, Chang
    de Rijke, Maarten
    NEURAL NETWORKS, 2018, 106 : 294 - 302