Bayesian L1/2 Regression

Cited by: 0
Authors
Ke, Xiongwen [1 ,2 ]
Fan, Yanan [2 ,3 ]
Affiliations
[1] Cent South Univ, Sch Math & Stat, Changsha, Hunan, Peoples R China
[2] UNSW, Sch Math & Stat, Sydney 2052, Australia
[3] CSIRO, Data61, Sydney, Australia
Keywords
Bridge shrinkage; High dimensional regression; MCMC; Sparse optimization; NONCONCAVE PENALIZED LIKELIHOOD; COLLAPSED GIBBS SAMPLERS; ASYMPTOTIC PROPERTIES; ANTIFREEZE PROTEIN; VARIABLE-SELECTION; LINEAR-REGRESSION;
DOI
10.1080/10618600.2024.2374579
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
It is well known that Bridge regression enjoys superior theoretical properties compared to the traditional LASSO. However, the current latent variable representation of its Bayesian counterpart, based on the exponential power prior, is computationally expensive in higher dimensions. In this article, we show that the exponential power prior has a closed-form scale mixture of normals decomposition for alpha = (1/2)^gamma, gamma ∈ {1, 2, ...}. We call these types of priors the L1/2 prior for short. We develop an efficient partially collapsed Gibbs sampling scheme for computation using the L1/2 prior and study its theoretical properties when p > n. In addition, we introduce a non-separable Bridge penalty function inspired by the fully Bayesian formulation, together with a novel, efficient coordinate descent algorithm. We prove the algorithm's convergence and show that the local minimizer from our optimization algorithm has an oracle property. Finally, simulation studies are carried out to illustrate the performance of the new algorithms. Supplementary materials for this article are available online.
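The optimization component described in the abstract centers on coordinate descent for a bridge penalty with alpha = 1/2. The paper's own algorithm uses a non-separable penalty whose details the abstract does not give, so the following is only a minimal, hedged sketch of the plain separable L1/2-penalized least-squares problem solved by cyclic coordinate descent, with each scalar subproblem minimized by a brute-force grid scan. All function names and parameters here are illustrative, not taken from the paper.

```python
import numpy as np

def prox_half(z, lam):
    """Minimizer of 0.5*(b - z)**2 + lam*|b|**0.5, found numerically
    over a dense grid plus the candidates 0 and z. (A closed-form
    'half-thresholding' solution exists; the grid scan is used here
    only to keep this sketch self-evidently correct.)"""
    r = abs(z) + 1.0
    cand = np.concatenate([np.linspace(-r, r, 4001), [0.0, z]])
    obj = 0.5 * (cand - z) ** 2 + lam * np.sqrt(np.abs(cand))
    return cand[np.argmin(obj)]

def cd_bridge_half(X, y, lam, n_sweeps=100):
    """Cyclic coordinate descent for the separable bridge objective
    0.5*||y - X b||^2 + lam * sum_j |b_j|**0.5  (alpha = 1/2)."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0)   # per-coordinate curvature a_j
    resid = y.copy()                # residual y - X b (b = 0 so far)
    for _ in range(n_sweeps):
        for j in range(p):
            resid += X[:, j] * b[j]          # remove coordinate j
            z = X[:, j] @ resid / col_ss[j]  # 1-D least-squares target
            # subproblem 0.5*a_j*(b - z)^2 + lam*|b|^0.5, rescaled by 1/a_j
            b[j] = prox_half(z, lam / col_ss[j])
            resid -= X[:, j] * b[j]          # restore coordinate j
    return b

# Tiny demo on simulated sparse data
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)
b_hat = cd_bridge_half(X, y, lam=1.0)
```

Like the thresholding rules discussed in the sparse-optimization literature the keywords point to, the L1/2 scalar subproblem sets small coefficients exactly to zero, which is what produces sparsity in the demo above.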
Pages: 199 - 210
Page count: 12