Convex vs Non-Convex Estimators for Regression and Sparse Estimation: the Mean Squared Error Properties of ARD and GLasso

Cited by: 0
Authors
Aravkin, Aleksandr [1 ]
Burke, James V. [2 ]
Chiuso, Alessandro [3 ]
Pillonetto, Gianluigi [3 ]
Affiliations
[1] IBM Corp, TJ Watson Res Ctr, Yorktown Hts, NY 10598 USA
[2] Univ Washington, Dept Math, Seattle, WA 98195 USA
[3] Univ Padua, Dept Informat Engn, Padua, Italy
Keywords
Lasso; Group Lasso; Multiple Kernel Learning; Bayesian regularization; marginal likelihood; model selection; consistency
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
We study a simple linear regression problem for grouped variables; we are interested in methods that jointly perform estimation and variable selection, i.e., that automatically set groups of variables in the regression vector to zero. The Group Lasso (GLasso), a well-known approach to this problem that is also a special case of Multiple Kernel Learning (MKL), boils down to solving convex optimization problems. On the other hand, the Bayesian approach commonly known as Sparse Bayesian Learning (SBL), one version of which is the well-known Automatic Relevance Determination (ARD), leads to non-convex problems. In this paper we discuss the relation between ARD (and a penalized version, which we call PARD) and GLasso, and study their asymptotic properties in terms of the Mean Squared Error in estimating the unknown parameter. The theoretical arguments developed here are independent of the correctness of the prior models and clarify the advantages of PARD over GLasso.
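The GLasso estimator discussed in the abstract minimizes a least-squares loss plus a sum of Euclidean norms over coefficient groups, which is a convex problem. As an illustrative sketch (not taken from the paper), it can be computed by proximal gradient descent with block soft-thresholding; the function names and the synthetic setup below are assumptions made for illustration only.

```python
import numpy as np

def group_soft_threshold(v, t):
    # Block soft-thresholding: the proximal operator of t * ||v||_2.
    # Returns the whole block as exactly zero when its norm is below t,
    # which is how GLasso produces group-wise sparsity.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso(X, y, groups, lam, n_iter=500):
    """Proximal gradient descent for the Group Lasso objective
        min_b 0.5 * ||y - X b||^2 + lam * sum_g ||b_g||_2,
    where `groups` is a list of index arrays, one per variable group."""
    n, p = X.shape
    b = np.zeros(p)
    # Step size 1/L, with L the Lipschitz constant of the quadratic term
    # (squared spectral norm of X).
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)
        z = b - grad / L
        for g in groups:
            b[g] = group_soft_threshold(z[g], lam / L)
    return b
```

On synthetic data with one active group, the inactive groups come out exactly zero, illustrating the joint estimation and variable selection the abstract refers to; ARD/PARD would instead optimize a non-convex marginal-likelihood criterion.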
Pages: 217-252
Page count: 36