Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models

Cited: 0
Authors
Ning Li
Hu Yang
Affiliation
[1] Chongqing University, College of Mathematics and Statistics
Source
Statistical Papers | 2021 / Vol. 62
Keywords
High-dimensional variable selection; Minimax concave penalty; Nonnegativity constraints; Oracle property;
DOI: not available
Abstract
As a promising alternative to the Lasso penalty, the minimax concave penalty (MCP) produces nearly unbiased shrinkage estimates. In this paper, we propose the nonnegative MCP estimator for the high-dimensional setting in which the regression coefficients are subject to nonnegativity constraints, which is particularly relevant when modelling nonnegative data. We prove that the asymptotic theory for the nonnegative MCP estimator requires much weaker assumptions than those in the literature. In particular, we impose no restrictive distributional conditions on the random errors and do not assume the irrepresentable condition or its variants on the design matrix. Despite its good asymptotic properties, the optimization problem underlying the nonnegative MCP estimator is non-convex and consequently hard to solve. We therefore develop an efficient computational algorithm, a coupling of the difference-of-convex (DC) algorithm with the multiplicative updates algorithm, to implement the nonnegative MCP estimator. Simulation studies demonstrate the superior performance of the nonnegative MCP estimator over alternative methods, and a real data analysis for index tracking is presented for illustration.
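The two ingredients named in the abstract can be sketched briefly. Below is a minimal illustration, not the paper's implementation: it assumes the standard MCP parameterization of Zhang (2010), p(t; λ, γ) = λ|t| − t²/(2γ) for |t| ≤ γλ and γλ²/2 otherwise, and uses the Sha–Saul–Lee multiplicative update for nonnegative quadratic programs as an example of a multiplicative updates step (the function names `mcp_penalty` and `multiplicative_update` are illustrative, not from the paper).

```python
import numpy as np

def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty (MCP), standard parameterization.

    The penalty is flat beyond gamma*lam, so large coefficients incur
    only a constant penalty -- the source of the near-unbiasedness of
    MCP shrinkage estimates mentioned in the abstract."""
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a ** 2 / (2.0 * gamma),
                    gamma * lam ** 2 / 2.0)

def multiplicative_update(x, A, b):
    """One Sha-Saul-Lee multiplicative update for the nonnegative
    quadratic program: minimize 0.5 * x'Ax + b'x subject to x >= 0.

    Each factor in the update is nonnegative, so iterates started at a
    positive point remain nonnegative without explicit projection."""
    Ap = np.maximum(A, 0.0)   # elementwise positive part of A
    An = np.maximum(-A, 0.0)  # elementwise negative part of A
    num = -b + np.sqrt(b ** 2 + 4.0 * (Ap @ x) * (An @ x))
    return x * num / (2.0 * (Ap @ x))
```

In a DC scheme, the concave part of the MCP objective would be linearized at the current iterate, and an update of this multiplicative type could solve the resulting convex nonnegative subproblem; the coupling above is a sketch of those pieces, not the authors' algorithm.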
Pages: 661–680 (19 pages)