Simultaneous estimation and factor selection in quantile regression via adaptive sup-norm regularization

Cited by: 16
Authors
Bang, Sungwan [1]
Jhun, Myoungshic [1]
Affiliations
[1] Korea Univ, Dept Stat, Seoul 136701, South Korea
Funding
National Research Foundation, Singapore;
Keywords
Factor selection; Linear programming; Quantile regression; Regularization; Sup-norm; NONCONCAVE PENALIZED LIKELIHOOD; VARIABLE SELECTION; MODEL SELECTION; ASYMPTOTICS; SHRINKAGE; SURVIVAL; LASSO;
DOI
10.1016/j.csda.2011.01.026
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline classification codes
081203; 0835;
Abstract
Several regularization methods, including the group lasso and the adaptive group lasso, have been developed for the automatic selection of grouped variables (factors) in conditional mean regression. In practice, such a problem arises naturally when a set of dummy variables is used to represent a categorical factor and/or when a set of basis functions of a continuous variable is included in the predictor set. Complementing these earlier works, simultaneous and automatic factor selection is examined in quantile regression. To incorporate the factor information into regularized model fitting, an adaptive sup-norm regularized quantile regression is proposed, which penalizes the empirical check loss function by the sum of factor-wise adaptive sup-norm penalties. The proposed method is shown to possess the oracle property. A simulation study demonstrates that it is a more appropriate tool for factor selection than the adaptive lasso regularized quantile regression. (C) 2011 Elsevier B.V. All rights reserved.
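The abstract describes the estimator only verbally; as a reading aid, the penalized objective can be sketched as follows (notation such as $\rho_\tau$, $\beta_{(g)}$, $w_g$, and $\lambda$ is supplied here for illustration and is not taken verbatim from the paper):
\[
\hat{\beta} \;=\; \operatorname*{arg\,min}_{\beta}\; \sum_{i=1}^{n} \rho_\tau\!\left(y_i - x_i^{\top}\beta\right) \;+\; \lambda \sum_{g=1}^{G} w_g \,\lVert \beta_{(g)} \rVert_{\infty},
\qquad \rho_\tau(u) = u\,\{\tau - I(u<0)\},
\]
where $\beta_{(g)}$ collects the coefficients of the $g$-th factor (e.g., the dummy variables of one categorical predictor, or the basis coefficients of one continuous predictor) and the adaptive weights $w_g$ are presumably built from an initial consistent estimate. Because both the check loss and the sup-norm are piecewise linear, the problem can be recast as a linear program by introducing nonnegative slack variables for the residuals and one bound variable per factor, which is consistent with the "Linear programming" keyword above.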
Pages: 813-826
Number of pages: 14
Related papers
50 records in total (items [41]-[50] shown)
  • [41] Bayesian variable selection and estimation in quantile regression using a quantile-specific prior
    Dao, Mai
    Wang, Min
    Ghosh, Souparno
    Ye, Keying
    COMPUTATIONAL STATISTICS, 2022, 37 (03) : 1339 - 1368
  • [42] Explicit Estimates on the Torus for the Sup-norm and the Crest Factor of Solutions of the Modified Kuramoto–Sivashinsky Equation in One and Two Space Dimensions
    Michele V. Bartuccelli
    Jonathan H. Deane
    Guido Gentile
    Journal of Dynamics and Differential Equations, 2020, 32 : 791 - 807
  • [43] Simultaneous estimation and variable selection for a non-crossing multiple quantile regression using deep neural networks
    Shin, Jungmin
    Gwak, Seunghyun
    Shin, Seung Jun
    Bang, Sungwan
    STATISTICS AND COMPUTING, 2024, 34 (03)
  • [44] Variable selection via composite quantile regression with dependent errors
    Tang, Yanlin
    Song, Xinyuan
    Zhu, Zhongyi
    STATISTICA NEERLANDICA, 2015, 69 (01) : 1 - 20
  • [45] Heteroscedasticity identification and variable selection via multiple quantile regression
    Wang, Mingqiu
    Kang, Xiaoning
    Liang, Jiajuan
    Wang, Kun
    Wu, Yuanshan
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2024, 94 (02) : 297 - 314
  • [46] Graphical models via joint quantile regression with component selection
    Chun, Hyonho
    Lee, Myung Hee
    Fleet, James C.
    Oh, Ji Hwan
    JOURNAL OF MULTIVARIATE ANALYSIS, 2016, 152 : 162 - 171
  • [47] SPARSE MINIMUM AVERAGE VARIANCE ESTIMATION VIA QUANTILE REGRESSION GROUP VARIABLE SELECTION PENALTIES (GLQMAVE)
    Ahmed, Rihab
    Rodeen, Waleed
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2021, 17 : 2395 - 2401
  • [48] Differentially Expressed Genes Selection via Truncated Nuclear Norm Regularization
    Wang, Ya-Xuan
    Liu, Jin-Xing
    Gao, Ying-Lian
    Kong, Xiang-Zhen
    Zheng, Chun-Hou
    Du, Yong
    2016 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2016, : 1851 - 1855
  • [49] Variable selection in composite quantile regression models with adaptive group lasso
    Zhou, Xiaoshuang
    International Journal of Applied Mathematics and Statistics, 2013, 45 (15) : 12 - 19
  • [50] Variance estimation in censored quantile regression via induced smoothing
    Pang, Lei
    Lu, Wenbin
    Wang, Huixia Judy
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2012, 56 (04) : 785 - 796