Robust Variable Selection and Estimation Based on Kernel Modal Regression

Times Cited: 4
Authors
Guo, Changying [1]
Song, Biqin [1]
Wang, Yingjie [1]
Chen, Hong [1]
Xiong, Huijuan [1]
Affiliations
[1] Huazhong Agr Univ, Coll Sci, Wuhan 430070, Hubei, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
modal regression; maximum correntropy criterion; variable selection; reproducing kernel Hilbert space; generalization error; quantile regression; correntropy; signal
DOI
10.3390/e21040403
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
Model-free variable selection has attracted increasing interest recently due to its flexibility in algorithmic design and its strong performance in real-world applications. However, most existing statistical methods are formulated under the mean square error (MSE) criterion and are therefore susceptible to non-Gaussian noise and outliers. Because the MSE criterion requires the data to satisfy a Gaussian noise condition, it can hamper the effectiveness of model-free methods in complex settings. To circumvent this issue, we present a new model-free variable selection algorithm that integrates kernel modal regression with gradient-based variable identification. The derived modal regression estimator is closely related to information-theoretic learning under the maximum correntropy criterion, and ensures algorithmic robustness to complex noise by estimating the conditional mode instead of the conditional mean. The gradient information of the estimator provides a model-free metric for screening the key variables. In theory, we establish the foundations of the new model in terms of a generalization error bound and variable selection consistency. In applications, the effectiveness of the proposed method is verified by data experiments.
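As a reading aid, a schematic form of such an estimator is sketched below. The Gaussian correntropy term with bandwidth \sigma, the RKHS penalty \lambda\,\|f\|_K^2, and the gradient-based screening score s_j are illustrative assumptions consistent with the maximum correntropy criterion described in the abstract, not the paper's exact formulation.

% Schematic (assumed) correntropy-based kernel modal regression objective
% over an RKHS H_K, followed by a gradient-based variable screening score.
\[
  \hat f \;=\; \operatorname*{arg\,max}_{f \in \mathcal{H}_K}\;
    \frac{1}{n}\sum_{i=1}^{n}
    \exp\!\Bigl(-\tfrac{(y_i - f(x_i))^2}{2\sigma^2}\Bigr)
    \;-\; \lambda\,\|f\|_{K}^{2},
  \qquad
  s_j \;=\; \frac{1}{n}\sum_{i=1}^{n}
    \Bigl(\tfrac{\partial \hat f}{\partial x^{(j)}}(x_i)\Bigr)^{2}.
\]

Maximizing the Gaussian correntropy term pulls the fit toward the conditional mode rather than the conditional mean, which is the source of robustness to heavy-tailed noise and outliers; variables whose score s_j falls below a threshold can then be screened out.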
Pages: 19