Robust Variable Selection and Estimation Based on Kernel Modal Regression

Cited by: 4
Authors:
Guo, Changying [1 ]
Song, Biqin [1 ]
Wang, Yingjie [1 ]
Chen, Hong [1 ]
Xiong, Huijuan [1 ]
Affiliations:
[1] Huazhong Agr Univ, Coll Sci, Wuhan 430070, Hubei, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
modal regression; maximum correntropy criterion; variable selection; reproducing kernel Hilbert space; generalization error; QUANTILE REGRESSION; CORRENTROPY; SIGNAL;
DOI:
10.3390/e21040403
Chinese Library Classification:
O4 [Physics]
Discipline Code:
0702
Abstract:
Model-free variable selection has attracted increasing interest recently due to its flexibility in algorithmic design and its outstanding performance in real-world applications. However, most existing statistical methods are formulated under the mean square error (MSE) criterion and are therefore susceptible to non-Gaussian noise and outliers. Because the MSE criterion requires the data to satisfy a Gaussian noise condition, it can hamper the effectiveness of model-free methods in complex settings. To circumvent this issue, we present a new model-free variable selection algorithm that integrates kernel modal regression with gradient-based variable identification. The derived modal regression estimator is closely related to information-theoretic learning under the maximum correntropy criterion, and ensures algorithmic robustness to complex noise by replacing learning of the conditional mean with learning of the conditional mode. The gradient information of the estimator offers a model-free metric for screening the key variables. In theory, we establish the foundations of the new model in terms of a generalization bound and variable selection consistency. In applications, the effectiveness of the proposed method is verified by data experiments.
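The abstract's two-stage pipeline can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' code: it fits a Gaussian-kernel estimator under the maximum correntropy criterion via half-quadratic (iteratively reweighted) optimization, then scores each input variable by the average squared partial derivative of the fitted function. All function names, bandwidths, and regularization values below are illustrative assumptions.

```python
# Hypothetical sketch of the paper's pipeline (not the authors' implementation):
# kernel modal regression under the maximum correntropy criterion (MCC),
# fitted by half-quadratic iteration, followed by gradient-based screening.
import numpy as np

def gaussian_kernel(A, B, h):
    """Gaussian kernel matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 h^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * h ** 2))

def fit_mcc_krr(X, y, h=1.0, sigma=1.0, lam=1e-3, iters=20):
    """Half-quadratic optimization: alternately reweight residuals with the
    correntropy weight w_i = exp(-r_i^2 / (2 sigma^2)) and solve a weighted
    kernel ridge system for the expansion coefficients alpha."""
    n = len(y)
    K = gaussian_kernel(X, X, h)
    alpha = np.zeros(n)
    for _ in range(iters):
        r = y - K @ alpha
        w = np.exp(-r ** 2 / (2 * sigma ** 2))  # outliers receive small weight
        W = np.diag(w)
        alpha = np.linalg.solve(W @ K + n * lam * np.eye(n), W @ y)
    return alpha

def gradient_scores(X, alpha, h):
    """Model-free screening metric: average squared partial derivative of the
    estimator f(x) = sum_i alpha_i K(x_i, x) with respect to each input."""
    K = gaussian_kernel(X, X, h)
    # dK(x_i, x)/dx_j at x = X_m equals K[i, m] * (X[i, j] - X[m, j]) / h^2
    diff = (X[:, None, :] - X[None, :, :]) / h ** 2    # shape (i, m, j)
    grads = np.einsum('i,im,imj->mj', alpha, K, diff)  # shape (m, j)
    return (grads ** 2).mean(axis=0)
```

Variables whose score is near zero have little influence on the fitted conditional mode and can be screened out; the correntropy weights are what keep heavy-tailed noise and outliers from distorting both the fit and the resulting gradient scores.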
Pages: 19
Related Papers (50 entries total)
  • [21] Robust distributed estimation and variable selection for massive datasets via rank regression
    Luan, Jiaming
    Wang, Hongwei
    Wang, Kangning
    Zhang, Benle
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2022, 74 (03) : 435 - 450
  • [23] Robust regression estimation and variable selection when cellwise and casewise outliers are present
    Toka, Onur
    Cetin, Meral
    Arslan, Olcay
    HACETTEPE JOURNAL OF MATHEMATICS AND STATISTICS, 2021, 50 (01): : 289 - 303
  • [24] Pseudo estimation and variable selection in regression
    Wu, Wenbo
    Yin, Xiangrong
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2020, 208 : 25 - 35
  • [25] Resampling methods for variable selection in robust regression
    Wisnowski, JW
    Simpson, JR
    Montgomery, DC
    Runger, GC
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2003, 43 (03) : 341 - 355
  • [26] ROBUST CRITERION FOR VARIABLE SELECTION IN LINEAR REGRESSION
    Patil, A. B.
    Kashid, D. N.
    INTERNATIONAL JOURNAL OF AGRICULTURAL AND STATISTICAL SCIENCES, 2009, 5 (02): : 509 - 521
  • [27] Robust variable selection in the logistic regression model
    Jiang, Yunlu
    Zhang, Jiantao
    Huang, Yingqiang
    Zou, Hang
    Huang, Meilan
    Chen, Fanhong
    HACETTEPE JOURNAL OF MATHEMATICS AND STATISTICS, 2021, 50 (05): : 1572 - 1582
  • [28] Variable Selection in Kernel Regression Using Measurement Error Selection Likelihoods
    White, Kyle R.
    Stefanski, Leonard A.
    Wu, Yichao
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2017, 112 (520) : 1587 - 1597
  • [29] ON ROBUST KERNEL ESTIMATION OF DERIVATIVES OF REGRESSION-FUNCTIONS
    HARDLE, W
    GASSER, T
    SCANDINAVIAN JOURNAL OF STATISTICS, 1985, 12 (03) : 233 - 240
  • [30] Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression
    Arslan, Olcay
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2012, 56 (06) : 1952 - 1965