An RKHS-based approach to double-penalized regression in high-dimensional partially linear models

Cited by: 2
Authors
Cui, Wenquan [1 ]
Cheng, Haoyang [1 ]
Sun, Jiajing [2 ]
Affiliations
[1] Univ Sci & Technol China, Dept Stat & Finance, Sch Management, Hefei, Anhui, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing, Peoples R China
Keywords
Eigen-analysis; High-dimensional data; Oracle property; Partially linear model; Representer theorem; Reproducing kernel Hilbert space; Sacks-Ylvisaker conditions; SCAD (smoothly clipped absolute deviation) penalty; GENERALIZED ADDITIVE-MODELS; VARIABLE SELECTION; DIVERGING NUMBER; SPARSE;
DOI
10.1016/j.jmva.2018.07.013
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline codes
020208 ; 070103 ; 0714 ;
Abstract
We study simultaneous variable selection and estimation in high-dimensional partially linear models under the assumption that the nonparametric component is from a reproducing kernel Hilbert space (RKHS) and that the vector of regression coefficients for the parametric component is sparse. A double penalty is used to deal with the problem. The estimate of the nonparametric component is subject to a roughness penalty based on the squared semi-norm on the RKHS, and a penalty with oracle properties is used to achieve sparsity in the parametric component. Under regularity conditions, we establish the consistency and rate of convergence of the parametric estimation together with the consistency of variable selection. The proposed estimators of the non-zero coefficients are also shown to have the asymptotic oracle property. Simulations and empirical studies illustrate the performance of the method. (C) 2018 Elsevier Inc. All rights reserved.
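The double-penalized criterion described in the abstract combines a kernel ridge (squared RKHS semi-norm) penalty on the nonparametric component with a SCAD penalty on the parametric coefficients. By the representer theorem, the nonparametric component can be written as f = K alpha for a kernel matrix K, which suggests an alternating scheme: a closed-form kernel ridge step for alpha, then coordinate-wise SCAD thresholding for beta. The sketch below is illustrative only; the function names, the Gaussian kernel, the bandwidth, and the alternating algorithm are assumptions for exposition, not the paper's exact procedure.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """SCAD thresholding operator (Fan & Li, 2001), applied element-wise."""
    az = np.abs(z)
    return np.where(
        az <= 2 * lam,
        np.sign(z) * np.maximum(az - lam, 0.0),          # soft-threshold zone
        np.where(az <= a * lam,
                 ((a - 1) * z - np.sign(z) * a * lam) / (a - 2),  # linear interpolation zone
                 z))                                      # no shrinkage for large |z|

def gaussian_kernel(T, bandwidth=0.2):
    """Gram matrix of a Gaussian kernel on scalar covariates T (illustrative choice)."""
    d = T[:, None] - T[None, :]
    return np.exp(-d ** 2 / (2 * bandwidth ** 2))

def double_penalized_plm(X, y, T, lam_beta=0.2, lam_f=1e-3, n_iter=200):
    """Hypothetical alternating minimizer for the double-penalized criterion
    (1/n)||y - X beta - K alpha||^2 + lam_f * alpha' K alpha + sum_j SCAD(beta_j; lam_beta).
    Returns (beta_hat, f_hat), where f_hat = K alpha is the fitted nonparametric part."""
    n, p = X.shape
    K = gaussian_kernel(T)
    beta = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n  # per-column mean squares, for the coordinate updates
    I = np.eye(n)
    for _ in range(n_iter):
        # Nonparametric step: kernel ridge on the partial residual (representer theorem).
        alpha = np.linalg.solve(K + n * lam_f * I, y - X @ beta)
        f_hat = K @ alpha
        # Parametric step: one pass of coordinate descent with SCAD thresholding.
        r = y - f_hat - X @ beta
        for j in range(p):
            r += X[:, j] * beta[j]
            zj = (X[:, j] @ r) / (n * col_ss[j])
            beta[j] = scad_threshold(np.array(zj), lam_beta / col_ss[j]).item()
            r -= X[:, j] * beta[j]
    return beta, K @ alpha
```

On well-separated simulated data (a sparse linear part plus a smooth function of a scalar covariate), the scheme zeroes out noise coefficients while leaving large coefficients nearly unbiased, which is the qualitative behavior the oracle property describes.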
Pages: 201-210
Page count: 10
Related papers
50 records in total
  • [32] Accounting for grouped predictor variables or pathways in high-dimensional penalized Cox regression models
    Belhechmi, Shaima
    De Bin, Riccardo
    Rotolo, Federico
    Michiels, Stefan
    [J]. BMC BIOINFORMATICS, 2020, 21 (01)
  • [33] Shrinkage Ridge Regression Estimators in High-Dimensional Linear Models
    Yuzbasi, Bahadir
    Ahmed, S. Ejaz
    [J]. PROCEEDINGS OF THE NINTH INTERNATIONAL CONFERENCE ON MANAGEMENT SCIENCE AND ENGINEERING MANAGEMENT, 2015, 362 : 793 - 807
  • [34] A comparison study of Bayesian high-dimensional linear regression models
    Shin, Ju-Won
    Lee, Kyoungjae
    [J]. KOREAN JOURNAL OF APPLIED STATISTICS, 2021, 34 (03) : 491 - 505
  • [35] Test for high dimensional regression coefficients of partially linear models
    Wang, Siyang
    Cui, Hengjian
    [J]. COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2020, 49 (17) : 4091 - 4116
  • [36] Tests for regression coefficients in high dimensional partially linear models
    Liu, Yan
    Zhang, Sanguo
    Ma, Shuangge
    Zhang, Qingzhao
    [J]. STATISTICS & PROBABILITY LETTERS, 2020, 163
  • [37] Penalized empirical likelihood for high-dimensional partially linear varying coefficient model with measurement errors
    Fan, Guo-Liang
    Liang, Han-Ying
    Shen, Yu
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2016, 147 : 183 - 201
  • [38] Robust Variable Selection Based on Penalized Composite Quantile Regression for High-Dimensional Single-Index Models
    Song, Yunquan
    Li, Zitong
    Fang, Minglu
    [J]. MATHEMATICS, 2022, 10 (12)
  • [39] Variable Selection in High-Dimensional Partially Linear Models with Longitudinal Data
    Yang, Yiping
    Xue, Liugen
    [J]. RECENT ADVANCE IN STATISTICS APPLICATION AND RELATED AREAS, VOLS I AND II, 2009, : 661 - 667
  • [40] Vanishing deviance problem in high-dimensional penalized Cox regression
    Yao, Sijie
    Li, Tingyi
    Cao, Biwei
    Wang, Xuefeng
    [J]. CANCER RESEARCH, 2023, 83 (07)