An RKHS-based approach to double-penalized regression in high-dimensional partially linear models

Cited by: 2
Authors
Cui, Wenquan [1 ]
Cheng, Haoyang [1 ]
Sun, Jiajing [2 ]
Affiliations
[1] University of Science and Technology of China, Department of Statistics and Finance, School of Management, Hefei, Anhui, People's Republic of China
[2] University of Chinese Academy of Sciences, School of Economics and Management, Beijing, People's Republic of China
Keywords
Eigen-analysis; High-dimensional data; Oracle property; Partially linear model; Representer theorem; Reproducing kernel Hilbert space; Sacks-Ylvisaker conditions; SCAD (smoothly clipped absolute deviation) penalty; Generalized additive models; Variable selection; Diverging number; Sparse
DOI
10.1016/j.jmva.2018.07.013
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
We study simultaneous variable selection and estimation in high-dimensional partially linear models under the assumption that the nonparametric component is from a reproducing kernel Hilbert space (RKHS) and that the vector of regression coefficients for the parametric component is sparse. A double penalty is used to deal with the problem. The estimate of the nonparametric component is subject to a roughness penalty based on the squared semi-norm on the RKHS, and a penalty with oracle properties is used to achieve sparsity in the parametric component. Under regularity conditions, we establish the consistency and rate of convergence of the parametric estimation together with the consistency of variable selection. The proposed estimators of the non-zero coefficients are also shown to have the asymptotic oracle property. Simulations and empirical studies illustrate the performance of the method. (C) 2018 Elsevier Inc. All rights reserved.
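A minimal sketch of the double-penalized criterion described in the abstract, assuming the standard partially linear model y_i = x_i^T beta + f(t_i) + eps_i with f in an RKHS H_K; the tuning parameters lambda_1, lambda_2, the semi-norm J, and the generic SCAD penalty p_lambda are written in illustrative notation and are not taken verbatim from the paper:

\[
(\hat{\beta}, \hat{f}) \in \arg\min_{\beta \in \mathbb{R}^p,\, f \in \mathcal{H}_K}
\frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - x_i^\top \beta - f(t_i) \bigr)^2
+ \lambda_1 J^2(f)
+ \sum_{j=1}^{p} p_{\lambda_2}\bigl( |\beta_j| \bigr),
\]

where J^2(f) is the squared semi-norm on H_K acting as the roughness penalty and p_{lambda_2} is the SCAD penalty that induces sparsity in beta. By the representer theorem, the minimizing f admits a finite expansion in the reproducing kernel evaluated at the observed t_i (plus a term in the null space of J), which reduces the optimization to a finite-dimensional problem.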
Pages: 201-210
Number of pages: 10
Related Papers
50 entries in total
  • [1] SCAD-penalized regression in high-dimensional partially linear models
    Xie, Huiliang
    Huang, Jian
    Annals of Statistics, 2009, 37(2): 673-696
  • [2] Penalized least-squares estimation for regression coefficients in high-dimensional partially linear models
    Ni, Huey-Fan
    Journal of Statistical Planning and Inference, 2012, 142(2): 379-389
  • [4] Modified cross-validation for penalized high-dimensional linear regression models
    Yu, Yi
    Feng, Yang
    Journal of Computational and Graphical Statistics, 2014, 23(4): 1009-1027
  • [5] Penalized linear regression with high-dimensional pairwise screening
    Gong, Siliang
    Zhang, Kai
    Liu, Yufeng
    Statistica Sinica, 2021, 31(1): 391-420
  • [6] A new test for high-dimensional regression coefficients in partially linear models
    Zhao, Fanrong
    Lin, Nan
    Zhang, Baoxue
    Canadian Journal of Statistics / Revue Canadienne de Statistique, 2023, 51(1): 5-18
  • [7] Efficient adaptive estimation strategies in high-dimensional partially linear regression models
    Gao, Xiaoli
    Ahmed, S. Ejaz
    Perspectives on Big Data Analysis: Methodologies and Applications, 2014, 622: 61-80
  • [8] Double penalized variable selection for high-dimensional partial linear mixed effects models
    Yang, Yiping
    Luo, Chuanqin
    Yang, Weiming
    Journal of Multivariate Analysis, 2024, 204
  • [9] Penalized empirical likelihood for high-dimensional generalized linear models
    Chen, Xia
    Mao, Liyue
    Statistics and Its Interface, 2021, 14(2): 83-94
  • [10] A dual-penalized approach to hypothesis testing in high-dimensional linear mediation models
    He, Chenxuan
    He, Yiran
    Xu, Wangli
    Computational Statistics and Data Analysis, 2025, 202