Nonlinear dimension reduction for conditional quantiles

Cited by: 1
Authors
Christou, Eliana [1 ]
Settle, Annabel [1 ]
Artemiou, Andreas [2 ]
Affiliations
[1] Univ North Carolina Charlotte, Dept Math & Stat, 9201 Univ City Blvd, Charlotte, NC 28223 USA
[2] Cardiff Univ, Sch Math, Cardiff CF10 3AT, Wales
Keywords
Classification; Dimension reduction; Quantile regression; Reproducing kernel Hilbert space; Visualization;
DOI
10.1007/s11634-021-00439-6
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
In practice, data often display heteroscedasticity, making quantile regression (QR) a more appropriate methodology. Modeling such data while maintaining a flexible nonparametric fit requires smoothing over a high-dimensional space, which may not be feasible when the number of predictor variables is large. This necessitates dimension reduction techniques for conditional quantiles, which extract linear combinations of the predictor variables without losing any information about the conditional quantile. Nonlinear features, however, can achieve greater dimension reduction. We therefore present the first nonlinear extension of the linear algorithm for estimating the central quantile subspace (CQS) using kernel data. First, we describe the feature CQS within the framework of reproducing kernel Hilbert spaces, and second, we illustrate its performance through simulation examples and real data applications. Specifically, we emphasize visualizing various aspects of the data structure using the first two feature extractors, and we highlight the ability to combine the proposed algorithm with linear classification and regression algorithms. The results show that the feature CQS is an effective kernel tool for performing nonlinear dimension reduction for conditional quantiles.
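To make the idea concrete, here is a minimal NumPy sketch of the general approach the abstract describes: extract nonlinear features of the predictors via a centered RBF kernel (as in kernel PCA), then fit a linear quantile regression on the leading features. This is an illustrative stand-in, not the authors' feature-CQS algorithm; the toy data, kernel bandwidth `gamma`, and the pinball-loss subgradient fit are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic data: y depends nonlinearly on X[:, 0],
# with noise scale that grows with |X[:, 0]| (illustrative only).
n = 200
X = rng.uniform(-2, 2, size=(n, 5))
y = np.sin(X[:, 0]) + (0.5 + 0.5 * np.abs(X[:, 0])) * rng.normal(size=n)

# --- Step 1: nonlinear feature extraction via a centered RBF kernel ---
def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(X, X)
J = np.ones((n, n)) / n
Kc = K - J @ K - K @ J + J @ K @ J          # double centering
vals, vecs = np.linalg.eigh(Kc)
idx = np.argsort(vals)[::-1][:2]            # leading two feature extractors
Z = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 1e-12))

# --- Step 2: linear quantile regression on the kernel features ---
def fit_quantile(Z, y, tau, lr=0.05, iters=2000):
    """Subgradient descent on the pinball (check) loss."""
    Zb = np.column_stack([np.ones(len(Z)), Z])
    beta = np.zeros(Zb.shape[1])
    for _ in range(iters):
        r = y - Zb @ beta
        grad = -Zb.T @ (tau - (r < 0)) / len(y)
        beta -= lr * grad
    return beta

def pinball(y, yhat, tau):
    r = y - yhat
    return np.mean(np.maximum(tau * r, (tau - 1) * r))

tau = 0.75
beta = fit_quantile(Z, y, tau)
Zb = np.column_stack([np.ones(n), Z])
fitted_loss = pinball(y, Zb @ beta, tau)
baseline_loss = pinball(y, np.full(n, y.mean()), tau)
```

The fitted model should achieve a lower pinball loss than the unconditional-mean baseline, showing that the two kernel features carry information about the conditional 0.75-quantile; the paper's feature CQS targets such quantile-relevant directions directly rather than via an unsupervised kernel decomposition.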
Pages: 937-956
Page count: 20