Extending the Scope of Inverse Regression Methods in Sufficient Dimension Reduction

Cited by: 2
Authors
Zhu, Li-Ping [1 ,2 ]
Affiliations
[1] E China Normal Univ, Sch Finance & Stat, Shanghai 200062, Peoples R China
[2] E China Normal Univ, Ctr Int Finance & Risk Management, Shanghai 200062, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Ellipticity; Inverse regression; Linearity condition; Sliced inverse regression; Sufficient dimension reduction; ASYMPTOTICS;
DOI
10.1080/03610920903350531
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
In the area of sufficient dimension reduction, two structural conditions are often assumed: the linearity condition, which is close to assuming ellipticity of the underlying distribution of the predictors, and the constant variance condition, which is close to a multivariate normality assumption on the predictors. Imposing these conditions is considered a necessary trade-off for overcoming the "curse of dimensionality". However, it is very hard to check whether these conditions hold. When they are violated, methods such as marginal transformation and re-weighting have been suggested so that the data fulfill them approximately. In this article, we assume an independence condition between the projected predictors and their orthogonal complements, which ensures that the commonly used inverse regression methods identify the central subspace of interest. The independence condition can be checked by the gridded chi-square test. We thus extend the scope of many inverse regression methods and broaden their applicability in the literature. Simulation studies and an application to the car price data are presented for illustration.
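The abstract's central tool, sliced inverse regression, can be sketched briefly. The code below is a generic textbook implementation of SIR (not the author's extension or code): standardize the predictors, slice on the sorted response, and eigen-decompose the weighted covariance of the slice means. The slice and direction counts are illustrative defaults.

```python
import numpy as np

def sir(X, y, n_slices=10, n_directions=1):
    """Sliced inverse regression (Li, 1991): estimate directions
    spanning the central subspace from the slice means of X given y."""
    n, p = X.shape
    # Standardize the predictors: Z = Sigma^{-1/2} (X - mu)
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Slice the data on sorted y and average Z within each slice
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    w, v = np.linalg.eigh(M)
    beta = Sigma_inv_sqrt @ v[:, ::-1][:, :n_directions]
    return beta / np.linalg.norm(beta, axis=0)
```

Under the classical linearity condition (or, per this article, the weaker independence condition), the returned columns estimate a basis of the central subspace; for a single-index model such as y = f(b'X) + error, the leading direction should align with b.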
Pages: 84-95
Page count: 12
Related papers
50 records in total
  • [41] Principal weighted logistic regression for sufficient dimension reduction in binary classification
    Kim, Boyoung
    Shin, Seung Jun
    [J]. JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2019, 48 (02) : 194 - 206
  • [42] Sufficient dimension reduction for the conditional mean with a categorical predictor in multivariate regression
    Yoo, Jae Keun
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2008, 99 (08) : 1825 - 1839
  • [43] Bayesian inverse regression for supervised dimension reduction with small datasets
    Cai, Xin
    Lin, Guang
    Li, Jinglai
    [J]. JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2021, 91 (14) : 2817 - 2832
  • [44] Sliced Inverse Regression With Adaptive Spectral Sparsity for Dimension Reduction
    Xu, Xiao-Lin
    Ren, Chuan-Xian
    Wu, Ran-Chao
    Yan, Hong
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2017, 47 (03) : 759 - 771
  • [45] Robust dimension reduction using sliced inverse median regression
    Christou, Eliana
    [J]. STATISTICAL PAPERS, 2020, 61 (05) : 1799 - 1818
  • [47] AN RKHS FORMULATION OF THE INVERSE REGRESSION DIMENSION-REDUCTION PROBLEM
    Hsing, Tailen
    Ren, Haobo
    [J]. ANNALS OF STATISTICS, 2009, 37 (02): : 726 - 755
  • [48] Asymptotic results for nonparametric regression estimators after sufficient dimension reduction estimation
    Forzani, Liliana
    Rodriguez, Daniela
    Sued, Mariela
    [J]. TEST, 2024,
  • [49] On estimating regression-based causal effects using sufficient dimension reduction
    Luo, Wei
    Zhu, Yeying
    Ghosh, Debashis
    [J]. BIOMETRIKA, 2017, 104 (01) : 51 - 65
  • [50] A novel moment-based sufficient dimension reduction approach in multivariate regression
    Yoo, Jae Keun
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2008, 52 (07) : 3843 - 3851