ON POST DIMENSION REDUCTION STATISTICAL INFERENCE

Cited by: 11
Authors
Kim, Kyongwon [1 ]
Li, Bing [1 ]
Yu, Zhou [2 ]
Li, Lexin [3 ]
Affiliations
[1] Penn State Univ, Dept Stat, University Pk, PA 16802 USA
[2] East China Normal Univ, Sch Stat, KLATASDS MOE, Shanghai, Peoples R China
[3] Univ Calif Berkeley, Dept Biostat & Epidemiol, Berkeley, CA 94720 USA
Source
ANNALS OF STATISTICS | 2020, Vol. 48, No. 3
Funding
National Natural Science Foundation of China
Keywords
Central subspace; directional regression; estimating equations; generalized method of moment; influence function; sliced inverse regression; Von Mises expansion; PRINCIPAL HESSIAN DIRECTIONS; SLICED INVERSE REGRESSION; SAMPLE PROPERTIES;
DOI
10.1214/19-AOS1859
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
The methodologies of sufficient dimension reduction have undergone extensive development over the past three decades. However, there has been a lack of a systematic and rigorous treatment of post dimension reduction inference, which has seriously hindered its applications. The current common practice is to treat the estimated sufficient predictors as if they were the true predictors, and to use them as the starting point of downstream statistical inference. This naive approach, however, grossly overestimates the confidence level of an interval, or the power of a test, leading to distorted results. In this paper, we develop a general and comprehensive framework for post dimension reduction inference that can accommodate any dimension reduction method and any model building method, as long as their corresponding influence functions are available. Within this general framework, we derive the influence functions and present explicit post-reduction formulas for combinations of numerous dimension reduction and model building methods. We then develop post-reduction inference methods for both confidence intervals and hypothesis testing. We investigate the finite-sample performance of our procedures through simulations and a real data analysis.
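The naive practice criticized in the abstract starts from estimated sufficient predictors, for instance those produced by sliced inverse regression (SIR), one of the dimension reduction methods named in the keywords. Below is a minimal, illustrative sketch of the standard SIR estimator of a central-subspace basis (Li, 1991), not of the paper's post-reduction inference procedure; the toy single-index model, slice count, and variable names are assumptions for demonstration only.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, d=1):
    """Sliced inverse regression: estimate d basis directions of the
    central subspace from slice means of the standardized predictors."""
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice observations by the order of y; average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)  # kernel matrix cov(E[Z|Y])
    # Top eigenvectors of M, mapped back to the original X scale
    _, v = np.linalg.eigh(M)                  # eigh: ascending eigenvalues
    B = Sigma_inv_sqrt @ v[:, ::-1][:, :d]
    return B / np.linalg.norm(B, axis=0)

# Toy single-index model: y depends on X only through b^T X
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
b = np.array([1.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(2)
y = (X @ b) ** 3 + 0.1 * rng.normal(size=2000)
B_hat = sir_directions(X, y, n_slices=10, d=1)
# The estimated direction should align with b up to sign
print(abs(B_hat[:, 0] @ b))
```

The paper's point is precisely that `B_hat` carries estimation error; plugging it into a downstream model as if it were the true `b`, without the influence-function correction the authors develop, yields intervals and tests with distorted coverage and power.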
Pages: 1567-1592
Page count: 26