Minimum average deviance estimation for sufficient dimension reduction

Cited by: 4
Author
Adragni, Kofi P. [1]
Affiliation
[1] Univ Maryland Baltimore Cty, Baltimore, MD 21228 USA
Keywords
Exponential family; GLM; local regression; Stiefel manifold; prediction; sliced inverse regression; models
DOI
10.1080/00949655.2017.1392523
CLC Number
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
Sufficient dimension reduction methods aim to reduce the dimensionality of predictors while preserving regression information relevant to the response. In this article, we develop Minimum Average Deviance Estimation (MADE) methodology for sufficient dimension reduction. The purpose of MADE is to generalize Minimum Average Variance Estimation (MAVE) beyond its assumption of additive errors to settings where the outcome follows an exponential family distribution. As in MAVE, a local likelihood approach is used to learn the form of the regression function from the data, and the main parameter of interest is a dimension reduction subspace. To estimate this parameter within its natural space, we propose an iterative algorithm in which one step utilizes optimization on the Stiefel manifold. MAVE is seen to be a special case of MADE for Gaussian outcomes with a common variance. Several procedures are considered to estimate the reduced dimension and to predict the outcome for an arbitrary covariate value. Initial simulations and data analysis examples yield encouraging results and invite further exploration of the methodology.
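The abstract describes the estimation scheme only at a high level, so the following is a minimal sketch rather than the authors' algorithm. It illustrates the two ingredients the abstract names: a kernel-weighted local likelihood (here a Bernoulli deviance with a local intercept only, whereas MADE uses a local linear expansion of the canonical parameter, as in MAVE) and an update of the basis B kept on the Stiefel manifold via a thin-QR retraction. The function names (retract_stiefel, made_objective, fit_made), the Gaussian kernel, and the finite-difference gradient are all illustrative choices, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def retract_stiefel(B):
    """Map a p x d matrix onto the Stiefel manifold (orthonormal columns)
    via thin QR decomposition, fixing column signs for continuity."""
    Q, R = np.linalg.qr(B)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0
    return Q * s

def made_objective(B, X, y, h=0.5):
    """Average local Bernoulli deviance over all anchor points, with the
    local mean fit by Newton steps on a kernel-weighted log-likelihood.
    (Local intercept only, for brevity; MADE uses a local linear term.)"""
    Z = X @ B
    n = len(y)
    total = 0.0
    for i in range(n):
        # Gaussian kernel weights computed in the reduced space Z = X B.
        w = np.exp(-np.sum((Z - Z[i]) ** 2, axis=1) / (2 * h ** 2))
        w /= w.sum()
        a = 0.0
        for _ in range(10):                      # Newton iterations
            p = 1.0 / (1.0 + np.exp(-a))
            a += np.sum(w * (y - p)) / max(np.sum(w * p * (1 - p)), 1e-8)
        p = 1.0 / (1.0 + np.exp(-a))
        eps = 1e-12
        total -= 2.0 * np.sum(w * (y * np.log(p + eps)
                                   + (1 - y) * np.log(1 - p + eps)))
    return total / n

def fit_made(X, y, d, steps=20, lr=0.5):
    """Descend the average local deviance in B with a finite-difference
    gradient, retracting to the Stiefel manifold after each step."""
    p_dim = X.shape[1]
    B = retract_stiefel(rng.standard_normal((p_dim, d)))
    for _ in range(steps):
        f0 = made_objective(B, X, y)
        G = np.zeros_like(B)
        eps = 1e-4
        for j in range(p_dim):
            for k in range(d):
                Bp = B.copy()
                Bp[j, k] += eps
                G[j, k] = (made_objective(Bp, X, y) - f0) / eps
        B = retract_stiefel(B - lr * G)
    return B

# Toy usage: single-index logistic model, so the true reduced dimension is 1.
n, p_dim = 150, 4
X = rng.standard_normal((n, p_dim))
beta = np.array([1.0, -1.0, 0.0, 0.0]) / np.sqrt(2.0)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-np.sin(2.0 * X @ beta))))
B_hat = fit_made(X, y, d=1)
```

On recovery, the estimated column of B_hat should be close (up to sign) to beta; comparing the absolute inner product of the two unit vectors is a simple check of how well the toy scheme recovers the dimension reduction subspace.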
Pages: 411-431
Page count: 21