Sufficient dimension reduction in regressions through cumulative Hessian directions

Cited: 2
Authors
Zhang, Li-Mei [2 ]
Zhu, Li-Ping [1 ]
Zhu, Li-Xing [3 ,4 ]
Affiliations
[1] E China Normal Univ, Shanghai 200062, Peoples R China
[2] Renmin Univ China, Beijing, Peoples R China
[3] Hong Kong Baptist Univ, Hong Kong, Hong Kong, Peoples R China
[4] Yunnan Univ Finance & Econ, Yunnan, Peoples R China
Keywords
Central subspace; Diverging parameters; Inverse regression; Sufficient dimension reduction; Asymptotics
DOI
10.1007/s11222-010-9172-5
Chinese Library Classification
TP301 [Theory and Methods]
Subject Classification Code
081202
Abstract
To reduce the dimension of the predictors without loss of information on the regression, we develop in this paper a sufficient dimension reduction method which we term cumulative Hessian directions. Unlike many existing sufficient dimension reduction methods, the estimation of our proposal completely avoids the selection of tuning parameters such as the number of slices in slicing estimation or the bandwidth in kernel smoothing. We also investigate the asymptotic properties of our proposal when the dimension of the predictors diverges. Illustrations through simulations and an application are presented to demonstrate the efficacy of our proposal and to compare it with existing methods.
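Since the abstract only names the estimator, a minimal sketch may help fix ideas. The Python code below is a hedged illustration, not the paper's actual algorithm: the candidate matrix H(t) = E[Z Z' 1(Y <= t)] - F(t) I_p and the function name cumulative_hessian_directions are assumptions, chosen to mirror how cumulative-type estimators replace slices and bandwidths with the empirical distribution of the response.

```python
import numpy as np

def cumulative_hessian_directions(X, y, d):
    """Hypothetical sketch of a cumulative Hessian-type SDR estimator.

    Builds the candidate matrix M = E[H(Y) H(Y)] with
    H(t) = E[Z Z' 1(Y <= t)] - F(t) I_p on standardized predictors Z,
    so no slice number or bandwidth is ever chosen: the empirical CDF
    of the response supplies all the weighting.
    """
    n, p = X.shape

    # Standardize: Z = Sigma^{-1/2} (X - mean), so E[Z Z'] = I_p.
    Xc = X - X.mean(axis=0)
    Sigma = Xc.T @ Xc / n
    w, V = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = Xc @ Sigma_inv_sqrt

    # Sweep the indicator cut point t through the ordered responses.
    order = np.argsort(y)
    Zs = Z[order]
    H = np.zeros((p, p))   # running estimate of E[Z Z' 1(Y <= t)]
    M = np.zeros((p, p))   # accumulates E[H(Y) H(Y)]
    for i in range(n):
        H += np.outer(Zs[i], Zs[i]) / n
        Ht = H - ((i + 1) / n) * np.eye(p)   # centre by F(t) * I_p
        M += Ht @ Ht / n

    # Leading d eigenvectors of M span the estimate (in the Z scale);
    # back-transform to the original predictor scale.
    _, vecs = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ vecs[:, -d:]

# Toy check on a quadratic single-index model, where Hessian-type
# methods can recover directions that mean-based methods miss:
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = (X[:, 0] + X[:, 1]) ** 2 + 0.2 * rng.standard_normal(500)
B = cumulative_hessian_directions(X, y, d=1)
print(np.round(B.ravel(), 2))   # dominant weight on coordinates 0 and 1
```

Because every observed response value serves as a cut point, the only choice left to the user in this sketch is the structural dimension d, which is consistent with the abstract's claim that slice counts and bandwidths are avoided entirely.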
Pages: 325-334
Page count: 10