Second-order bias-corrected AIC in multivariate normal linear models under non-normality

Cited: 8
Authors
Yanagihara, Hirokazu [1 ]
Kamo, Ken-Ichi [2 ]
Tonda, Tetsuji [3 ]
Affiliations
[1] Hiroshima Univ, Grad Sch Sci, Dept Math, Higashihiroshima 7398626, Japan
[2] Sapporo Med Univ, Dept Liberal Arts & Sci, Chuo Ku, Sapporo, Hokkaido 0608543, Japan
[3] Hiroshima Univ, Res Inst Radiat Biol & Med, Dept Environmetr & Biometr, Minami Ku, Hiroshima 7348553, Japan
Keywords
Akaike's information criterion; bias correction; Kullback-Leibler information; model misspecification; normal assumption; overspecified model; selection of variables; predicted residual sum of squares; robustness to non-normality; variance of information criterion; INFORMATION CRITERION; REGRESSION-MODELS; CROSS-VALIDATION; KURTOSIS; SELECTION; SKEWNESS
DOI
10.1002/cjs.10090
CLC classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification
020208; 070103; 0714;
Abstract
This paper deals with a bias correction of Akaike's information criterion (AIC) for selecting variables in multivariate normal linear regression models when the true distribution of the observations is unknown and non-normal. It is well known that the bias of the AIC is O(1), and a number of first-order bias-corrected AICs reduce this bias to O(n^-1), where n is the sample size. A new information criterion is proposed by slightly adjusting the first-order bias-corrected AIC. Although the adjustment uses only constant coefficients, the bias of the new criterion is reduced to O(n^-2). The variance of the new criterion is also improved. Numerical experiments verify that the proposed criterion is superior to existing ones. The Canadian Journal of Statistics 39: 126-146; 2011 © 2011 Statistical Society of Canada
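For context, the uncorrected criterion the abstract starts from can be sketched as follows. This is a minimal illustration, under standard assumptions, of the classical AIC for a multivariate normal linear model Y = XB + E (the function name and simulated data are hypothetical); it is not the authors' first- or second-order bias-corrected criterion, whose adjusting coefficients are defined in the paper itself:

```python
import numpy as np

def aic_multivariate_normal(Y, X):
    """Classical AIC for the multivariate normal linear model Y = X B + E.

    Under non-normality the bias of this criterion is O(1); the paper's
    corrected versions reduce that bias to O(n^-1) and then O(n^-2).
    """
    n, p = Y.shape          # n observations, p response variables
    k = X.shape[1]          # number of regressors (columns of X)
    B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)   # OLS/ML estimate of B
    resid = Y - X @ B_hat
    Sigma_hat = resid.T @ resid / n                 # ML estimate of covariance
    # Maximized log-likelihood of the fitted multivariate normal model
    loglik = -0.5 * n * (p * np.log(2.0 * np.pi)
                         + np.log(np.linalg.det(Sigma_hat)) + p)
    # Free parameters: k*p regression coefficients + p(p+1)/2 covariance terms
    n_params = k * p + p * (p + 1) / 2
    return -2.0 * loglik + 2.0 * n_params

# Hypothetical usage: compare a full model against one omitting a relevant
# regressor; the smaller AIC value indicates the preferred model.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
B = np.array([[1.0, 0.5], [2.0, -1.0], [0.0, 3.0]])
Y = X @ B + 0.5 * rng.normal(size=(n, 2))
aic_full = aic_multivariate_normal(Y, X)
aic_reduced = aic_multivariate_normal(Y, X[:, :2])  # drops the third regressor
```

With a strong signal, as simulated here, the full model attains the lower AIC; the paper's contribution is that a constant-coefficient adjustment of the first-order corrected version of this quantity drives its bias down to O(n^-2) even when the errors are non-normal.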
Pages: 126-146
Page count: 21