Unifying the derivations for the Akaike and corrected Akaike information criteria

Cited: 422
Author
Cavanaugh, JE [1]
Affiliation
[1] UNIV MISSOURI, DEPT STAT, COLUMBIA, MO 65211 USA
Keywords
AIC; AICc; information theory; Kullback-Leibler information; model selection
DOI
10.1016/S0167-7152(96)00128-9
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
The Akaike (1973, 1974) information criterion, AIC, and the corrected Akaike information criterion (Hurvich and Tsai, 1989), AICc, were both designed as estimators of the expected Kullback-Leibler discrepancy between the model generating the data and a fitted candidate model. AIC is justified in a very general framework, and as a result, offers a crude estimator of the expected discrepancy: one which exhibits a potentially high degree of negative bias in small-sample applications (Hurvich and Tsai, 1989). AICc corrects for this bias, but is less broadly applicable than AIC since its justification depends upon the form of the candidate model (Hurvich and Tsai, 1989, 1993; Hurvich et al., 1990; Bedrick and Tsai, 1994). Although AIC and AICc share the same objective, the derivations of the criteria proceed along very different lines, making it difficult to reconcile how AICc improves upon the approximations leading to AIC. To address this issue, we present a derivation which unifies the justifications of AIC and AICc in the linear regression framework.
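The two criteria contrasted in the abstract have standard closed forms: AIC = -2 log L + 2k (Akaike, 1973), and AICc = AIC + 2k(k+1)/(n - k - 1) (Hurvich and Tsai, 1989), where log L is the maximized log-likelihood, k the number of estimated parameters, and n the sample size. The sketch below illustrates both in the linear regression framework the paper works in; the simulated regression setup is an assumption for illustration only, not taken from the paper.

```python
import numpy as np

def aic(loglik, k):
    """AIC = -2 log L + 2k (Akaike, 1973)."""
    return -2.0 * loglik + 2.0 * k

def aicc(loglik, k, n):
    """AICc = AIC + 2k(k+1)/(n - k - 1): small-sample bias correction
    (Hurvich and Tsai, 1989), valid when n > k + 1."""
    return aic(loglik, k) + 2.0 * k * (k + 1) / (n - k - 1)

# Illustrative Gaussian linear regression (hypothetical data).
# The maximized log-likelihood follows from the residual sum of squares;
# k counts the p regression coefficients plus the variance parameter.
rng = np.random.default_rng(0)
n, p = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
y = X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=n)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
rss = float(np.sum((y - X @ beta) ** 2))
loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1.0)
k = p + 1

print("AIC :", aic(loglik, k))
print("AICc:", aicc(loglik, k, n))
```

Note that the correction term 2k(k+1)/(n - k - 1) is always positive and vanishes as n grows, consistent with the abstract's point that AIC's negative bias matters mainly in small samples.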
Pages: 201-208 (8 pages)