THE OPTIMALITY OF ITERATIVE MAXIMUM PENALIZED LIKELIHOOD ALGORITHMS FOR RIDGE-REGRESSION

Cited by: 1
Author
YU, SH
Affiliation
[1] Centre for Mathematics and Its Applications, Australian National University, Canberra
Keywords
ESTIMATE MAXIMIZE (EM) ALGORITHM; ONE-STEP-LATE (OSL) ALGORITHM; RIDGE REGRESSION; LINEAR STATIONARY METHODS OF 1ST DEGREE; OPTIMUM EXTRAPOLATED METHODS;
DOI
10.1016/0893-9659(94)90007-8
CLC number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
It is first proved that the EM (estimate, maximize) and OSL (one-step-late) algorithms, when applied to ridge regression problems, are special cases of the so-called linear stationary methods of the first degree for the underlying system of linear equations. Although the EM and OSL algorithms converge, their optimum extrapolated counterparts are shown to converge faster. Using an incomplete-data argument, an alternative interpretation of the extrapolated methods is given, which allows the full potential of optimum extrapolated methods to be exploited.
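A minimal numerical sketch of the idea, under standard assumptions rather than the paper's exact EM/OSL derivation: the ridge estimate solves (X'X + lam*I) b = X'y, any splitting of that matrix gives a linear stationary method of the first degree b_{k+1} = G b_k + c, and the extrapolated variant is b_{k+1} = omega*(G b_k + c) + (1 - omega)*b_k with the classical optimum factor omega = 2/(2 - M - m), where M and m are the largest and smallest (real) eigenvalues of G. The Richardson-type splitting, the function name ridge_stationary, and the synthetic data below are illustrative choices, not taken from the paper.

import numpy as np

def ridge_stationary(X, y, lam, n_iter=25, extrapolate=False):
    # Solve (X'X + lam*I) b = X'y by a linear stationary method of the first
    # degree, b_{k+1} = G b_k + c, using an illustrative Richardson-type
    # splitting G = I - alpha*(X'X + lam*I).
    A = X.T @ X + lam * np.eye(X.shape[1])
    rhs = X.T @ y
    alpha = 1.0 / np.linalg.eigvalsh(A).max()  # puts the eigenvalues of G in [0, 1)
    G = np.eye(A.shape[0]) - alpha * A
    c = alpha * rhs
    omega = 1.0
    if extrapolate:
        # Optimum extrapolation factor for a symmetric iteration matrix G
        # with real eigenvalues in [m, M], M < 1.
        eig = np.linalg.eigvalsh(G)
        omega = 2.0 / (2.0 - eig.max() - eig.min())
    b = np.zeros(A.shape[0])
    for _ in range(n_iter):
        b = omega * (G @ b + c) + (1.0 - omega) * b
    return b

# Illustrative usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.ones(5) + rng.normal(size=50)
lam = 2.0
direct = np.linalg.solve(X.T @ X + lam * np.eye(5), X.T @ y)
plain = ridge_stationary(X, y, lam)
extrap = ridge_stationary(X, y, lam, extrapolate=True)
print(np.linalg.norm(plain - direct), np.linalg.norm(extrap - direct))

For the same number of iterations the extrapolated iterate is typically closer to the directly computed ridge solution, which is the sense in which the optimum extrapolated counterparts converge faster; the gap widens as X'X + lam*I becomes more ill-conditioned.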
Pages: 35-39
Number of pages: 5