Shrinkage and absolute penalty estimation in linear regression models

Cited by: 11
Authors
Ahmed, S. Ejaz [1 ]
Raheem, S. M. Enayetur [2 ]
Affiliations
[1] Brock Univ, Fac Math & Sci, St Catharines, ON, Canada
[2] Univ Wisconsin Green Bay, Nat & Appl Sci, Green Bay, WI 54311 USA
Keywords
shrinkage estimation; absolute penalty estimation; LASSO; adaptive LASSO; SCAD;
DOI
10.1002/wics.1232
CLC number
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes
020208; 070103; 0714
Abstract
When predicting a response variable with a multiple linear regression model, several candidate models, each a subset of the full model, may be available. Shrinkage estimators borrow information from the full model and provide a hybrid estimate of the regression parameters by shrinking the full-model estimates toward the candidate submodel. This introduces bias into the estimation, but the reduction in overall prediction error offsets that bias. In this article, we give an overview of shrinkage estimators and their asymptotic properties. A real-data example is given, and a Monte Carlo simulation study is carried out to evaluate the performance of shrinkage estimators relative to absolute penalty estimators such as the least absolute shrinkage and selection operator (LASSO), adaptive LASSO, and smoothly clipped absolute deviation (SCAD), based on a prediction error criterion in a multiple linear regression setup. (C) 2012 Wiley Periodicals, Inc.
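As a rough illustration of the idea sketched in the abstract, the following Python snippet forms a Stein-type shrinkage estimate by pulling the full-model OLS fit toward a candidate submodel fit. The shrinkage factor 1 - (p2 - 2)/T_n and the Wald-type statistic T_n are standard textbook choices assumed here for illustration; they are not necessarily the exact forms used in the article, and all variable names are hypothetical.

```python
import numpy as np

# Minimal sketch of a Stein-type shrinkage estimator (assumed standard form):
#   beta_S = beta_SM + (1 - (p2 - 2) / T_n) * (beta_FM - beta_SM)
# where T_n tests H0: the p2 coefficients excluded from the submodel are zero.

rng = np.random.default_rng(0)
n, p1, p2 = 100, 3, 6                      # p1 retained predictors, p2 candidates for deletion
X = rng.standard_normal((n, p1 + p2))
beta_true = np.concatenate([np.array([2.0, -1.5, 1.0]), np.zeros(p2)])
y = X @ beta_true + rng.standard_normal(n)

# Full-model OLS estimate
beta_fm, *_ = np.linalg.lstsq(X, y, rcond=None)

# Submodel OLS estimate: fit on the first p1 columns, pad the remainder with zeros
beta_sm = np.zeros(p1 + p2)
beta_sm[:p1], *_ = np.linalg.lstsq(X[:, :p1], y, rcond=None)

# Wald-type statistic for H0: last p2 coefficients are zero
# (sigma^2 estimated from the full model; this normalization is an assumption)
resid_fm = y - X @ beta_fm
sigma2 = resid_fm @ resid_fm / (n - p1 - p2)
XtX_inv = np.linalg.inv(X.T @ X)
b2 = beta_fm[p1:]
T_n = b2 @ np.linalg.inv(XtX_inv[p1:, p1:]) @ b2 / sigma2

# Shrink the full-model estimate toward the submodel
shrink = 1.0 - (p2 - 2) / T_n
beta_shrink = beta_sm + shrink * (beta_fm - beta_sm)

# Positive-part variant truncates negative shrinkage factors at zero
beta_shrink_pos = beta_sm + max(shrink, 0.0) * (beta_fm - beta_sm)

print("full model :", np.round(beta_fm, 3))
print("submodel   :", np.round(beta_sm, 3))
print("shrinkage  :", np.round(beta_shrink, 3))
```

In the article's comparison, estimators of this type are evaluated against LASSO, adaptive LASSO, and SCAD on prediction error; the sketch above only shows how the hybrid estimate itself is formed.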
Pages: 541-553
Number of pages: 13