Locally and globally robust Penalized Trimmed Squares regression

Cited: 3
Authors
Avramidis, A. [1 ]
Zioutas, G. [1 ]
Affiliation
[1] Aristotle Univ Thessaloniki, Fac Technol, Gen Dept, Thessaloniki 54124, Greece
Keywords
Robust regression; Monte Carlo simulation; Penalized Trimmed Squares; Unmasking outliers; Bounded influence; HIGH BREAKDOWN-POINT; LINEAR-REGRESSION; SUPPORT VECTORS; FAST ALGORITHM; OUTLIERS; MODELS; SETS
DOI
10.1016/j.simpat.2010.06.001
CLC classification number
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
Multiple outliers are frequently encountered in regression models used in business, economics, engineering, and applied studies. The ordinary least squares (OLS) estimator fails even in the presence of a single outlying observation. To overcome this problem, a class of high-breakdown robust estimators (insensitive to outliers up to 50% of the data sample) has been introduced as an alternative to least squares regression. Among them, the Penalized Trimmed Squares (PTS) is a reasonable high-breakdown estimator. This estimator is defined by the minimization of an objective function to which a penalty cost for deleting an outlier is added; this penalty serves as an upper bound on the residual error for any feasible regression line. Since the PTS does not require presetting the number of outliers to delete from the data set, it has better efficiency with respect to other estimators. However, small outliers remain influential, causing bias to the regression line. In this work we present a new class of regression estimates called generalized PTS (GPTS). The new GPTS estimator is defined as the PTS but with penalties suitable for bounding the influence function of all observations. We show with some numerical examples and a Monte Carlo simulation study that the generalized PTS estimate has very good performance for both robustness and efficiency properties. (C) 2010 Elsevier B.V. All rights reserved.
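The abstract describes the PTS idea: an observation is worth deleting exactly when its squared residual exceeds its penalty cost, so the penalty acts as an upper bound on the residual error of any kept point. The following NumPy code is a minimal illustrative sketch of that deletion rule (a hypothetical alternating refit scheme, not the authors' exact algorithm; the function name `pts_fit` and the fixed scalar `penalty` are assumptions for the example):

```python
import numpy as np

def pts_fit(X, y, penalty, max_iter=50):
    """Sketch of a PTS-style fit: alternate an OLS fit on the kept
    observations with re-selection, deleting any point whose squared
    residual exceeds the fixed penalty cost."""
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])   # design matrix with intercept
    keep = np.ones(n, dtype=bool)
    beta = np.zeros(Xd.shape[1])
    for _ in range(max_iter):
        beta, *_ = np.linalg.lstsq(Xd[keep], y[keep], rcond=None)
        resid2 = (y - Xd @ beta) ** 2
        new_keep = resid2 <= penalty        # keep iff error within penalty bound
        if np.array_equal(new_keep, keep):  # selection stable: converged
            break
        keep = new_keep
    return beta, keep

# Toy data: y = 2x plus small noise, with one gross outlier at index 7.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 * x + rng.normal(scale=0.1, size=10)
y[7] += 30.0
beta, keep = pts_fit(x[:, None], y, penalty=4.0)
```

On this toy data the outlier is trimmed while all nine clean points are retained, and the slope estimate stays near 2; note that, as in the abstract, the number of deletions is not preset — it is determined by the penalty.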
Pages: 148-160
Number of pages: 13