Locally and globally robust Penalized Trimmed Squares regression

Cited by: 3
Authors
Avramidis, A. [1 ]
Zioutas, G. [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Fac Technol, Gen Dept, Thessaloniki 54124, Greece
Keywords
Robust regression; Monte Carlo simulation; Penalized Trimmed Squares; Unmasking outliers; Bounded influence; HIGH BREAKDOWN-POINT; LINEAR-REGRESSION; SUPPORT VECTORS; FAST ALGORITHM; OUTLIERS; MODELS; SETS;
DOI
10.1016/j.simpat.2010.06.001
Chinese Library Classification (CLC)
TP39 [Computer applications];
Subject classification codes
081203 ; 0835 ;
Abstract
Multiple outliers are frequently encountered in regression models used in business, economics, engineering and applied studies. The ordinary least squares (OLS) estimator fails even in the presence of a single outlying observation. To overcome this problem, a class of high-breakdown robust estimators (insensitive to outliers making up as much as 50% of the data sample) has been introduced as an alternative to least squares regression. Among them, Penalized Trimmed Squares (PTS) is a reasonable high-breakdown estimator. This estimator is defined by the minimization of an objective function to which a penalty cost for deleting an outlier is added; the penalty serves as an upper bound on the residual error for any feasible regression line. Since PTS does not require presetting the number of outliers to delete from the data set, it has better efficiency than other estimators. However, small outliers remain influential, causing bias in the regression line. In this work we present a new class of regression estimates called generalized PTS (GPTS). The new GPTS estimator is defined as the PTS but with penalties suitable for bounding the influence function of all observations. We show with some numerical examples and a Monte Carlo simulation study that the generalized PTS estimate has very good performance with respect to both robustness and efficiency. (C) 2010 Elsevier B.V. All rights reserved.
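A minimal sketch of the PTS objective as described in the abstract, written in our own notation (the penalties p_i, the deletion indicators delta_i and the residuals r_i are an inferred reading of the abstract, not taken verbatim from the paper):

\[
\min_{\beta,\;\delta\in\{0,1\}^{n}}\;\sum_{i=1}^{n}\bigl(1-\delta_{i}\bigr)\,r_{i}^{2}(\beta)\;+\;\sum_{i=1}^{n}\delta_{i}\,p_{i},
\qquad r_{i}(\beta)=y_{i}-x_{i}^{\top}\beta .
\]

Under this reading, observation i is trimmed exactly when its squared residual would exceed its penalty, so p_i plays the role of the upper bound on the residual error mentioned in the abstract, and no prior choice of the number of trimmed points is needed. The GPTS variant, as the abstract describes it, keeps the same form but selects the penalties per observation so that the influence function of every observation remains bounded.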
Pages: 148 - 160
Number of pages: 13
Related Papers
50 records in total
  • [1] A robust regression based on weighted LSSVM and penalized trimmed squares
    Liu, Jianyong
    Wang, Yong
    Fu, Chengqun
    Guo, Jie
    Yu, Qin
    CHAOS SOLITONS & FRACTALS, 2016, 89 : 328 - 334
  • [2] A fast algorithm for robust regression with penalised trimmed squares
    Pitsoulis, L.
    Zioutas, G.
    COMPUTATIONAL STATISTICS, 2010, 25 (04) : 663 - 689
  • [4] PENALIZED TRIMMED SQUARES AND A MODIFICATION OF SUPPORT VECTORS FOR UNMASKING OUTLIERS IN LINEAR REGRESSION
    Zioutas, G.
    Avramidis, A.
    Pitsoulis, L.
    REVSTAT-STATISTICAL JOURNAL, 2007, 5 (01) : 115 - 136
  • [5] A class of locally and globally robust regression estimates
    Ferretti, N
    Kelmansky, D
    Yohai, VJ
    Zamar, RH
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1999, 94 (445) : 174 - 188
  • [6] Combining locally and globally robust estimates for regression
    Hernández, S
    Yohai, VJ
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2003, 113 (02) : 633 - 661
  • [7] Nonlinear Robust Modeling Base on Least Trimmed Squares Regression
    Bao Xin
    Dai Liankui
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008, : 5360 - 5365
  • [8] Robust gene-environment interaction analysis using penalized trimmed regression
    Xu, Yaqing
    Wu, Mengyun
    Ma, Shuangge
    Ahmed, Syed Ejaz
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2018, 88 (18) : 3502 - 3528
  • [9] A robust weighted least squares support vector regression based on least trimmed squares
    Chen, Chuanfa
    Yan, Changqing
    Li, Yanyan
    NEUROCOMPUTING, 2015, 168 : 941 - 946
  • [10] Partial least trimmed squares regression
    Xie, Zhonghao
    Feng, Xi'an
    Chen, Xiaojing
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2022, 221