Detecting influential observations in principal components and common principal components

Cited by: 5
Authors
Boente, Graciela [1 ,2 ]
Pires, Ana M. [3 ,4 ]
Rodrigues, Isabel M. [3 ,4 ]
Affiliations
[1] Univ Buenos Aires, Fac Ciencias Exactas & Nat, RA-1053 Buenos Aires, DF, Argentina
[2] Consejo Nacl Invest Cient & Tecn, RA-1033 Buenos Aires, DF, Argentina
[3] Univ Tecn Lisboa, Inst Super Tecn, Dept Matemat, Lisbon, Portugal
[4] Univ Tecn Lisboa, Inst Super Tecn, CEMAT, Lisbon, Portugal
Keywords
Common principal components; Detection of outliers; Influence functions; Robust estimation; Outlier identification; Estimators; Model; PCA
DOI
10.1016/j.csda.2010.01.001
Chinese Library Classification (CLC) number
TP39 [Computer applications]
Discipline classification code
081203; 0835
Abstract
Detecting outlying observations is an important step in any analysis, even when robust estimates are used. In particular, the robustified Mahalanobis distance is a natural measure of outlyingness if one focuses on ellipsoidal distributions. However, it is well known that the asymptotic chi-square approximation for the cutoff value of the Mahalanobis distance based on several robust estimates (such as the minimum volume ellipsoid, the minimum covariance determinant and S-estimators) is not adequate for detecting atypical observations in small samples from the normal distribution. In the multi-population setting and under a common principal components model, aggregated measures based on standardized empirical influence functions are used to detect observations with a significant impact on the estimators. As in the one-population setting, the cutoff values obtained from the asymptotic distribution of those aggregated measures are not adequate for small samples. More appropriate cutoff values, adapted to the sample sizes, can be computed by using a cross-validation approach. Cutoff values obtained from a Monte Carlo study using S-estimators are provided for illustration. A real data set is also analyzed. (C) 2010 Elsevier B.V. All rights reserved.
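The one-population step described in the abstract, flagging observations whose robustified squared Mahalanobis distance exceeds the asymptotic chi-square cutoff, can be illustrated with a minimal sketch. This is not the authors' code: it substitutes the MCD estimator (scikit-learn's MinCovDet) for the S-estimators used in the paper, and it covers only the single-population case, not the common principal components model or the cross-validated, sample-size-adapted cutoffs the paper proposes.

import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

# Simulated data: 50 normal observations in 4 dimensions, with 3 shifted outliers.
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.standard_normal((n, p))
X[:3] += 5.0

# Robust location/scatter (MCD here as a stand-in for the S-estimators used in the
# paper) and the squared robustified distances d2_i = (x_i - m)' S^{-1} (x_i - m).
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)

# Asymptotic chi-square(p) cutoff at the 97.5% quantile. In small samples this
# approximation flags too many points, which is the behaviour motivating the
# sample-size-adapted cutoffs discussed in the abstract.
cutoff = chi2.ppf(0.975, df=p)
print("flagged observations:", np.flatnonzero(d2 > cutoff))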
Pages: 2967-2975
Page count: 9