Optimal robust estimates using the Kullback-Leibler divergence

Cited: 7
Authors
Yohai, Victor J. [1,2]
Affiliations
[1] Univ Buenos Aires, RA-1053 Buenos Aires, DF, Argentina
[2] Consejo Nacl Invest Cient & Tecn, RA-1033 Buenos Aires, DF, Argentina
DOI: 10.1016/j.spl.2008.01.042
CLC classification: O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject classification codes: 020208; 070103; 0714
Abstract
We define two measures of the performance of an estimating functional T of a multi-dimensional parameter, both based on the Kullback-Leibler (KL) divergence. The first is the KL sensitivity, which measures the degree of robustness of the estimate under infinitesimal outlier contamination; the second is the KL efficiency, which measures the asymptotic efficiency of the estimate based on T when the assumed model holds. Using these two measures, we define optimal robust M-estimates via the Hampel approach: the optimal estimates maximize the KL efficiency subject to a bound on the KL sensitivity. We show that these estimates coincide with the optimal estimates for another Hampel-type problem, studied by Stahel [Stahel, W.A., 1981. Robust estimation, infinitesimal optimality and covariance matrix estimators. Ph.D. Thesis, ETH, Zurich]: minimize the trace of a standardized asymptotic covariance matrix subject to a bound on the norm of a standardized gross-error sensitivity, where both the asymptotic covariance matrix and the gross-error sensitivity are standardized by means of the Fisher information matrix. (C) 2008 Elsevier B.V. All rights reserved.
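As a point of reference for the equivalence stated in the abstract, the following is a minimal sketch of Stahel's standardized Hampel problem in conventional M-estimation notation; the symbols ψ, M, Q, V, and c are standard conventions assumed here, not notation taken from the paper itself. For an M-estimating functional T defined by a score function ψ(x, θ), the influence function at the model F_θ and the asymptotic covariance matrix are

\[
\mathrm{IF}(x; T, F_\theta) = M(\theta)^{-1}\,\psi(x,\theta),
\qquad
M(\theta) = -\,E_{F_\theta}\!\left[\frac{\partial \psi(x,\theta)}{\partial \theta^{\top}}\right],
\]
\[
V(\theta) = M(\theta)^{-1}\, Q(\theta)\, M(\theta)^{-\top},
\qquad
Q(\theta) = E_{F_\theta}\!\left[\psi(x,\theta)\,\psi(x,\theta)^{\top}\right].
\]

Standardizing both quantities by the Fisher information matrix I(θ), the problem described in the abstract reads

\[
\min_{\psi}\ \operatorname{tr}\!\left(I(\theta)^{1/2}\, V(\theta)\, I(\theta)^{1/2}\right)
\quad \text{subject to} \quad
\sup_{x}\ \left(\mathrm{IF}(x; T, F_\theta)^{\top}\, I(\theta)\, \mathrm{IF}(x; T, F_\theta)\right)^{1/2} \le c,
\]

where the constraint bounds the standardized gross-error sensitivity by a constant c. The paper's result is that the estimates maximizing KL efficiency under a KL-sensitivity bound solve exactly this problem.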
Pages: 1811-1816 (6 pages)
Related papers (50 total)
  • [21] Novel robust g and h charts using the generalized Kullback-Leibler divergence
    Park, Chanseok
    Wang, Min
    Ouyang, Linhan
    COMPUTERS & INDUSTRIAL ENGINEERING, 2023, 176
  • [22] The Kullback-Leibler divergence and nonnegative matrices
    Boche, Holger
    Stanczak, Slawomir
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (12) : 5539 - 5545
  • [23] A decision cognizant Kullback-Leibler divergence
    Ponti, Moacir
    Kittler, Josef
    Riva, Mateus
    de Campos, Teofilo
    Zor, Cemre
    PATTERN RECOGNITION, 2017, 61 : 470 - 478
  • [24] Model parameter learning using Kullback-Leibler divergence
    Lin, Chungwei
    Marks, Tim K.
    Pajovic, Milutin
    Watanabe, Shinji
    Tung, Chih-kuan
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2018, 491 : 549 - 559
  • [25] AN INVOLUTION INEQUALITY FOR THE KULLBACK-LEIBLER DIVERGENCE
    Pinelis, Iosif
    MATHEMATICAL INEQUALITIES & APPLICATIONS, 2017, 20 (01) : 233 - 235
  • [26] Detecting abnormal situations using the Kullback-Leibler divergence
    Zeng, Jiusun
    Kruger, Uwe
    Geluk, Jaap
    Wang, Xun
    Xie, Lei
    AUTOMATICA, 2014, 50 (11) : 2777 - 2786
  • [27] Fault tolerant learning using Kullback-Leibler divergence
    Sum, John
    Leung, Chi-sing
    Hsu, Lipin
    TENCON 2007 - 2007 IEEE REGION 10 CONFERENCE, VOLS 1-3, 2007 : 1193+
  • [28] Distributions of the Kullback-Leibler divergence with applications
    Belov, Dmitry I.
    Armstrong, Ronald D.
    BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2011, 64 (02) : 291 - 309
  • [29] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019
  • [30] Anomaly Detection Using the Kullback-Leibler Divergence Metric
    Afgani, Mostafa
    Sinanovic, Sinan
    Haas, Harald
    ISABEL: 2008 FIRST INTERNATIONAL SYMPOSIUM ON APPLIED SCIENCES IN BIOMEDICAL AND COMMUNICATION TECHNOLOGIES, 2008 : 197 - 201