Optimal robust estimates using the Kullback-Leibler divergence

Cited: 7
Authors
Yohai, Victor J. [1,2]
Affiliations
[1] Univ Buenos Aires, RA-1053 Buenos Aires, DF, Argentina
[2] Consejo Nacl Invest Cient & Tecn, RA-1033 Buenos Aires, DF, Argentina
DOI: 10.1016/j.spl.2008.01.042
Chinese Library Classification: O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Discipline codes: 020208; 070103; 0714
Abstract
We define two measures of the performance of an estimating functional T of a multi-dimensional parameter, based on the Kullback-Leibler (KL) divergence. The first is the KL sensitivity, which measures the degree of robustness of the estimate under infinitesimal outlier contamination; the second is the KL efficiency, which measures the asymptotic efficiency of the estimate based on T when the assumed model holds. Using these two measures, we define optimal robust M-estimates following the Hampel approach: the optimal estimates maximize the KL efficiency subject to a bound on the KL sensitivity. In this paper we show that these estimates coincide with the optimal estimates for another Hampel problem, studied by Stahel [Stahel, W.A., 1981. Robust estimation, infinitesimal optimality and covariance matrix estimators. Ph.D. Thesis, ETH, Zurich]: minimizing the trace of a standardized asymptotic covariance matrix subject to a bound on the norm of a standardized gross error sensitivity, where both the asymptotic covariance matrix and the gross error sensitivity are standardized by means of the Fisher information matrix. (C) 2008 Elsevier B.V. All rights reserved.
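Reader's note (not part of the record): in conventional notation, write IF(x; T_psi, F_theta) for the influence function of the M-estimating functional T_psi, I(theta) for the Fisher information matrix, and V(T_psi, F_theta) for the asymptotic covariance matrix. The Stahel problem described at the end of the abstract can then be sketched as follows; the paper's exact normalizations may differ:

\[
\min_{\psi}\ \operatorname{tr}\bigl( I(\theta)\, V(T_\psi, F_\theta) \bigr)
\quad \text{subject to} \quad
\sup_x \Bigl( \mathrm{IF}(x; T_\psi, F_\theta)^{\top}\, I(\theta)\, \mathrm{IF}(x; T_\psi, F_\theta) \Bigr)^{1/2} \le c ,
\]

and the paper's result is that maximizing the KL efficiency subject to a bound on the KL sensitivity selects the same optimal score function psi. As a concrete instance of the bounded-influence M-estimates such problems characterize, the sketch below computes Huber's location M-estimate, the classical solution of Hampel's problem at the normal location model. This is an illustrative example, not code from the paper; the tuning constant c = 1.345 and the MAD scale are conventional choices.

    import numpy as np
    from scipy import stats, optimize

    def huber_psi(r, c=1.345):
        """Huber's score: the identity clipped at +/- c, so the influence is bounded."""
        return np.clip(r, -c, c)

    def m_estimate_location(x, c=1.345):
        """Solve sum_i psi((x_i - mu)/s) = 0 for mu, with s a robust (MAD) scale."""
        s = stats.median_abs_deviation(x, scale="normal")
        score = lambda mu: huber_psi((x - mu) / s, c).sum()
        # score(mu) is decreasing in mu and changes sign on [min(x), max(x)]
        return optimize.brentq(score, x.min(), x.max())

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 95),    # nominal model
                        rng.normal(10.0, 1.0, 5)])   # 5% outlier contamination
    print("sample mean:      ", x.mean())                # dragged toward the outliers
    print("Huber M-estimate: ", m_estimate_location(x))  # stays near 0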
Pages: 1811-1816 (6 pages)
Related Papers (showing 10 of 50)
  • [1] Park, C., Basu, A. The generalized Kullback-Leibler divergence and robust inference. Journal of Statistical Computation and Simulation, 2003, 73(5): 311-332.
  • [2] Wang, Yongchang, Liu, Kai, Hao, Qi, Wang, Xianwang, Lau, Daniel L., Hassebrook, Laurence G. Robust Active Stereo Vision Using Kullback-Leibler Divergence. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2012, 34(3): 548-563.
  • [3] van Erven, Tim, Harremoes, Peter. Rényi Divergence and Kullback-Leibler Divergence. IEEE Transactions on Information Theory, 2014, 60(7): 3797-3820.
  • [4] Zhou, XiaoJian, Lin, Dennis K. J., Hu, Xuelong, Jiang, Ting. Robust parameter design based on Kullback-Leibler divergence. Computers & Industrial Engineering, 2019, 135: 913-921.
  • [5] Alexopoulos, A. The fractional Kullback-Leibler divergence. Journal of Physics A: Mathematical and Theoretical, 2021, 54(7).
  • [6] Popescu, Pantelimon G., Dragomir, Sever S., Slusanschi, Emil I., Stanasila, Octavian N. Bounds for Kullback-Leibler divergence. Electronic Journal of Differential Equations, 2016.
  • [7] Raiber, Fiana, Kurland, Oren. Kullback-Leibler Divergence Revisited. ICTIR'17: Proceedings of the 2017 ACM SIGIR International Conference on the Theory of Information Retrieval, 2017: 117-124.
  • [8] Wildberger, Jonas, Guo, Siyuan, Bhattacharyya, Arnab, Schoelkopf, Bernhard. On the Interventional Kullback-Leibler Divergence. Conference on Causal Learning and Reasoning, 2023, 213: 328-349.
  • [9] Claici, Sebastian, Yurochkin, Mikhail, Ghosh, Soumya, Solomon, Justin. Model Fusion with Kullback-Leibler Divergence. International Conference on Machine Learning, 2020, 119.
  • [10] Vu-Van, Hiep, Koo, Insoo. A Robust Cooperative Spectrum Sensing Based on Kullback-Leibler Divergence. IEICE Transactions on Communications, 2012, E95B(4): 1286-1290.