Optimal robust estimates using the Kullback-Leibler divergence

Cited by: 7
Author
Yohai, Victor J. [1,2]
Affiliations
[1] Univ Buenos Aires, RA-1053 Buenos Aires, DF, Argentina
[2] Consejo Nacl Invest Cient & Tecn, RA-1033 Buenos Aires, DF, Argentina
DOI
10.1016/j.spl.2008.01.042
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We define two measures of the performance of an estimating functional T of a multi-dimensional parameter, based on the Kullback-Leibler (KL) divergence. The first is the KL sensitivity, which measures the degree of robustness of the estimate under infinitesimal outlier contamination; the second is the KL efficiency, which measures the asymptotic efficiency of the estimate based on T when the assumed model holds. Using these two measures, we define optimal robust M-estimates via the Hampel approach: the optimal estimates maximize the KL efficiency subject to a bound on the KL sensitivity. In this paper we show that these estimates coincide with the optimal estimates for another Hampel problem studied by Stahel [Stahel, W.A., 1981. Robust estimation, infinitesimal optimality and covariance matrix estimators. Ph.D. Thesis, ETH, Zurich]: minimize the trace of a standardized asymptotic covariance matrix subject to a bound on the norm of a standardized gross-error sensitivity, where both the asymptotic covariance matrix and the gross-error sensitivity are standardized by means of the Fisher information matrix. (C) 2008 Elsevier B.V. All rights reserved.
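As a sketch of the optimization problem the abstract describes (the notation below is assumed for illustration, not quoted from the paper): writing IF(x; T, F_theta) for the influence function of the estimating functional T at the model F_theta, V(T, F_theta) for its asymptotic covariance matrix, and I(theta) for the Fisher information matrix, Stahel's standardized Hampel problem can be written in LaTeX as

% Hedged sketch of Stahel's standardized Hampel problem; the symbols IF
% (influence function), V (asymptotic covariance matrix), and I (Fisher
% information matrix) are assumed notation, not taken from the paper.
\[
  \min_{T} \; \operatorname{tr}\!\left( I(\theta)^{1/2} \, V(T, F_\theta) \, I(\theta)^{1/2} \right)
  \quad \text{subject to} \quad
  \sup_{x} \left\| I(\theta)^{1/2} \, \mathrm{IF}(x; T, F_\theta) \right\| \le k.
\]

According to the abstract, maximizing the KL efficiency subject to a bound on the KL sensitivity selects the same optimal M-estimates as this trace-minimization problem.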
Pages: 1811-1816
Page count: 6
Related Papers
50 records in total (items [31]-[40] shown)
  • [31] Android Malware Detection Using Kullback-Leibler Divergence
    Cooper, Vanessa N.
    Haddad, Hisham M.
    Shahriar, Hossain
    ADCAIJ-ADVANCES IN DISTRIBUTED COMPUTING AND ARTIFICIAL INTELLIGENCE JOURNAL, 2014, 3(2): 17-24
  • [32] An Asymptotic Test for Bimodality Using The Kullback-Leibler Divergence
    Contreras-Reyes, Javier E.
    SYMMETRY-BASEL, 2020, 12(6)
  • [33] Estimating Kullback-Leibler Divergence Using Kernel Machines
    Ahuja, Kartik
    CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019: 690-696
  • [34] Human promoter recognition using Kullback-Leibler divergence
    Zeng, Jia
    Cao, Xiao-Qin
    Yan, Hong
    PROCEEDINGS OF 2007 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2007: 3319-3325
  • [35] A generalization of the Kullback-Leibler divergence and its properties
    Yamano, Takuya
    JOURNAL OF MATHEMATICAL PHYSICS, 2009, 50(4)
  • [36] Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions
    Yang, Pengfei
    Chen, Biao
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2019, 65(4): 2360-2373
  • [37] Optimal Viewpoint Selection Based on Aesthetic Composition Evaluation Using Kullback-Leibler Divergence
    Lan, Kai
    Sekiyama, Kosuke
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2016, PT I, 2016, 9834: 433-443
  • [38] Computation of Kullback-Leibler Divergence in Bayesian Networks
    Moral, Serafin
    Cano, Andres
    Gomez-Olmedo, Manuel
    ENTROPY, 2021, 23(9)
  • [39] Fault detection in dynamic systems using the Kullback-Leibler divergence
    Xie, Lei
    Zeng, Jiusun
    Kruger, Uwe
    Wang, Xun
    Geluk, Jaap
    CONTROL ENGINEERING PRACTICE, 2015, 43: 39-48
  • [40] Kullback-Leibler Divergence Estimation of Continuous Distributions
    Perez-Cruz, Fernando
    2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008: 1666-1670