Robust statistical inference based on the C-divergence family

Cited by: 5
Authors
Maji, Avijit [1 ]
Ghosh, Abhik [2 ]
Basu, Ayanendranath [2 ]
Pardo, Leandro [3 ]
Affiliations
[1] Reserve Bank India, Dept Stat & Informat Management, Patna 800001, Bihar, India
[2] Indian Stat Inst, 203 BT Rd, Kolkata 700108, India
[3] Univ Complutense Madrid, Dept Stat & OR, E-28040 Madrid, Spain
Keywords
C-Divergence; Density power divergence; Generalized power divergence; Power divergence; MINIMUM HELLINGER DISTANCE; EFFICIENCY;
DOI
10.1007/s10463-018-0678-5
CLC Classification Number
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Code
020208 ; 070103 ; 0714 ;
Abstract
This paper describes a family of divergences, referred to herein as the C-divergence family, which generalizes the power divergence family and includes the density power divergence family as a particular member. We explore the connection of this family with other divergence families and establish several characteristics of the corresponding minimum distance estimator, including its asymptotic distribution under both discrete and continuous models; we also explore the use of the C-divergence family in parametric hypothesis tests. We study the influence function of these minimum distance estimators, to both the first and second order, and indicate the possible limitations of the first-order influence function in this case. We also briefly study the breakdown properties of the corresponding estimators. Simulation results and real data examples demonstrate the small sample efficiency and robustness properties of the estimators.
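The abstract cites the density power divergence (DPD) of Basu et al. (1998) as a particular member of the C-divergence family. As a minimal illustration of the minimum distance estimation idea discussed above, the sketch below fits a normal location-scale model by minimizing the empirical DPD objective; the tuning parameter `alpha = 0.5` and the contaminated sample are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical minimum density power divergence objective
    (Basu et al., 1998) for a normal location-scale model."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # reparameterize to keep sigma > 0
    f = norm.pdf(x, mu, sigma)
    # Closed-form integral of f_theta^{1+alpha} for the normal density.
    int_f = 1.0 / ((2 * np.pi) ** (alpha / 2) * sigma ** alpha * np.sqrt(1 + alpha))
    # The data-dependent term downweights observations with low model density,
    # which is the source of the estimator's robustness to outliers.
    return int_f - (1.0 + 1.0 / alpha) * np.mean(f ** alpha)

# 95 observations from N(0, 1) contaminated by 5 outliers near 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```

Unlike the sample mean and standard deviation, which are pulled toward the outliers, the minimum DPD estimates stay close to the parameters of the uncontaminated component, reflecting the robustness properties the abstract refers to.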
Pages: 1289-1322
Page count: 34
Related Papers
50 items total
  • [21] Unifying divergence minimization and statistical inference via convex duality
    Altun, Yasemin
    Smola, Alex
    [J]. LEARNING THEORY, PROCEEDINGS, 2006, 4005 : 139 - 153
  • [22] Adaptation of the tuning parameter in general Bayesian inference with robust divergence
    Yonekura, Shouto
    Sugasawa, Shonosuke
    [J]. STATISTICS AND COMPUTING, 2023, 33 (02)
  • [24] STATISTICAL INFERENCE BASED ON ROBUST LOW-RANK DATA MATRIX APPROXIMATION
    Feng, Xingdong
    He, Xuming
    [J]. ANNALS OF STATISTICS, 2014, 42 (01): : 190 - 210
  • [25] Statistical Inference for Topp-Leone-generated Family of Distributions Based on Records
    Arshad, Mohd
    Jamal, Qazi Azhad
    [J]. JOURNAL OF STATISTICAL THEORY AND APPLICATIONS, 2019, 18 (01): : 65 - 78
  • [26] Statistical inference for multinomial populations based on a double index family of test statistics
    Meselidis, Christos
    Karagrigoriou, Alex
    [J]. JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2020, 90 (10) : 1773 - 1792
  • [28] Differentially Private Statistical Inference through β-Divergence One Posterior Sampling
    Jewson, Jack
    Ghalebikesabi, Sahra
    Holmes, Chris
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [29] Towards robust statistical inference for complex computer models
    Oberpriller, Johannes
    Cameron, David R.
    Dietze, Michael C.
    Hartig, Florian
    [J]. ECOLOGY LETTERS, 2021, 24 (06) : 1251 - 1261
  • [30] Robust statistical inference for longitudinal data with nonignorable dropouts
    Shao, Yujing
    Ma, Wei
    Wang, Lei
    [J]. STATISTICS, 2022, 56 (05) : 1072 - 1094