Novel robust g and h charts using the generalized Kullback-Leibler divergence

Cited by: 2
Authors
Park, Chanseok [1 ]
Wang, Min [2 ]
Ouyang, Linhan [3 ]
Affiliations
[1] Pusan Natl Univ, Dept Ind Engn, Appl Stat Lab, Pusan 46241, South Korea
[2] Univ Texas San Antonio, Dept Management Sci & Stat, San Antonio, TX 78249 USA
[3] Nanjing Univ Aeronaut & Astronaut, Coll Econ & Management, Nanjing 211106, Peoples R China
Funding
National Natural Science Foundation of China; National Research Foundation of Singapore; China Postdoctoral Science Foundation;
Keywords
Attribute control charts; Data contamination; Kullback-Leibler divergence; Robust estimation; Run length properties; PARAMETER-ESTIMATION; PERFORMANCE; MINIMUM;
DOI
10.1016/j.cie.2022.108951
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline codes
081203; 0835;
Abstract
It is well known that the performance of g and h control charts depends heavily on how accurately the unknown process parameter is estimated. However, conventional methods, such as the method of moments and the maximum likelihood method, are easily influenced by data contamination, so the performance of control charts built on these estimators can deteriorate significantly when Phase-I data are contaminated by outliers. Two robust methods based on trimming and truncation were recently developed to overcome this issue, but they suffer from loss of data information and lack of efficiency because observations are discarded to achieve robustness. In this paper, we avoid the pitfall of existing methods by using an asymptotically fully efficient statistical minimum distance method: we develop novel robust g and h control charts based on the generalized Kullback-Leibler divergence. Numerical results show that, in terms of the average run length and the relative efficiency, the proposed method outperforms several existing ones, especially when the data contain outliers.
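The abstract contrasts outlier-sensitive estimators (method of moments, MLE) with a robust minimum distance approach. As an illustrative sketch only, the snippet below uses the minimum Hellinger distance estimator, a well-known asymptotically efficient minimum distance method, as a stand-in for the paper's generalized Kullback-Leibler family (whose exact form and tuning are not given here). It shows how a minimum distance fit of the geometric parameter underlying a g chart resists Phase-I outliers that bias the MLE; the sample sizes and contamination level are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import geom

rng = np.random.default_rng(42)
counts = rng.geometric(p=0.3, size=200)  # clean Phase-I event counts
counts[:10] = 60                         # contaminate 5% with large outliers

# MLE for geometric(p) on support {1, 2, ...}: p_hat = 1 / sample mean,
# which the inflated mean of the contaminated sample drags downward.
p_mle = 1.0 / counts.mean()

# Empirical pmf over the observed support 1..max(counts)
support = np.arange(1, counts.max() + 1)
emp = np.bincount(counts)[1:] / counts.size

def hellinger_sq(p):
    """Squared Hellinger distance between empirical and geometric pmf."""
    model = geom.pmf(support, p)
    return np.sum((np.sqrt(emp) - np.sqrt(model)) ** 2)

# Minimum distance estimate: outlier cells carry little weight, so the
# fit stays close to the bulk of the data.
p_mhd = minimize_scalar(hellinger_sq, bounds=(1e-4, 1 - 1e-4),
                        method="bounded").x

print(f"true p = 0.30, MLE = {p_mle:.3f}, min-distance = {p_mhd:.3f}")
```

With 5% contamination at a distant point, the MLE drifts well below the true p = 0.3, while the minimum distance estimate stays near it; the same mechanism motivates the robust Phase-I parameter estimates behind the proposed charts.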
Pages: 11
Related Papers
50 records in total
  • [1] The generalized Kullback-Leibler divergence and robust inference
    Park, C
    Basu, A
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2003, 73 (05) : 311 - 332
  • [2] Optimal robust estimates using the Kullback-Leibler divergence
    Yohai, Victor J.
    STATISTICS & PROBABILITY LETTERS, 2008, 78 (13) : 1811 - 1816
  • [3] COMPLEX NMF WITH THE GENERALIZED KULLBACK-LEIBLER DIVERGENCE
    Kameoka, Hirokazu
    Kagami, Hideaki
    Yukawa, Masahiro
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 56 - 60
  • [4] Robust Active Stereo Vision Using Kullback-Leibler Divergence
    Wang, Yongchang
    Liu, Kai
    Hao, Qi
    Wang, Xianwang
    Lau, Daniel L.
    Hassebrook, Laurence G.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2012, 34 (03) : 548 - 563
  • [5] Rényi Divergence and Kullback-Leibler Divergence
    van Erven, Tim
    Harremoes, Peter
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (07) : 3797 - 3820
  • [6] Robust parameter design based on Kullback-Leibler divergence
    Zhou, XiaoJian
    Lin, Dennis K. J.
    Hu, Xuelong
    Jiang, Ting
    COMPUTERS & INDUSTRIAL ENGINEERING, 2019, 135 : 913 - 921
  • [7] The fractional Kullback-Leibler divergence
    Alexopoulos, A.
    JOURNAL OF PHYSICS A-MATHEMATICAL AND THEORETICAL, 2021, 54 (07)
  • [8] BOUNDS FOR KULLBACK-LEIBLER DIVERGENCE
    Popescu, Pantelimon G.
    Dragomir, Sever S.
    Slusanschi, Emil I.
    Stanasila, Octavian N.
    ELECTRONIC JOURNAL OF DIFFERENTIAL EQUATIONS, 2016,
  • [9] Kullback-Leibler Divergence Revisited
    Raiber, Fiana
    Kurland, Oren
    ICTIR'17: PROCEEDINGS OF THE 2017 ACM SIGIR INTERNATIONAL CONFERENCE THEORY OF INFORMATION RETRIEVAL, 2017, : 117 - 124
  • [10] On the Interventional Kullback-Leibler Divergence
    Wildberger, Jonas
    Guo, Siyuan
    Bhattacharyya, Arnab
    Schoelkopf, Bernhard
    CONFERENCE ON CAUSAL LEARNING AND REASONING, VOL 213, 2023, 213 : 328 - 349