Some properties and applications of cumulative Kullback-Leibler information

Cited by: 32
Authors
Di Crescenzo, Antonio [1]
Longobardi, Maria [2]
Affiliations
[1] Univ Salerno, Dipartimento Matemat, I-84084 Fisciano, SA, Italy
[2] Univ Naples Federico II, Dipartimento Matemat & Applicaz, I-80126 Naples, Italy
Keywords
cumulative entropy; cumulative residual entropy; proportional reversed hazard model; relative aging; stochastic order; PROPORTIONAL REVERSED HAZARD; LIFETIME; ENTROPY; DISCRIMINATION; RELIABILITY; DIVERGENCE;
DOI
10.1002/asmb.2116
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research];
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
The cumulative Kullback-Leibler information has been proposed recently as a suitable extension of Kullback-Leibler information to the cumulative distribution function. In this paper, we obtain various results on such a measure, with reference to its relation with other information measures and notions of reliability theory. We also provide some lower and upper bounds. A dynamic version of the cumulative Kullback-Leibler information is then proposed for past lifetimes. Furthermore, we investigate its monotonicity property, which is related to some new concepts of relative aging. Moreover, we propose an application to the failure of nanocomponents. Finally, in order to provide an application in image analysis, we introduce the empirical cumulative Kullback-Leibler information and prove an asymptotic result. Copyright (c) 2015 John Wiley & Sons, Ltd.
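For context, a form of the cumulative Kullback-Leibler information that is common in this literature, for random variables X ~ F and Y ~ G with finite means, can be sketched as follows (the paper's exact definition and support assumptions should be taken from the article itself):

CKL(F, G) = \int F(x) \log \frac{F(x)}{G(x)} \, dx + E(X) - E(Y).

Since x \log(x/y) \ge x - y for x, y > 0 and \int [F(x) - G(x)] \, dx = E(Y) - E(X) when both means are finite, the measure is nonnegative and vanishes when F = G.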
Pages: 875 - 891
Number of pages: 17
Related papers
50 records
  • [31] A test for population collinearity - A Kullback-Leibler information approach
    Guerrero-Cusumano, JL
    [J]. INFORMATION SCIENCES, 1996, 92 (1-4) : 295 - 311
  • [33] Integrating information by Kullback-Leibler constraint for text classification
    Yin, Shu
    Zhu, Peican
    Wu, Xinyu
    Huang, Jiajin
    Li, Xianghua
    Wang, Zhen
    Gao, Chao
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (24): 17521 - 17535
  • [34] MONOTONICITY OF THE FISHER INFORMATION AND THE KULLBACK-LEIBLER DIVERGENCE MEASURE
    RYU, KW
    [J]. ECONOMICS LETTERS, 1993, 42 (2-3) : 121 - 128
  • [35] Kullback-Leibler information consistent estimation for censored data
    Suzukawa, A
    Imai, H
    Sato, Y
    [J]. ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2001, 53 (02) : 262 - 276
  • [36] KULLBACK-LEIBLER INFORMATION AND ITS APPLICATIONS IN MULTI-DIMENSIONAL ADAPTIVE TESTING
    Wang, Chun
    Chang, Hua-Hua
    Boughton, Keith A.
    [J]. PSYCHOMETRIKA, 2011, 76 (01) : 13 - 39
  • [37] Kullback-Leibler and relative Fisher information as descriptors of locality
    Levamaki, Henrik
    Nagy, Agnes
    Vilja, Iiro
    Kokko, Kalevi
    Vitos, Levente
    [J]. INTERNATIONAL JOURNAL OF QUANTUM CHEMISTRY, 2018, 118 (12)
  • [38] ASYMPTOTIC PROPERTIES OF NEYMAN-PEARSON TESTS FOR INFINITE KULLBACK-LEIBLER INFORMATION
    JANSSEN, A
    [J]. ANNALS OF STATISTICS, 1986, 14 (03): 1068 - 1079
  • [39] Chained Kullback-Leibler Divergences
    Pavlichin, Dmitri S.
    Weissman, Tsachy
    [J]. 2016 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2016: 580 - 584
  • [40] Bootstrap estimate of Kullback-Leibler information for model selection
    Shibata, R
    [J]. STATISTICA SINICA, 1997, 7 (02) : 375 - 394