Some Results of Extropy-Based Measure of Divergence for Record Statistics

Cited by: 0
Authors
Goel, Ritu [1 ]
Sharma, Salook [2 ]
Kumar, Vikas [2 ]
Affiliations
[1] NFSU, Dept Cyber Secur & Digital Forens, Delhi, India
[2] Maharshi Dayanand Univ, Dept Appl Sci, UIET, Rohtak, India
Source
Keywords
K-L Information Measure; Extropy Measure; Divergence-Extropy Measure; Order Statistics; Entropy Properties; Probability
DOI
10.22034/jirss.2024.2026495.1057
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
It is interesting that a complementary dual of the Shannon entropy measure exists and shares several of its properties. This measure of uncertainty was introduced by Lad et al. (2015) and is known as extropy. Although there are some mathematical analogies between the two measures, extropy typically has different uses and interpretations than entropy. Given the importance of the extropy measure and its various generalizations, in the present communication we consider and study a Kullback-Leibler based "divergence-extropy" measure between record values, and several properties of the proposed "divergence-extropy" measure have been studied. Further, some specific lifetime distributions used in life testing, the physical sciences, survival analysis, and reliability engineering are examined using the proposed "divergence-extropy" measure. Finally, we study the proposed "divergence-extropy" measure between the distributions of k-record values and order statistics.
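As a point of orientation for the abstract, the extropy of a continuous density f introduced by Lad et al. (2015) is J(f) = -(1/2) ∫ f(x)² dx. The sketch below is not taken from the paper; it is a minimal numerical illustration of that definition (the function name `extropy`, the midpoint-rule quadrature, and the truncation of the integration range are assumptions for illustration), checked against the closed form J = -λ/4 for an exponential density with rate λ.

```python
import math

def extropy(pdf, lower, upper, n=200_000):
    # Extropy J(f) = -(1/2) * integral of f(x)^2 over the support
    # (definition from Lad et al., 2015). Midpoint-rule quadrature;
    # assumes the density is effectively supported on [lower, upper].
    h = (upper - lower) / n
    s = sum(pdf(lower + (i + 0.5) * h) ** 2 for i in range(n))
    return -0.5 * h * s

# Exponential density with rate lam; the closed form is J = -lam/4.
lam = 2.0
exp_pdf = lambda x: lam * math.exp(-lam * x)
print(extropy(exp_pdf, 0.0, 20.0))  # close to -lam/4 = -0.5
```

The Kullback-Leibler based "divergence-extropy" measure studied in the paper builds a discrepancy measure between two such densities (e.g., those of record values and order statistics); its exact form is given in the paper itself.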
Pages: 83-98
Page count: 16