Asymptotic Normality for Plug-In Estimators of Generalized Shannon's Entropy

Cited: 3
Authors
Zhang, Jialin [1 ]
Shi, Jingyi [1 ]
Affiliations
[1] Mississippi State Univ, Dept Math & Stat, Mississippi State, MS 39762 USA
Keywords
Shannon's entropy; generalized Shannon's entropy; plug-in estimation; asymptotic normality; INFORMATION; LAW;
DOI
10.3390/e24050683
CLC number
O4 [Physics];
Subject classification code
0702
Abstract
Shannon's entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon's entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill this void in the foundation of information theory, Zhang (2020) proposed generalized Shannon's entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML software packages, is one of the most popular approaches to estimating Shannon's entropy. The asymptotic distribution of the plug-in estimator of Shannon's entropy has been well studied in the existing literature. This paper studies the asymptotic properties of the plug-in estimator of generalized Shannon's entropy on countable alphabets. The developed asymptotic properties require no assumptions on the underlying distribution. The proposed asymptotic properties allow for interval estimation and statistical tests with generalized Shannon's entropy.
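To make the abstract's setting concrete: for ordinary (not generalized) Shannon's entropy, the plug-in estimator is simply the entropy of the empirical distribution, and the classical central limit theorem for it, sqrt(n)(H_hat - H) -> N(0, sigma^2) with sigma^2 = Var(-log p(X)), yields Wald-type confidence intervals. A minimal sketch of this classical case (function names are illustrative; the paper's contribution concerns the generalized entropy, which is not implemented here):

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum-likelihood) estimator of Shannon's entropy in nats:
    the entropy of the empirical distribution p_hat_k = counts_k / n."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def entropy_ci(counts, alpha_z=1.959963984540054):
    """Asymptotic 95% confidence interval for H based on the classical CLT:
    sqrt(n) * (H_hat - H) -> N(0, sigma^2),
    sigma^2 = Var(-log p(X)) = sum_k p_k (log p_k)^2 - H^2,
    estimated by plugging in the empirical distribution."""
    n = counts.sum()
    p = counts[counts > 0] / n
    h = -np.sum(p * np.log(p))
    var = np.sum(p * np.log(p) ** 2) - h ** 2  # plug-in variance estimate
    half = alpha_z * np.sqrt(var / n)
    return h - half, h + half

counts = np.array([50, 30, 20])  # hypothetical observed counts on 3 symbols
h = plugin_entropy(counts)
lo, hi = entropy_ci(counts)
```

Note that the interval degenerates when the empirical distribution is concentrated on a single symbol (the variance estimate is zero); the point of the paper is that, for generalized Shannon's entropy, analogous normality holds with no assumptions on the distribution at all.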
Pages: 10