Empirical Estimation of Information Measures: A Literature Guide

Cited by: 33
Authors
Verdu, Sergio
Affiliation
[1] Princeton, NJ 08540
Funding
U.S. National Science Foundation;
Keywords
information measures; empirical estimators; entropy; relative entropy; mutual information; universal estimation; DIVERGENCE ESTIMATION; ENTROPY ESTIMATION; NONPARAMETRIC-ESTIMATION; CONVERGENCE PROPERTIES; MUTUAL INFORMATION; RELATIVE ENTROPY; FUNCTIONALS; DISTRIBUTIONS; CLASSIFICATION; DENSITIES;
DOI
10.3390/e21080720
Chinese Library Classification (CLC)
O4 [Physics];
Discipline code
0702;
Abstract
We give a brief survey of the literature on the empirical estimation of entropy, differential entropy, relative entropy, mutual information and related information measures. While those quantities are of central importance in information theory, universal algorithms for their estimation are increasingly important in data science, machine learning, biology, neuroscience, economics, language, and other experimental sciences.
Pages: 16
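As a concrete illustration of the kind of estimator this literature surveys (not code from the paper itself), the sketch below computes the plug-in (empirical-frequency) estimate of the entropy of a discrete sample, together with the classical Miller-Madow bias correction; the function names and the toy data are illustrative assumptions.

```python
import numpy as np

def plugin_entropy(samples, base=2.0):
    """Plug-in estimate: entropy of the empirical distribution of the sample."""
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / np.log(base)

def miller_madow_entropy(samples, base=2.0):
    """Plug-in estimate plus the Miller-Madow correction (K - 1) / (2n),
    where K is the number of distinct observed symbols and n the sample size;
    the correction is converted from nats to the chosen base."""
    samples = np.asarray(samples)
    n = samples.size
    k = np.unique(samples).size
    return plugin_entropy(samples, base) + (k - 1) / (2 * n * np.log(base))

# Toy usage: estimate the entropy of a biased coin from 1000 draws.
rng = np.random.default_rng(0)
draws = rng.choice([0, 1], size=1000, p=[0.3, 0.7])
print(plugin_entropy(draws), miller_madow_entropy(draws))
```

The plug-in estimator is known to be biased downward for finite samples; corrections such as Miller-Madow, and more refined universal estimators, are among the approaches catalogued in the survey.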