Minimum Mean Squared Error Estimation and Mutual Information Gain

Cited: 0
Author(s)
Gibson, Jerry [1]
Affiliation
[1] Univ Calif Santa Barbara, Dept Elect & Comp Engn, Santa Barbara, CA 93106 USA
Keywords
mutual information gain; entropy power; minimum mean squared error estimation
DOI
10.3390/info15080497
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Information theoretic quantities such as entropy, entropy rate, information gain, and relative entropy are often used to understand the performance of intelligent agents in learning applications. Mean squared error has not played a role in these analyses, primarily because it is not considered a viable performance indicator in these scenarios. We build on a new quantity, the log ratio of entropy powers, to establish that minimum mean squared error (MMSE) estimation, prediction, and smoothing are directly connected to mutual information gain or loss in an agent learning system modeled by a Markov chain, for many probability distributions of interest. Expressions for mutual information gain or loss are developed for MMSE estimation, prediction, and smoothing, and an example for fixed-lag smoothing is presented.
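The abstract's central connection can be illustrated in the jointly Gaussian case, where the entropy power of a random variable equals its variance and the MMSE of estimating X from an observation Y is var(X)(1 - rho^2). The half-log ratio of the prior variance to the MMSE then recovers the standard Gaussian mutual information I(X; Y) = -(1/2) log(1 - rho^2). This is a minimal sketch of that textbook Gaussian identity, not code from the paper; the variance and correlation values are hypothetical.

```python
import numpy as np

# Hypothetical Gaussian example: X ~ N(0, var_x), Y correlated with X.
var_x = 4.0   # prior variance of X (entropy power of a Gaussian = its variance)
rho = 0.8     # correlation coefficient between X and Y

# MMSE of estimating X from Y in the jointly Gaussian case.
mmse = var_x * (1.0 - rho**2)

# Mutual information from the log ratio of entropy powers (variance / MMSE)...
info_from_mmse = 0.5 * np.log(var_x / mmse)

# ...agrees with the standard Gaussian mutual information formula.
info_direct = -0.5 * np.log(1.0 - rho**2)

print(info_from_mmse, info_direct)
```

For the Gaussian case the two expressions coincide exactly, since var_x / mmse = 1 / (1 - rho^2); the paper's contribution, per the abstract, is extending this style of connection beyond the Gaussian setting via entropy powers.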
Pages: 14