A new histogram-based estimation technique of entropy and mutual information using mean squared error minimization

Cited by: 18
Authors
Hacine-Gharbi, A. [1 ,4 ]
Deriche, M. [2 ]
Ravier, P. [1 ]
Harba, R. [1 ]
Mohamadi, T. [3 ]
Affiliations
[1] Univ Orleans, F-45067 Orleans, France
[2] King Fahd Univ Petr & Minerals, Dhahran, Saudi Arabia
[3] Setif Univ, Setif, Algeria
[4] Bordj Bou Arreridj Univ, Bordj Bou Arreridj, Algeria
Keywords
Feature selection; Classification; Recognition
DOI
10.1016/j.compeleceng.2013.02.010
CLC Classification Number
TP3 [Computing Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Mutual Information (MI) has been used extensively as a measure of similarity or dependence between random variables (or parameters) in various signal and image processing applications. However, MI estimation techniques are known to exhibit large bias and high Mean Squared Error (MSE), and can be computationally very costly. To overcome these drawbacks, we propose a novel, fast, low-MSE histogram-based technique for estimating entropy and mutual information. By minimizing the MSE, the estimator avoids the error-accumulation problem of traditional methods. We derive an expression for the optimal number of bins for estimating MI between both continuous and discrete random variables. Experimental results on a speech recognition problem and a computer-aided diagnosis problem show the power of the proposed approach in estimating the optimal number of selected features, with improved classification results compared to existing approaches. (C) 2013 Elsevier Ltd. All rights reserved.
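For readers who want to experiment with the idea summarized in the abstract, below is a minimal Python sketch of a plug-in histogram estimator of MI. It is illustrative only: the cube-root bin rule is a generic stand-in, not the MSE-optimal bin-number expression the paper derives (which is not reproduced in this record), and all variable names are our own.

```python
import numpy as np

def histogram_mi(x, y, bins):
    """Plug-in histogram estimate of the mutual information I(X;Y), in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X (column vector)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y (row vector)
    nz = pxy > 0                          # skip empty cells: 0 * log(0) := 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))

# Demo on a correlated Gaussian pair. The cube-root bin rule below is a
# generic placeholder, NOT the MSE-optimal bin count derived in the paper.
rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)
bins = int(round(n ** (1 / 3)))           # ~17 bins per axis (assumption)
print(f"estimated I(X;Y) = {histogram_mi(x, y, bins):.3f} nats")
```

For this synthetic pair the squared correlation is rho^2 = 0.8, so the true MI is -0.5 ln(1 - rho^2) ≈ 0.80 nats, which the plug-in estimate should approach as n grows; the paper's contribution is choosing the bin count so that the MSE of such estimates is minimized.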
Pages: 918-933
Number of pages: 16