Improved binary classification performance using an information theoretic criterion

Cited: 0
Authors
Burrascano, P [1]
Pirollo, D [1]
Affiliation
[1] UNIV ROMA LA SAPIENZA, DIPARTIMENTO INFORMAT, I-00184 ROME, ITALY
Keywords
feedforward neural networks; classification; Kullback-Leibler distance
DOI
10.1016/0925-2312(96)00025-2
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Feedforward neural networks trained to solve classification problems define an approximation of the conditional probabilities P(C_i|x) when the output units correspond to the categories C_i. The present paper shows that if a least-mean-squared-error cost function is minimised during the training phase, the resulting approximation of the P(C_i|x) is poor in the ranges of the input variable x where the conditional probabilities take on very low values. The use of the Kullback-Leibler distance measure is proposed to overcome this limitation: a cost function derived from this information-theoretic measure is defined, and a computationally light training procedure is derived for binary classification problems. The effectiveness of the proposed procedure is verified by means of comparative experiments.
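The abstract's central claim, that a squared-error cost under-penalises output errors where P(C_i|x) is very small while a Kullback-Leibler-derived (cross-entropy) cost penalises them strongly, can be illustrated numerically. The following is a minimal sketch of the idea only, not the authors' exact cost function or training procedure; all function names are ours.

```python
import numpy as np

# Illustrative sketch (not the paper's exact procedure): compare the
# least-mean-squares cost with a Kullback-Leibler / cross-entropy cost
# for a network output y approximating P(C_1 | x) in a binary problem.

def lms_cost(y, t):
    # Mean squared error between network output y and target probability t.
    return np.mean((y - t) ** 2)

def kl_cost(y, t, eps=1e-12):
    # Binary cross-entropy; minimising it minimises the KL distance from
    # the target distribution {t, 1 - t} to the output {y, 1 - y}.
    y = np.clip(y, eps, 1.0 - eps)
    return -np.mean(t * np.log(y) + (1 - t) * np.log(1 - y))

# Where the true conditional probability is very low, a modest absolute
# error costs almost nothing under LMS but is penalised far more heavily
# by the KL-derived cost.
t = np.array([0.001])   # target probability close to zero
y = np.array([0.05])    # network output that overestimates it
print("LMS cost:", lms_cost(y, t))
print("KL  cost:", kl_cost(y, t))
```

The KL-derived cost is also properly minimised when the output matches the target probability exactly, which is the sense in which it yields a better approximation of the conditional probabilities in low-probability regions.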
Pages: 375-383 (9 pages)
Related Papers (50 listed)
  • [11] Sy, BK. Probability model selection using information-theoretic optimization criterion. Journal of Statistical Computation and Simulation, 2001, 69(03): 203-224
  • [12] Ramirez, DA; Hayat, MM; Torres, SN; Saleh, BEA; Teich, MC. Information-theoretic criterion for the performance of single-photon avalanche photodiodes. IEEE Photonics Technology Letters, 2005, 17(10): 2164-2166
  • [13] Huang, Lei; Zhang, Q. T.; Cheng, L. L. Information theoretic criterion for stopping turbo iteration. IEEE Transactions on Signal Processing, 2011, 59(02): 848-853
  • [14] Phogat, Manu; Kumar, Dharmender. Classification of complex diseases using an improved binary cuckoo search and conditional mutual information maximization. Computacion y Sistemas, 2020, 24(03): 1121-1129
  • [15] Fontes, Aluisio I. R.; Pasa, Leandro A.; de Sousa, Vicente A., Jr.; Costa, Jose A. F.; Silveira, Luiz F. Q.; Abinader, Fuad M., Jr. Automatic modulation classification using information theoretic similarity measures. 2012 IEEE Vehicular Technology Conference (VTC Fall), 2012
  • [16] Bhandary, M. Detection of the numbers of outliers present in a data set using an information theoretic criterion. Communications in Statistics - Theory and Methods, 1992, 21(11): 3263-3274
  • [17] Villmann, Thomas; Hammer, Barbara; Schleif, Frank-Michael; Hermann, Wieland; Cottrell, Marie. Fuzzy classification using information theoretic learning vector quantization. Neurocomputing, 2008, 71(16-18): 3070-3076
  • [18] Aygun, E.; Oommen, B. J.; Cataltepe, Z. Peptide classification using optimal and information theoretic syntactic modeling. Pattern Recognition, 2010, 43(11): 3891-3899
  • [19] Erdogmus, D; Rende, D; Principe, JC; Wong, TF. Nonlinear channel equalization using multilayer perceptrons with information-theoretic criterion. Neural Networks for Signal Processing XI, 2001: 443-451
  • [20] Lai, CA; Erdogmus, D; Principe, JC. Echo cancellation by global optimization of Kautz filters using an information theoretic criterion. 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, Vol VI, Proceedings: Signal Processing Theory and Methods, 2003: 197-200