Making up the shortages of the Bayes classifier by the maximum mutual information classifier

Cited by: 0
Authors
Lu, Chenguang [1 ]
Zou, Xiaohui [2 ]
Wang, Wenfeng [3 ]
Chen, Xiaofeng [4 ]
Affiliations
[1] Liaoning Tech Univ, Inst Intelligence Engn & Math, Fuxin, Liaoning, Peoples R China
[2] Peking Univ, Sino Amer Searle Res Ctr, Teacher Teaching Dev Ctr, Beijing, Peoples R China
[3] Chinese Acad Sci, Inst Adv Mfg Technol, CNITECH, Ningbo, Peoples R China
[4] Western Washington Univ, Dept Decis Sci, Bellingham, WA 98225 USA
Source
JOURNAL OF ENGINEERING-JOE | 2020, Vol. 2020, Issue 13
Keywords
iterative methods; pattern classification; Bayes methods; optimisation; probability; learning (artificial intelligence); maximum likelihood estimation; maximum mutual information classifier; maximum posterior probability criterion; error rate criterion; maximum likelihood criterion; probability events; unseen instance classifications; optimised classifier; probability distribution; transition probability functions; Bayes classifier; MPP criterion; MMI criterion;
DOI
10.1049/joe.2019.1157
Chinese Library Classification (CLC): T [Industrial technology]
Discipline code: 08
Abstract
The Bayes classifier is often used because it is simple, and the maximum posterior probability (MPP) criterion it applies is equivalent to the least error rate criterion. However, it has issues in the following circumstances: (i) when information rather than correctness matters more, we should use the maximum likelihood criterion or maximum information criterion, which can reduce the rate of failing to report small-probability events; (ii) for unseen instance classifications, the previously optimised classifier cannot be used properly when the probability distribution of the true classes changes; (iii) when the classes' feature distributions, rather than the transition probability functions (TPFs), are stable, it is improper to train a parameterised TPF such as the logistic function; (iv) for multi-label classifications, it is difficult to optimise the group of parameterised TPFs that the Bayes classifier needs. This study addresses these issues by comparing the MPP criterion with the maximum likelihood criterion and the maximum mutual information (MMI) criterion. It suggests using the MMI criterion for most unseen instance classifications. It presents a new iterative algorithm, the channel matching (CM) algorithm, for MMI classification, and uses two examples to show that the CM algorithm is fast and reliable.
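To make the contrast between the MPP and MMI criteria concrete, the sketch below compares the two decision rules on a toy one-dimensional problem with a rare class. It is a minimal illustration, not the paper's CM algorithm: the prior, the two class-conditional distributions, and the 20 feature bins are all made-up assumptions, and the MMI rule is found by a brute-force threshold search rather than by the authors' iterative procedure, whose details are not reproduced here.

```python
# Minimal sketch (toy numbers, not the paper's CM algorithm): contrast the
# Bayes/MPP rule with a rule chosen to maximise I(Y; Yhat) between the true
# class Y and the decision Yhat, on a discretised 1-D feature X.
import numpy as np

def mutual_information(joint):
    """I(A;B) in bits for a joint probability table p(a, b)."""
    joint = joint / joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(joint > 0, joint / (pa * pb), 1.0)
    return float(np.sum(joint * np.log2(ratio)))

# Toy problem: rare class y1 (a "small-probability event") vs common class y0,
# observed through a feature x discretised into 20 bins.
bins = np.arange(20)
p_y = np.array([0.95, 0.05])                       # assumed prior P(y)
p_x_given_y = np.vstack([                          # assumed P(x | y): two shifted bumps
    np.exp(-0.5 * ((bins - 6) / 3.0) ** 2),
    np.exp(-0.5 * ((bins - 13) / 3.0) ** 2),
])
p_x_given_y /= p_x_given_y.sum(axis=1, keepdims=True)
p_xy = p_y[:, None] * p_x_given_y                  # joint P(y, x)

# (i) Bayes / MPP classifier: pick argmax_y P(y | x) in every bin.
mpp_decision = np.argmax(p_xy, axis=0)

def decision_joint(decision):
    """Joint table P(y, yhat) induced by a per-bin decision rule."""
    table = np.zeros((2, 2))
    for y in (0, 1):
        for x in bins:
            table[y, decision[x]] += p_xy[y, x]
    return table

# (ii) MMI classifier: among threshold rules "decide y1 iff x >= t",
# pick the threshold that maximises I(Y; Yhat) by exhaustive search.
best_t, best_mi = None, -1.0
for t in bins:
    decision = (bins >= t).astype(int)
    mi = mutual_information(decision_joint(decision))
    if mi > best_mi:
        best_t, best_mi = t, mi

print("MPP rule decides y1 on bins:", np.where(mpp_decision == 1)[0])
print("MMI threshold: %d, I(Y;Yhat) = %.4f bits" % (best_t, best_mi))
print("MPP rule I(Y;Yhat) = %.4f bits" % mutual_information(decision_joint(mpp_decision)))
```

With these toy numbers, the MMI threshold comes out lower than the MPP decision boundary, so the MMI rule reports the rare class more readily at the cost of some extra errors, which illustrates point (i) of the abstract.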
Pages: 659-663
Number of pages: 5