Bayesian classification using an entropy prior on mixture models

Cited by: 0
Authors
Center, JL [1]
Institution
[1] Creat Res Corp, Andover, MA 01810 USA
Keywords
DOI
None available
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In many classification problems, it is reasonable to base the analysis on a mixture model. A mixture model assumes that each sample is produced by first randomly selecting from a finite collection of data clusters and then using the chosen cluster's distribution to produce the class label and feature vector of the sample. If we know the model parameters, then when we observe a feature vector, we can predict the classification. When we do not know the parameters exactly, we must infer them from a training set of data samples. Taking the Bayesian approach, we want to determine the probability distribution for the parameters given the training data. Then, when it comes time to predict the class label given a feature vector, we integrate over the model parameter distribution. We argue that a good, objective choice for the prior distribution on the model parameters is one based on the entropy of each mixture model. We show that this prior regularizes the model fit so that over-fitting the training data has no adverse effects.
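The generative and predictive steps the abstract describes can be sketched as follows. This is a minimal illustration with known parameters, not the paper's method: the two-cluster Gaussian mixture, its weights, and the per-cluster class-label tables are all assumed values, and a full Bayesian treatment would additionally integrate these predictions over a posterior on the parameters rather than fixing them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-cluster mixture (all numbers are assumptions):
weights = np.array([0.6, 0.4])            # P(cluster k)
class_probs = np.array([[0.9, 0.1],       # P(class | cluster 0)
                        [0.2, 0.8]])      # P(class | cluster 1)
means = np.array([-2.0, 2.0])             # feature mean per cluster
sigma = 1.0                               # shared feature std. dev.

def sample():
    """Generative process: pick a cluster, then draw its label and feature."""
    k = rng.choice(2, p=weights)
    y = rng.choice(2, p=class_probs[k])
    x = rng.normal(means[k], sigma)
    return x, y

def predict_class(x):
    """P(class | x) with known parameters, via Bayes' rule over clusters."""
    dens = np.exp(-0.5 * ((x - means) / sigma) ** 2)  # Gaussian density, up to a constant
    joint = weights * dens                            # proportional to P(cluster, x)
    post = joint / joint.sum()                        # P(cluster | x)
    return post @ class_probs                         # sum_k P(cluster k | x) P(class | k)

p = predict_class(-2.0)   # feature near cluster 0 -> class 0 is most probable
```

A feature near a cluster's mean drives the cluster posterior toward that cluster, so the predicted class distribution approaches that cluster's label distribution.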
Pages: 42-70
Page count: 29