Adjusting mixture weights of Gaussian mixture model via regularized probabilistic latent semantic analysis

Cited: 0
Authors
Si, L
Jin, R
Affiliations
[1] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
Source
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PROCEEDINGS | 2005 / Vol. 3518
Keywords
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Mixture models, such as the Gaussian mixture model, have been widely used for modeling data in many applications. The Gaussian mixture model (GMM) assumes that data points are generated from a set of Gaussian components with a single, shared set of mixture weights. A natural extension of the GMM is the probabilistic latent semantic analysis (PLSA) model, which assigns a different set of mixture weights to each data point. PLSA is therefore more flexible than the GMM, but as a tradeoff it usually suffers from overfitting. In this paper, we propose a regularized probabilistic latent semantic analysis model (RPLSA), which properly adjusts the amount of model flexibility so that it both fits the training data well and remains robust against overfitting. We conduct an empirical study on the task of speaker identification to show the effectiveness of the new model. The experimental results on the NIST speaker recognition dataset indicate that the RPLSA model outperforms both the GMM and PLSA models substantially. The RPLSA principle of appropriately adjusting model flexibility extends naturally to other applications and other types of mixture models.
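As a minimal illustration of the shared-weight assumption that the abstract contrasts with PLSA, the sketch below fits a 1-D Gaussian mixture by EM in NumPy. This is not the paper's RPLSA method; the function name, quantile-based initialization, and data are illustrative assumptions. Note in the E-step that the responsibilities `r` are computed per data point, whereas the weights `w` are pooled over all points in the M-step; a PLSA-style model would instead retain a separate weight vector per data point, which is the extra flexibility RPLSA regularizes.

```python
import numpy as np

def fit_gmm_em(x, k=2, iters=50):
    """Fit a 1-D Gaussian mixture with EM, using shared mixture weights
    (the standard GMM assumption described in the abstract)."""
    n = len(x)
    # Illustrative initialization: spread means over quantiles of the data.
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)  # one weight vector shared by ALL data points
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i),
        # computed in log space for numerical stability.
        d = (x[:, None] - mu[None, :]) ** 2
        logp = -0.5 * (np.log(2 * np.pi * var) + d / var) + np.log(w)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: pool responsibilities over all points into shared weights.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-8
    return w, mu, var

# Synthetic two-cluster data; the fitted means should recover -3 and 3.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
w, mu, var = fit_gmm_em(x)
```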
Pages: 622 - 631
Page count: 10
Related Papers
50 items total
  • [41] A Hierarchical Latent Mixture Model for Polyphonic Music Analysis
    O'Brien, Cian
    Plumbley, Mark D.
    2018 26TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO), 2018, : 1910 - 1914
  • [42] Regularized Gaussian Mixture Model based discretization for gene expression data association mining
    Cai, Ruichu
    Hao, Zhifeng
    Wen, Wen
    Wang, Lijuan
    APPLIED INTELLIGENCE, 2013, 39 (03) : 607 - 613
  • [43] IMAGE RESTORATION VIA EFFICIENT GAUSSIAN MIXTURE MODEL LEARNING
    Feng, Jianzhou
    Song, Li
    Huo, Xiaoming
    Yang, Xiaokang
    Zhang, Wenjun
    2013 20TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP 2013), 2013, : 1056 - 1060
  • [44] A gradient entropy regularized likelihood learning algorithm on Gaussian mixture with automatic model selection
    Lu, Zhiwu
    Ma, Jinwen
    ADVANCES IN NEURAL NETWORKS - ISNN 2006, PT 1, 2006, 3971 : 464 - 469
  • [45] Regularized Gaussian Mixture Model based discretization for gene expression data association mining
    Cai, Ruichu
    Hao, Zhifeng
    Wen, Wen
    Wang, Lijuan
    APPLIED INTELLIGENCE, 2013, 39 : 607 - 613
  • [46] An iterative algorithm for entropy regularized likelihood learning on Gaussian mixture with automatic model selection
    Lu, Zhiwu
    NEUROCOMPUTING, 2006, 69 (13-15) : 1674 - 1677
  • [47] Hierarchical Gaussian Mixture Model for Image Annotation via PLSA
    Wang, Zhiyong
    Yi, Huaibin
    Wang, Jiajun
    Feng, Dagan
    PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON IMAGE AND GRAPHICS (ICIG 2009), 2009, : 384 - 389
  • [48] Learning mixture models with the regularized latent maximum entropy principle
    Wang, SJ
    Schuurmans, D
    Peng, FC
    Zhao, YX
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2004, 15 (04): : 903 - 916
  • [49] AUGMENTED LATENT DIRICHLET ALLOCATION (LDA) TOPIC MODEL WITH GAUSSIAN MIXTURE TOPICS
    Prabhudesai, Kedar S.
    Mainsah, Boyla O.
    Collins, Leslie M.
    Throckmorton, Chandra S.
    2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 2451 - 2455
  • [50] Process monitoring using a Gaussian mixture model via principal component analysis and discriminant analysis
    Choi, SW
    Park, JH
    Lee, IB
    COMPUTERS & CHEMICAL ENGINEERING, 2004, 28 (08) : 1377 - 1387