Adjusting mixture weights of Gaussian mixture model via regularized probabilistic latent semantic analysis

Cited by: 0
|
Authors
Si, L
Jin, R
Affiliations
[1] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
Source
ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PROCEEDINGS | 2005 / Vol. 3518
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Mixture models, such as the Gaussian mixture model (GMM), have been widely used in many applications for modeling data. GMM assumes that data points are generated from a set of Gaussian components with a single, shared set of mixture weights. A natural extension of GMM is the probabilistic latent semantic analysis (PLSA) model, which assigns a different set of mixture weights to each data point, making PLSA more flexible than GMM. As a tradeoff, however, PLSA usually suffers from overfitting. In this paper, we propose a regularized probabilistic latent semantic analysis (RPLSA) model, which properly adjusts the amount of model flexibility so that the training data are fit well while the model remains robust against overfitting. We conduct an empirical study on the task of speaker identification to show the effectiveness of the new model. The experimental results on the NIST speaker recognition dataset indicate that RPLSA substantially outperforms both the GMM and PLSA models. The principle behind RPLSA of appropriately adjusting model flexibility can be naturally extended to other applications and other types of mixture models.
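The structural difference the abstract describes can be sketched in a few lines. This is a hypothetical toy illustration, not the paper's actual model: it fixes two 1-D Gaussian components and contrasts a single global weight vector (as in GMM) with per-data-point weight vectors (as in PLSA), whose iterative per-point updates drive the weights toward hard 0/1 assignments, the overfitting tendency that RPLSA's regularization is meant to counteract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data drawn from two well-separated components (illustrative only).
data = np.concatenate([rng.normal(-2.0, 1.0, 50), rng.normal(3.0, 1.0, 50)])

means, stds = np.array([-2.0, 3.0]), np.array([1.0, 1.0])

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Component likelihoods: shape (n_points, n_components).
lik = gaussian_pdf(data[:, None], means[None, :], stds[None, :])

# GMM: one global mixture-weight vector shared by every data point.
gmm_weights = np.array([0.5, 0.5])
gmm_resp = lik * gmm_weights
gmm_resp /= gmm_resp.sum(axis=1, keepdims=True)

# PLSA-style: each data point carries its own weight vector, updated from
# its own responsibilities. This extra flexibility lets the weights of each
# point collapse toward 0/1 -- the overfitting behavior noted in the paper.
plsa_weights = np.full((len(data), 2), 0.5)
for _ in range(20):
    resp = lik * plsa_weights
    resp /= resp.sum(axis=1, keepdims=True)
    plsa_weights = resp  # per-point weight update

# A regularizer in the spirit of RPLSA would pull plsa_weights back toward
# a shared prior instead of letting each point fit its weights exactly.
```

After the loop, each row of `plsa_weights` is nearly a one-hot vector, whereas the GMM responsibilities still reflect the shared global weights.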
Pages: 622-631
Page count: 10
Related Papers
50 records total
  • [31] Probabilistic Trajectory Prediction with Gaussian Mixture Models
    Wiest, Juergen
    Hoeffken, Matthias
    Kressel, Ulrich
    Dietmayer, Klaus
    2012 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2012, : 141 - 146
  • [32] Boosting Gaussian Mixture Models via Discriminant Analysis
    Tang, Hao
    Huang, Thomas S.
    19TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOLS 1-6, 2008, : 2372 - 2375
  • [33] A Probabilistic Semantic Based Mixture Collaborative Filtering
    Weng, Linkai
    Zhang, Yaoxue
    Zhou, Yuezhi
    Yang, Laurance T.
    Tian, Pengwei
    Zhong, Ming
    UBIQUITOUS INTELLIGENCE AND COMPUTING, PROCEEDINGS, 2009, 5585 : 377 - +
  • [34] Mixture of Gaussian regressions model with logistic weights, a penalized maximum likelihood approach
    Montuelle, L.
    Le Pennec, E.
    ELECTRONIC JOURNAL OF STATISTICS, 2014, 8 : 1661 - 1695
  • [35] Gaussian Mixture Modeling with Gaussian Process Latent Variable Models
    Nickisch, Hannes
    Rasmussen, Carl Edward
    PATTERN RECOGNITION, 2010, 6376 : 272 - 282
  • [36] Probabilistic seismic demand analysis based on a Gaussian mixture model and limit state threshold randomness
    Jia D.
    Wu Z.
    He X.
    Zhendong yu Chongji/Journal of Vibration and Shock, 2022, 41 (20): : 225 - 234
  • [37] Latent Dirichlet mixture model
    Chien, Jen-Tzung
    Lee, Chao-Hsi
    Tan, Zheng-Hua
    NEUROCOMPUTING, 2018, 278 : 12 - 22
  • [38] Probabilistic latent semantic analysis
    Hofmann, T
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 1999, : 289 - 296
  • [39] An Improved Gaussian Mixture Model
    Gong Dayong
    Wang Zhihua
    INTERNATIONAL CONFERENCE ON GRAPHIC AND IMAGE PROCESSING (ICGIP 2012), 2013, 8768
  • [40] The infinite Gaussian mixture model
    Rasmussen, CE
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 12, 2000, 12 : 554 - 560