Adjusting mixture weights of Gaussian mixture model via regularized probabilistic latent semantic analysis

Cited by: 0
Authors: Si, L; Jin, R
Affiliations:
[1] Carnegie Mellon Univ, Sch Comp Sci, Pittsburgh, PA 15213 USA
[2] Michigan State Univ, Dept Comp Sci & Engn, E Lansing, MI 48824 USA
Keywords: none listed
DOI: not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Mixture models, such as the Gaussian mixture model, have been widely used to model data in many applications. The Gaussian mixture model (GMM) assumes that data points are generated from a set of Gaussian components sharing a single set of mixture weights. A natural extension of GMM is the probabilistic latent semantic analysis (PLSA) model, which assigns a separate set of mixture weights to each data point. PLSA is therefore more flexible than GMM but, as a tradeoff, it is prone to overfitting. In this paper, we propose a regularized probabilistic latent semantic analysis (RPLSA) model, which adjusts the amount of model flexibility so that the training data are fit well while the model remains robust against overfitting. We conduct an empirical study on speaker identification to show the effectiveness of the new model. The experimental results on the NIST speaker recognition dataset indicate that the RPLSA model substantially outperforms both the GMM and PLSA models. The RPLSA principle of appropriately adjusting model flexibility extends naturally to other applications and other types of mixture models.
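To make the modeling distinction in the abstract concrete, the Python sketch below contrasts GMM-style shared mixture weights with PLSA-style per-point mixture weights, and regularizes the latter by shrinking them toward the shared weights. This is only an illustrative sketch under assumptions of my own: the interpolation strength lam, the function names, and the 1-D Gaussian setting are hypothetical, and the simple interpolation stands in for, but is not, the regularizer proposed in the paper.

# Illustrative sketch only (not the paper's algorithm): EM-style updates for a
# mixture of 1-D Gaussians where each data point keeps its own mixture weights
# (PLSA-style), shrunk toward the shared GMM weights by a hypothetical
# regularization strength lam in [0, 1].
import numpy as np

def gaussian_pdf(x, mean, var):
    # Density of N(mean, var) evaluated elementwise on x.
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2 * np.pi * var)

def regularized_per_point_em(x, n_components=2, lam=0.5, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    means = rng.choice(x, n_components)                        # initialize means from the data
    vars_ = np.full(n_components, np.var(x))                   # shared initial variance
    global_w = np.full(n_components, 1.0 / n_components)       # GMM-style shared weights
    point_w = np.full((n, n_components), 1.0 / n_components)   # PLSA-style per-point weights

    for _ in range(n_iter):
        # E-step: responsibilities, using each point's own mixture weights.
        like = np.stack([gaussian_pdf(x, m, v) for m, v in zip(means, vars_)], axis=1)
        resp = point_w * like
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: component means, variances, and the shared weights.
        nk = resp.sum(axis=0)
        means = (resp * x[:, None]).sum(axis=0) / nk
        vars_ = np.maximum((resp * (x[:, None] - means) ** 2).sum(axis=0) / nk, 1e-6)
        global_w = nk / n

        # Regularized per-point weights: lam=0 gives the unregularized
        # PLSA-style update, lam=1 collapses back to shared GMM weights.
        # This interpolation is an assumption, not the paper's regularizer.
        point_w = (1 - lam) * resp + lam * global_w

    return means, vars_, global_w, point_w

# Usage on synthetic data drawn from two Gaussians.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 0.5, 200)])
    means, vars_, gw, pw = regularized_per_point_em(data)
    print("means:", means, "variances:", vars_, "shared weights:", gw)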
Pages: 622-631
Page count: 10
Related papers (showing 10 of 50)
  • [1] Zhang, Xueru; Demuynck, Kris; Van Hamme, Hugo. Latent Variable Speaker Adaptation of Gaussian Mixture Weights and Means. 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2012: 4349-4352.
  • [2] Zhang, Hao; Edwards, Richard; Parker, Lynne. Regularized Probabilistic Latent Semantic Analysis with Continuous Observations. 2012 11th International Conference on Machine Learning and Applications (ICMLA 2012), Vol. 1, 2012: 560-563.
  • [3] He, Xiaofei; Cai, Deng; Shao, Yuanlong; Bao, Hujun; Han, Jiawei. Laplacian Regularized Gaussian Mixture Model for Data Clustering. IEEE Transactions on Knowledge and Data Engineering, 2011, 23(9): 1406-1418.
  • [4] Gan, Haitao; Sang, Nong; Huang, Rui. Manifold Regularized Semi-Supervised Gaussian Mixture Model. Journal of the Optical Society of America A: Optics, Image Science, and Vision, 2015, 32(4): 566-575.
  • [5] Zhao, Yang; Shrivastava, Abhishek K.; Tsui, Kwok Leung. Regularized Gaussian Mixture Model for High-Dimensional Clustering. IEEE Transactions on Cybernetics, 2019, 49(10): 3677-3688.
  • [6] Qu, Han-Bing; Wang, Jia-Qiang; Li, Bin; Yue, Feng; Jin, Wei. Probabilistic Point Set Matching with Gaussian Mixture Model. Proceedings of the 2014 International Joint Conference on Neural Networks (IJCNN), 2014: 2100-2107.
  • [7] Zhang, Jiayuan; Zhu, Ziqi; Zou, Jixin. Supervised Gaussian Process Latent Variable Model Based on Gaussian Mixture Model. 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), 2017: 124-129.
  • [8] Wu, Wei; Gao, Guanglai; Nie, Jianyun. Gaussian Mixture Model with Semantic Distance for Image Classification. 26th Chinese Control and Decision Conference (2014 CCDC), 2014: 1687-1691.
  • [9] Luo, HZ; Fan, JP; Xiao, J; Zhu, XQ. Semantic Principal Video Shot Classification via Mixture Gaussian. 2003 International Conference on Multimedia and Expo, Vol. II, Proceedings, 2003: 189-192.
  • [10] Wan, Yuchai; Liu, Xiabi; Tang, Yuyang. Simplifying Gaussian Mixture Model via Model Similarity. 2016 23rd International Conference on Pattern Recognition (ICPR), 2016: 3180-3185.