Mixtures of probabilistic principal component analyzers

Cited by: 1185
Authors
Tipping, ME [1]
Bishop, CM [1]
Affiliations
[1] Microsoft Res, Cambridge CB2 3NH, England
DOI
10.1162/089976699300016728
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a probability density, and so there is no unique way to combine PCA models. Therefore, previous attempts to formulate mixture models for PCA have been ad hoc to some extent. In this article, PCA is formulated within a maximum likelihood framework, based on a specific form of Gaussian latent variable model. This leads to a well-defined mixture model for probabilistic principal component analyzers, whose parameters can be determined using an expectation-maximization algorithm. We discuss the advantages of this model in the context of clustering, density modeling, and local dimensionality reduction, and we demonstrate its application to image compression and handwritten digit recognition.
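The abstract's maximum-likelihood formulation of PCA admits a closed-form solution: the noise variance is the average of the discarded sample-covariance eigenvalues, and the loading matrix is built from the leading eigenvectors scaled by the eigenvalues minus that noise floor. A minimal NumPy sketch of this closed-form fit (for a single analyzer, not the full EM-trained mixture; function name and interface are illustrative, not from the paper):

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood probabilistic PCA.

    X : (n, d) data matrix; q : latent dimensionality (q < d).
    Returns loadings W (d, q), mean mu (d,), and noise variance sigma2.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # sample covariance (d, d)
    evals, evecs = np.linalg.eigh(S)              # eigenvalues in ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]    # reorder to descending
    sigma2 = evals[q:].mean()                     # average of discarded eigenvalues
    # W = U_q (Lambda_q - sigma^2 I)^{1/2}; the largest q eigenvalues
    # are each at least the mean of the smaller ones, so the sqrt is real.
    W = evecs[:, :q] * np.sqrt(evals[:q] - sigma2)
    return W, mu, sigma2
```

Under the model, the data covariance is `W @ W.T + sigma2 * I`; the mixture version of the paper alternates soft assignments of points to analyzers with per-component updates of this form inside EM.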
Pages: 443 - 482
Page count: 40
Related papers
50 records total
  • [1] Mixtures of robust probabilistic principal component analyzers
    Archambeau, Cedric
    Delannay, Nicolas
    Verleysen, Michel
    NEUROCOMPUTING, 2008, 71 (7-9) : 1274 - 1282
  • [2] Mixtures of principal component analyzers
    Tipping, ME
    Bishop, CM
    FIFTH INTERNATIONAL CONFERENCE ON ARTIFICIAL NEURAL NETWORKS, 1997, (440): : 13 - 18
  • [3] Probabilistic Distance for Mixtures of Independent Component Analyzers
    Safont, Gonzalo
    Salazar, Addisson
    Vergara, Luis
    Gomez, Enriqueta
    Villanueva, Vicente
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (04) : 1161 - 1173
  • [4] DIRECT IMPORTANCE ESTIMATION WITH PROBABILISTIC PRINCIPAL COMPONENT ANALYZERS
    Yamada, Makoto
    Sugiyama, Masashi
    Wichern, Gordon
    2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 1962 - 1965
  • [5] Texture segmentation using the mixtures of principal component analyzers
    Musa, MEM
    Duin, RPW
    de Ridder, D
    Atalay, V
    COMPUTER AND INFORMATION SCIENCES - ISCIS 2003, 2003, 2869 : 505 - 512
  • [6] Almost autonomous training of mixtures of principal component analyzers
    Musa, MEM
    de Ridder, D
    Duin, RPW
    Atalay, V
    PATTERN RECOGNITION LETTERS, 2004, 25 (09) : 1085 - 1095
  • [7] A low-cost variational-Bayes technique for merging mixtures of probabilistic principal component analyzers
    Bruneau, Pierrick
    Gelgon, Marc
    Picarougne, Fabien
    INFORMATION FUSION, 2013, 14 (03) : 268 - 280
  • [8] Direct Importance Estimation with a Mixture of Probabilistic Principal Component Analyzers
    Yamada, Makoto
    Sugiyama, Masashi
    Wichern, Gordon
    Simm, Jaak
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2010, E93D (10) : 2846 - 2849
  • [9] Mixture of Probabilistic Principal Component Analyzers for Shapes from Point Sets
    Gooya, Ali
    Lekadir, Karim
    Castro-Mateos, Isaac
    Pozo, Jose Maria
    Frangi, Alejandro F.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2018, 40 (04) : 891 - 904
  • [10] Coordinating Principal Component Analyzers
    Verbeek, JJ
    Vlassis, N
    Kröse, B
    ARTIFICIAL NEURAL NETWORKS - ICANN 2002, 2002, 2415 : 914 - 919