Discretisation for inference on normal mixture models

Cited by: 17
Authors
Brewer, MJ [1]
Affiliation
[1] Macaulay Inst, Aberdeen AB15 8QH, Scotland
Keywords
normal mixture models; discretisation of continuous parameters; Bayesian inference
DOI
10.1023/A:1024214615828
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline code
081202
Abstract
The problem of inference in Bayesian Normal mixture models is known to be difficult. In particular, direct Bayesian inference (via quadrature) suffers from a combinatorial explosion in having to consider every possible partition of n observations into k mixture components, resulting in a computation time which is O(k^n). This paper explores the use of discretised parameters and shows that for equal-variance mixture models, direct computation time can be reduced to O(D^k n^k), where relevant continuous parameters are each divided into D regions. As a consequence, direct inference is now possible on genuine data sets for small k, where the quality of approximation is determined by the level of discretisation. For large problems, where the computational complexity is still too great in O(D^k n^k) time, discretisation can provide a convergence diagnostic for a Markov chain Monte Carlo analysis.
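The discretisation idea can be illustrated with a minimal sketch (this is not the paper's exact algorithm, and all specifics below — the grid range, fixed mixing weight, known unit variance, and simulated data — are illustrative assumptions): for a two-component equal-variance mixture with the component means discretised onto a grid of D values each, the posterior can be evaluated directly on the D×D grid, marginalising the component label of each observation rather than enumerating all k^n partitions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: two-component equal-variance normal mixture (assumption)
x = np.concatenate([rng.normal(-2.0, 1.0, 40), rng.normal(2.0, 1.0, 60)])

D = 50                            # grid points per parameter (level of discretisation)
mu_grid = np.linspace(-5.0, 5.0, D)
w = 0.5                           # fixed, known mixing weight (assumption)
sigma = 1.0                       # known common standard deviation (equal-variance case)

def norm_pdf(x, mu, sigma):
    """Normal density, vectorised over x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Unnormalised log-posterior over the D x D grid, under a flat prior on (mu1, mu2).
# Each grid point costs O(n*k): the component labels are summed out per observation.
logpost = np.empty((D, D))
for i, m1 in enumerate(mu_grid):
    for j, m2 in enumerate(mu_grid):
        lik = w * norm_pdf(x, m1, sigma) + (1.0 - w) * norm_pdf(x, m2, sigma)
        logpost[i, j] = np.log(lik).sum()

# Normalise on the grid (subtract the max first for numerical stability)
post = np.exp(logpost - logpost.max())
post /= post.sum()

# Marginal posterior for mu1: sum the grid over the mu2 axis
marg_mu1 = post.sum(axis=1)
```

The quality of the approximation is governed by D, matching the abstract's point: a finer grid gives a better approximation at a higher (but still direct, quadrature-free) computational cost.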
Pages: 209-219
Page count: 11
Related papers
50 records in total
  • [11] Distributed Inference for Dirichlet Process Mixture Models
    Ge, Hong
    Chen, Yutian
    Wan, Moquan
    Ghahramani, Zoubin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 2276 - 2284
  • [12] Efficient Bayesian inference for dynamic mixture models
    Gerlach, R
    Carter, C
    Kohn, R
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2000, 95 (451) : 819 - 828
  • [13] Inference for multivariate normal hierarchical models
    Everson, PJ
    Morris, CN
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2000, 62 : 399 - 412
  • [14] Extreme Stochastic Variational Inference: Distributed Inference for Large Scale Mixture Models
    Zhang, Jiong
    Raman, Parameswaran
    Ji, Shihao
    Yu, Hsiang-Fu
    Vishwanathan, S. V. N.
    Dhillon, Inderjit S.
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 935 - 943
  • [15] Root selection in normal mixture models
    Seo, Byungtae
    Kim, Daeyoung
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2012, 56 (08) : 2454 - 2470
  • [16] Statistical inference for mixture GARCH models with financial application
    Cavicchioli, Maddalena
    COMPUTATIONAL STATISTICS, 2021, 36 (04) : 2615 - 2642
  • [17] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454
  • [18] Adaptative Inference Cost With Convolutional Neural Mixture Models
    Ruiz, Adria
    Verbeek, Jakob
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 1872 - 1881
  • [19] Inference in mixed models with a mixture of distributions and controlled heteroscedasticity
    Ferreira, Dario
    Ferreira, Sandra S.
    Antunes, Patricia
    Oliveira, Teresa A.
    Mexia, Joao T.
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2024,
  • [20] Statistical inference on group Rasch mixture network models
    Long, Yuhang
    Huang, Tao
    STAT, 2022, 11 (01):