Variational Bayesian multinomial probit regression with gaussian process priors

Cited: 141
Authors
Girolami, Mark [1 ]
Rogers, Simon [1 ]
Affiliations
[1] Univ Glasgow, Dept Comp Sci, Glasgow G12 8QQ, Lanark, Scotland
DOI
10.1162/neco.2006.18.8.1790
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
It is well known in the statistics literature that augmenting binary and polychotomous response models with gaussian latent variables enables exact Bayesian analysis via Gibbs sampling from the parameter posterior. By adopting such a data augmentation strategy, dispensing with priors over regression coefficients in favor of gaussian process (GP) priors over functions, and employing variational approximations to the full posterior, we obtain efficient computational methods for GP classification in the multiclass setting. The model augmentation with additional latent variables ensures full a posteriori class coupling while retaining the simple a priori independent GP covariance structure, from which sparse approximations, such as multiclass informative vector machines (IVM), emerge in a natural and straightforward manner. This is the first time that a fully variational Bayesian treatment for multiclass GP classification has been developed without having to resort to additional explicit approximations to the nongaussian likelihood term. Empirical comparisons with exact analysis using Markov chain Monte Carlo (MCMC) and with Laplace approximations illustrate the utility of the variational approximation as a computationally economical alternative to full MCMC, and it is shown to be more accurate than the Laplace approximation.
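The data-augmentation idea summarized in the abstract can be illustrated with a minimal sketch: each observed label is modeled as the sign of a gaussian latent variable sitting on top of a GP-distributed function, and a mean-field scheme alternates between the GP posterior and truncated-gaussian expectations of the latents. This sketch uses the binary probit reduction for brevity, not the paper's full multiclass algorithm; the kernel, the toy data, and names such as `sq_exp_kernel` are illustrative assumptions.

```python
# Sketch of GP probit classification via gaussian latent-variable augmentation
# with a mean-field variational scheme. Binary case only; the paper's
# multiclass model couples K latent functions, but the updates are analogous.
import numpy as np
from scipy.stats import norm

# Toy 1-D data: class is the sign of the input.
X = np.linspace(-3.0, 3.0, 40)[:, None]
y = (X[:, 0] > 0).astype(int)            # labels in {0, 1}
N = len(X)

def sq_exp_kernel(A, B, ell=1.0):
    """Squared-exponential GP covariance (a priori independent structure)."""
    d2 = (A[:, None, 0] - B[None, :, 0]) ** 2
    return np.exp(-0.5 * d2 / ell ** 2)

K = sq_exp_kernel(X, X)
S = np.linalg.solve(K + np.eye(N), K)    # (K + I)^{-1} K = K (K + I)^{-1}

# Augmented model: m_n = f(x_n) + eps_n with eps_n ~ N(0, 1); y_n = 1 iff m_n > 0.
m_tilde = np.zeros(N)                    # E[m] under the variational posterior
for _ in range(50):
    f = S @ m_tilde                      # posterior mean of the latent GP
    s = 2 * y - 1                        # map labels {0, 1} -> signs {-1, +1}
    z = s * f
    # Mean of N(f, 1) truncated to the side selected by the label:
    m_tilde = f + s * norm.pdf(z) / norm.cdf(z)

pred = (f > 0).astype(int)
print("training accuracy:", float(np.mean(pred == y)))
```

The truncated-gaussian mean update is what the augmentation buys: the nongaussian likelihood never has to be approximated directly, since conditioning on the label only truncates a gaussian.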
Pages: 1790-1817
Page count: 28
Related Papers
50 records in total
  • [1] Variational Bayesian multinomial probit model with Gaussian process classification on mice protein expression level data
    Son, Donghyun
    Hwang, Beom Seuk
    [J]. KOREAN JOURNAL OF APPLIED STATISTICS, 2023, 36 (02)
  • [2] Variational Inference on Infinite Mixtures of Inverse Gaussian, Multinomial Probit and Exponential Regression
    Sk, Minhazul Islam
    Banerjee, Arunava
    [J]. 2014 13TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2014, : 276 - 281
  • [3] Variational Bayesian multinomial logistic Gaussian process classification
    Cho, Wanhyun
    Na, Inseop
    Kim, Sangkyoon
    Park, Soonyoung
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (14) : 18563 - 18582
  • [4] Multinomial probit Bayesian additive regression trees
    Kindo, Bereket P.
    Wang, Hao
    Pena, Edsel A.
    [J]. STAT, 2016, 5 (01) : 119 - 131
  • [5] vbmp: Variational Bayesian multinomial probit regression for multi-class classification in R
    Lama, Nicola
    Girolami, Mark
    [J]. BIOINFORMATICS, 2008, 24 (01) : 135 - 136
  • [6] A variational Bayes approach to a semiparametric regression using Gaussian process priors
    Ong, Victor M. H.
    Mensah, David K.
    Nott, David J.
    Jo, Seongil
    Park, Beomjo
    Choi, Taeryon
    [J]. ELECTRONIC JOURNAL OF STATISTICS, 2017, 11 (02) : 4258 - 4296
  • [7] Variational Multinomial Logit Gaussian Process
    Chai, Kian Ming A.
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 1745 - 1808
  • [8] Augmentation Samplers for Multinomial Probit Bayesian Additive Regression Trees
    Xu, Yizhen
    Hogan, Joseph
    Daniels, Michael
    Kantor, Rami
    Mwangi, Ann
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2024