Variational inference with Gaussian mixture model and householder flow

Cited by: 16
Authors
Liu, GuoJun [1 ]
Liu, Yang [1 ]
Guo, MaoZu [2 ]
Li, Peng [1 ]
Li, MingYu [1 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin, Heilongjiang, Peoples R China
[2] Beijing Univ Civil Engn & Architecture, Sch Elect & Informat Engn, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Variational auto-encoder; Gaussian mixture model; Householder flow; Variational inference; DENSITY-ESTIMATION;
DOI
10.1016/j.neunet.2018.10.002
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The variational auto-encoder (VAE) is a powerful and scalable deep generative model. Under the VAE architecture, the choice of the approximate posterior distribution is a crucial issue with a significant impact on the tractability and flexibility of the model. Latent variables are commonly assumed to be normally distributed with a diagonal covariance matrix; however, this assumption is not flexible enough to match the true, complex posterior distribution. We introduce a novel approach for designing a flexible and arbitrarily complex approximate posterior distribution. Unlike the standard VAE, we first construct an initial density as a Gaussian mixture model in which each component has a diagonal covariance matrix. This relatively simple distribution is then transformed into a more flexible one by applying a sequence of invertible Householder transformations until the desired complexity is achieved. We also give a detailed theoretical and geometric interpretation of Householder transformations. Finally, this change of approximate posterior requires computing the Kullback-Leibler divergence between two mixture densities, which has no closed-form solution; we therefore redefine a new variational lower bound by means of its upper bound. Compared with other generative models based on a similar VAE architecture, our method achieves new state-of-the-art results on benchmark datasets including MNIST, Fashion-MNIST, Omniglot, and Histopathology, a more challenging medical-image dataset. The experimental results show that our method improves the flexibility of the posterior distribution more effectively. (c) 2018 Elsevier Ltd. All rights reserved.
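The Householder-flow idea described in the abstract can be sketched in a few lines: draw a sample from a diagonal-covariance Gaussian mixture, then push it through a sequence of Householder reflections H = I - 2vv^T/||v||^2. Each reflection is orthogonal (its own inverse, |det H| = 1), which is what makes the flow invertible with no Jacobian correction. This is a minimal NumPy illustration, not the paper's implementation; the mixture parameters and reflection vectors `vs` here are made-up placeholders for quantities that an encoder network would predict.

```python
import numpy as np

rng = np.random.default_rng(0)

def householder(z, v):
    """Apply the Householder reflection H = I - 2 v v^T / ||v||^2 to z.
    H is orthogonal and equals its own inverse, so |det H| = 1 and
    the transformation needs no Jacobian term in the variational bound."""
    v = v / np.linalg.norm(v)
    return z - 2.0 * v * (v @ z)

# Initial density: a 2-component, diagonal-covariance Gaussian mixture
# (weights, means, and stds are illustrative values, not from the paper).
weights = np.array([0.3, 0.7])
means = np.array([[-1.0, 0.0], [2.0, 1.0]])
stds = np.array([[0.5, 1.0], [1.0, 0.3]])

k = rng.choice(2, p=weights)                  # pick a mixture component
z0 = means[k] + stds[k] * rng.standard_normal(2)

# A short flow of 3 reflections; in the model these vectors would be
# produced by the inference network rather than sampled at random.
vs = [rng.standard_normal(2) for _ in range(3)]
zT = z0
for v in vs:
    zT = householder(zT, v)

# Invertibility check: undoing the reflections in reverse order recovers z0.
z_back = zT
for v in reversed(vs):
    z_back = householder(z_back, v)
assert np.allclose(z_back, z0)
```

Because every reflection is norm-preserving, `np.linalg.norm(zT)` equals `np.linalg.norm(z0)`; the flow reshapes the mixture's density by rotating/reflecting it, while the mixture structure supplies the multi-modality.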
Pages: 43 - 55 (13 pages)
Related Papers
50 records
  • [1] Neural Topic Modeling with Gaussian Mixture Model and Householder Flow
    Zhou, Cangqi
    Xu, Sunyue
    Ban, Hao
    Zhang, Jing
    [J]. ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT II, 2022, 13281 : 417 - 428
  • [2] Variational Learning and Inference Algorithms for Extended Gaussian Mixture Model
    Wei, Xin
    Chen, Jianxin
    Wang, Lei
    Cui, Jingwu
    Zheng, Baoyu
    [J]. 2014 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC), 2014, : 236 - 240
  • [3] Community Embeddings with Bayesian Gaussian Mixture Model and Variational Inference
    Begehr, Anton I. N.
    Panfilov, Peter B.
    [J]. 2022 IEEE 24TH CONFERENCE ON BUSINESS INFORMATICS (CBI 2022), VOL 2, 2022, : 88 - 96
  • [4] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454
  • [5] Variational Inference of Finite Generalized Gaussian Mixture Models
    Amudala, Srikanth
    Ali, Samr
    Najar, Fatma
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2433 - 2439
  • [6] Variational Bayesian inference with Gaussian-mixture approximations
    Zobay, O.
    [J]. ELECTRONIC JOURNAL OF STATISTICS, 2014, 8 : 355 - 389
  • [7] Variational Inference for Watson Mixture Model
    Taghia, Jalil
    Leijon, Arne
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2016, 38 (09) : 1886 - 1900
  • [8] Trust-Region Variational Inference with Gaussian Mixture Models
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [9] Neural Variational Gaussian Mixture Topic Model
    Tang, Kun
    Huang, Heyan
    Shi, Xuewen
    Mao, Xian-Ling
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (04)
  • [10] Variational Inference of Infinite Generalized Gaussian Mixture Models with Feature Selection
    Amudala, Srikanth
    Ali, Samr
    Bouguila, Nizar
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 120 - 127