Variational Inference with Normalizing Flows

Times Cited: 0
Authors
Rezende, Danilo Jimenez [1 ]
Mohamed, Shakir [1 ]
Affiliations
[1] Google DeepMind, London, England
Keywords
Density estimation
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The choice of approximate posterior distribution is one of the core problems in variational inference. Most applications of variational inference employ simple families of posterior approximations in order to allow for efficient inference, focusing on mean-field or other simple structured approximations. This restriction has a significant impact on the quality of inferences made using variational methods. We introduce a new approach for specifying flexible, arbitrarily complex and scalable approximate posterior distributions. Our approximations are distributions constructed through a normalizing flow, whereby a simple initial density is transformed into a more complex one by applying a sequence of invertible transformations until a desired level of complexity is attained. We use this view of normalizing flows to develop categories of finite and infinitesimal flows and provide a unified view of approaches for constructing rich posterior approximations. We demonstrate that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in performance and applicability of variational inference.
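As a rough illustration of the construction described in the abstract, the sketch below implements the planar flow family proposed in the paper in plain NumPy: samples from a standard-normal base density are pushed through K invertible maps f(z) = z + u·tanh(wᵀz + b), and the log-density is corrected by the log-determinant of each Jacobian. The batch size, dimensionality, and random parameter values are illustrative assumptions only, and the invertibility constraint on u (wᵀu ≥ -1) discussed in the paper is noted in a comment but not enforced.

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar-flow step f(z) = z + u * tanh(w . z + b), applied row-wise.

    z: (n, d) batch of samples; u, w: (d,) parameters; b: scalar.
    Returns the transformed samples and log|det Jacobian| per sample,
    which is log|1 + u . psi(z)| with psi(z) = (1 - tanh^2(w . z + b)) * w.
    """
    lin = z @ w + b                               # (n,)
    f_z = z + np.outer(np.tanh(lin), u)           # (n, d)
    psi = (1.0 - np.tanh(lin) ** 2)[:, None] * w  # (n, d)
    log_det = np.log(np.abs(1.0 + psi @ u))       # (n,)
    return f_z, log_det

# Compose K flows starting from a standard-normal base density:
# log q_K(z_K) = log q_0(z_0) - sum_k log|det Jacobian_k|.
rng = np.random.default_rng(0)
n, d, K = 5, 2, 4
z = rng.standard_normal((n, d))
log_q = -0.5 * np.sum(z ** 2, axis=1) - 0.5 * d * np.log(2.0 * np.pi)
for _ in range(K):
    # Random parameters for illustration only; the paper constrains u so that
    # w . u >= -1 to guarantee invertibility, which is skipped here.
    u, w, b = rng.standard_normal(d), rng.standard_normal(d), rng.standard_normal()
    z, log_det = planar_flow(z, u, w, b)
    log_q -= log_det
```

In variational inference these flow parameters would be optimized (jointly with the base-density parameters) to maximize the evidence lower bound, rather than drawn at random as in this toy example.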
Pages: 1530-1538
Page count: 9