Variational Inference with Normalizing Flows

Cited by: 0
Authors
Rezende, Danilo Jimenez [1]
Mohamed, Shakir [1]
Affiliations
[1] Google DeepMind, London, England
Keywords
Density estimation
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
The choice of approximate posterior distribution is one of the core problems in variational inference. Most applications of variational inference employ simple families of posterior approximations in order to allow for efficient inference, focusing on mean-field or other simple structured approximations. This restriction has a significant impact on the quality of inferences made using variational methods. We introduce a new approach for specifying flexible, arbitrarily complex and scalable approximate posterior distributions. Our approximations are distributions constructed through a normalizing flow, whereby a simple initial density is transformed into a more complex one by applying a sequence of invertible transformations until a desired level of complexity is attained. We use this view of normalizing flows to develop categories of finite and infinitesimal flows and provide a unified view of approaches for constructing rich posterior approximations. We demonstrate that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in performance and applicability of variational inference.
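As a concrete illustration of the construction described in the abstract (a simple base density pushed through a sequence of invertible maps, with the density tracked by the change-of-variables formula), below is a minimal NumPy sketch of a planar flow of length K applied to a standard Gaussian base distribution. It is illustrative only: the flow parameters here are random placeholders rather than the outputs of a trained inference network, the function name is ours, and the paper additionally constrains the parameters so that each map stays invertible.

    import numpy as np

    def planar_flow(z, u, w, b):
        # One planar-flow step f(z) = z + u * tanh(w·z + b) and its log|det Jacobian|.
        # z: (N, D) samples; u, w: (D,); b: scalar.
        # Note: invertibility additionally requires a constraint on u (omitted here).
        a = z @ w + b                                 # (N,)
        f_z = z + np.outer(np.tanh(a), u)             # transformed samples, (N, D)
        psi = np.outer(1.0 - np.tanh(a) ** 2, w)      # psi(z) = h'(w·z + b) * w, (N, D)
        log_det = np.log(np.abs(1.0 + psi @ u))       # log|det df/dz|, (N,)
        return f_z, log_det

    rng = np.random.default_rng(0)
    N, D, K = 1000, 2, 4                              # samples, dimension, flow length
    z = rng.standard_normal((N, D))                   # z_0 ~ N(0, I): the simple initial density
    log_q = -0.5 * (np.sum(z ** 2, axis=1) + D * np.log(2.0 * np.pi))  # log q_0(z_0)

    for _ in range(K):
        # Random parameters stand in for those an amortized inference network would produce.
        u, w, b = rng.normal(size=D), rng.normal(size=D), rng.normal()
        z, log_det = planar_flow(z, u, w, b)
        log_q -= log_det                              # log q_k(z_k) = log q_{k-1}(z_{k-1}) - log|det J_k|

    # log_q now holds log q_K(z_K) for the transformed samples and can be used
    # inside a variational (ELBO / free-energy) objective.

Each additional step can only enrich the density, so the flow length K controls how far the approximation can move beyond the mean-field starting point.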
Pages: 1530-1538
Number of pages: 9
Related papers
50 records in total
  • [1] Sylvester Normalizing Flows for Variational Inference. van den Berg, Rianne; Hasenclever, Leonard; Tomczak, Jakub M.; Welling, Max. Uncertainty in Artificial Intelligence, 2018: 393-402.
  • [2] Improved Variational Bayesian Phylogenetic Inference with Normalizing Flows. Zhang, Cheng. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020.
  • [3] To Sample or Not to Sample: Retrieving Exoplanetary Spectra with Variational Inference and Normalizing Flows. Yip, Kai Hou; Changeat, Quentin; Al-Refaie, Ahmed; Waldmann, Ingo P. Astrophysical Journal, 2024, 961(1).
  • [4] Variational Inference MPC using Normalizing Flows and Out-of-Distribution Projection. Power, Thomas; Berenson, Dmitry. Robotics: Science and Systems XVIII, 2022.
  • [5] Variational Dialogue Generation with Normalizing Flows. Luo, Tien-Ching; Chien, Jen-Tzung. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 7778-7782.
  • [6] Normalizing Flows for Probabilistic Modeling and Inference. Papamakarios, George; Nalisnick, Eric; Rezende, Danilo Jimenez; Mohamed, Shakir; Lakshminarayanan, Balaji. Journal of Machine Learning Research, 2021, 22.
  • [7] Multiplicative Normalizing Flows for Variational Bayesian Neural Networks. Louizos, Christos; Welling, Max. International Conference on Machine Learning, Vol. 70, 2017.
  • [8] Variational inference in neural functional prior using normalizing flows: application to differential equation and operator learning problems. Meng, Xuhui. Applied Mathematics and Mechanics, 2023, 44: 1111-1124.