Probabilistic Circuits for Variational Inference in Discrete Graphical Models

Cited by: 0
Authors:
Shih, Andy [1 ]
Ermon, Stefano [1 ]
Affiliations:
[1] Stanford Univ, Dept Comp Sci, Stanford, CA 94305 USA
Keywords: none listed
DOI: not available
Chinese Library Classification: TP18 (Artificial Intelligence Theory)
Subject classification codes: 081104; 0812; 0835; 1405
Abstract:
Inference in discrete graphical models with variational methods is difficult because gradients of the Evidence Lower Bound (ELBO) cannot be estimated via the reparameterization trick. Many sampling-based methods have been proposed for estimating these gradients, but they suffer from high bias or variance. In this paper, we propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs), to compute ELBO gradients exactly (without sampling) for a certain class of densities. In particular, we show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically. To scale to graphical models with thousands of variables, we develop an efficient and effective construction of selective-SPNs with size O(kn), where n is the number of variables and k is an adjustable hyperparameter. We demonstrate our approach on three types of graphical models: Ising models, Latent Dirichlet Allocation, and factor graphs from the UAI Inference Competition. Selective-SPNs give a better lower bound than mean-field and structured mean-field, and are competitive with approximations that do not provide a lower bound, such as Loopy Belief Propagation and Tree-Reweighted Belief Propagation. Our results show that probabilistic circuits are promising tools for variational inference in discrete graphical models, as they combine tractability and expressivity.
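To see why a polynomial log-density admits an analytic ELBO, consider the simplest tractable variational family: a fully factorized (mean-field) distribution over x in {-1, 1}^n, applied to an Ising model. The sketch below is not the paper's selective-SPN construction (mean-field is one of its baselines); it only illustrates the underlying principle that the expectation of a quadratic log-density under a factorized q reduces to first moments, so the ELBO needs no sampling. The function name `mean_field_elbo` is illustrative.

```python
import numpy as np

def mean_field_elbo(J, h, mu):
    """Exact ELBO for an Ising model p(x) ∝ exp(0.5 * x^T J x + h^T x),
    x in {-1, 1}^n, under a factorized q with means mu_i = E_q[x_i].

    Because log p~(x) is a quadratic polynomial and q factorizes,
    E_q[x_i x_j] = mu_i * mu_j for i != j, so E_q[log p~] is closed-form.
    Assumes J is symmetric with zero diagonal.
    """
    # Expected unnormalized log-density: cross terms use first moments only.
    energy = 0.5 * mu @ J @ mu + h @ mu
    # Entropy of independent ±1 variables with means mu.
    p = (1.0 + mu) / 2.0          # P(x_i = +1)
    eps = 1e-12                    # guard against log(0) at mu = ±1
    entropy = -np.sum(p * np.log(p + eps) + (1 - p) * np.log(1 - p + eps))
    # ELBO = E_q[log p~(x)] + H(q) <= log Z.
    return energy + entropy
```

For instance, with a two-variable model and mu = 0 (uniform q), the energy term vanishes and the ELBO equals the entropy 2·ln 2, which lower-bounds the true log partition function. Selective-SPNs generalize this idea from a product of independent marginals to a much richer circuit family while keeping the required moments tractable.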
Pages: 12