A comparison of algorithms for inference and learning in probabilistic graphical models

Cited by: 98
Authors
Frey, BJ
Jojic, N
Affiliations
[1] Univ Toronto, Dept Elect & Comp Engn, Toronto, ON M5S 3G4, Canada
[2] Microsoft Corp, Machine Learning & Appl Stat Grp, Redmond, WA 98052 USA
Keywords
graphical models; Bayesian networks; probability models; probabilistic inference; reasoning; learning; Bayesian methods; variational techniques; sum-product algorithm; loopy belief propagation; EM algorithm; mean field; Gibbs sampling; free energy; Gibbs free energy; Bethe free energy;
DOI
10.1109/TPAMI.2005.169
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
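For context, a minimal sketch (not part of the original record) of the unifying free-energy cost function named in the abstract: for a model P(h, v) over hidden variables h and observed variables v, and an approximating distribution Q(h),

\[
F(Q) \;=\; \mathbb{E}_{Q(h)}\!\left[\log Q(h) - \log P(h, v)\right]
      \;=\; \mathrm{KL}\!\left(Q(h) \,\|\, P(h \mid v)\right) \;-\; \log P(v).
\]

Because the KL term is nonnegative, F(Q) upper-bounds -log P(v), with equality when Q(h) = P(h | v); the EM algorithm, the mean field method, structured variational techniques, and (through the Bethe free energy) loopy belief propagation can all be read as minimizing F under different restrictions on Q.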
Pages: 1392-1416
Number of pages: 25
Related papers
50 records in total
  • [1] Fast Inference for Probabilistic Graphical Models
    Jiang, Jiantong
    Wen, Zeyi
    Mansoor, Atif
    Mian, Ajmal
    [J]. PROCEEDINGS OF THE 2024 USENIX ANNUAL TECHNICAL CONFERENCE, ATC 2024, 2024, : 95 - 110
  • [2] Statistical inference with probabilistic graphical models
    Shah, Devavrat
    [J]. STATISTICAL PHYSICS, OPTIMIZATION, INFERENCE, AND MESSAGE-PASSING ALGORITHMS, 2016, : 1 - 27
  • [3] Simulation of graphical models for multiagent probabilistic inference
    Xiang, Y
    An, X
    Cercone, N
    [J]. SIMULATION-TRANSACTIONS OF THE SOCIETY FOR MODELING AND SIMULATION INTERNATIONAL, 2003, 79 (10): : 545 - 567
  • [4] Lifted Probabilistic Inference for Asymmetric Graphical Models
    Van den Broeck, Guy
    Niepert, Mathias
    [J]. PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 3599 - 3605
  • [5] Improving probabilistic inference in graphical models with determinism and cycles
    Ibrahim, Mohamed-Hamza
    Pal, Christopher
    Pesant, Gilles
    [J]. MACHINE LEARNING, 2017, 106 (01) : 1 - 54
  • [6] Inference in Probabilistic Graphical Models by Graph Neural Networks
    Yoon, KiJung
    Liao, Renjie
    Xiong, Yuwen
    Zhang, Lisa
    Fetaya, Ethan
    Urtasun, Raquel
    Zemel, Richard
    Pitkow, Xaq
    [J]. CONFERENCE RECORD OF THE 2019 FIFTY-THIRD ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, 2019, : 868 - 875
  • [7] Probabilistic Circuits for Variational Inference in Discrete Graphical Models
    Shih, Andy
    Ermon, Stefano
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] All-Optical Graphical Models for Probabilistic Inference
    Blanche, Pierre-Alexandre
    Glick, Madeleine
    Wissinger, John
    Kieu, Khanh
    Babaeian, Masoud
    Rastegarfar, Houman
    Demir, Veysi
    Akbulut, Mehmetcan
    Keiffer, Patrick
    Norwood, Robert A.
    Peyghambarian, Nasser
    Neifeld, Mark
    [J]. 2016 IEEE PHOTONICS SOCIETY SUMMER TOPICAL MEETING SERIES (SUM), 2016, : 199 - 200
  • [9] Linear response algorithms for approximate inference in graphical models
    Welling, M
    Teh, YW
    [J]. NEURAL COMPUTATION, 2004, 16 (01) : 197 - 221