A comparison of algorithms for inference and learning in probabilistic graphical models

Cited by: 98
Authors
Frey, BJ
Jojic, N
Affiliations
[1] Univ Toronto, Dept Elect & Comp Engn, Toronto, ON M5S 3G4, Canada
[2] Microsoft Corp, Machine Learning & Appl Stat Grp, Redmond, WA 98052 USA
Keywords
graphical models; Bayesian networks; probability models; probabilistic inference; reasoning; learning; Bayesian methods; variational techniques; sum-product algorithm; loopy belief propagation; EM algorithm; mean field; Gibbs sampling; free energy; Gibbs free energy; Bethe free energy;
DOI
10.1109/TPAMI.2005.169
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
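Illustrative sketch (not from the paper): the sum-product algorithm named in the abstract passes messages along the edges of a graphical model; on a tree its fixed point gives exact marginals, and running the same updates on a graph with cycles yields the "loopy" belief propagation the authors compare, whose fixed points correspond to stationary points of the Bethe free energy. The toy binary MRF, potentials, and variable names below are assumptions chosen for illustration, not the authors' code or model.

```python
import numpy as np

# Toy pairwise MRF over binary variables: unary potentials phi[i](x_i) and
# pairwise potentials psi[(i, j)](x_i, x_j). The chain below is a tree, so
# sum-product is exact; adding an edge such as (0, 2) would make it loopy BP.
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2)]
phi = {0: np.array([0.7, 0.3]),
       1: np.array([0.5, 0.5]),
       2: np.array([0.2, 0.8])}
psi = {e: np.array([[1.0, 0.5],
                    [0.5, 1.0]]) for e in edges}   # prefers equal neighbours

def neighbours(i):
    return [b if a == i else a for a, b in edges if i in (a, b)]

# Directed messages m[(i, j)](x_j), initialised to uniform.
m = {(i, j): np.full(2, 0.5) for a, b in edges for i, j in [(a, b), (b, a)]}

for _ in range(50):                                # synchronous fixed-point updates
    new_m = {}
    for i, j in m:
        pair = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T  # indexed [x_i, x_j]
        prod = phi[i].copy()                       # evidence arriving at i ...
        for k in neighbours(i):
            if k != j:                             # ... from all neighbours except j
                prod = prod * m[(k, i)]
        msg = pair.T @ prod                        # marginalise out x_i
        new_m[(i, j)] = msg / msg.sum()            # normalise for numerical stability
    m = new_m

# Beliefs: unary potential times all incoming messages (exact marginals on a tree).
for i in nodes:
    b = phi[i] * np.prod([m[(k, i)] for k in neighbours(i)], axis=0)
    print(f"P(x_{i}) approx {b / b.sum()}")
```

The other techniques surveyed in the paper (ICM, EM, Gibbs sampling, mean field, structured variational methods) can be viewed as different ways of optimising or sampling under the same free-energy objective, which is the unifying comparison the abstract describes.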
Pages: 1392 - 1416
Number of pages: 25
Related Papers
50 records in total
  • [31] Multiscale Gaussian graphical models and algorithms for large-scale inference
    Choi, Myung Jin
    Willsky, Alan S.
    2007 IEEE/SP 14TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING, VOLS 1 AND 2, 2007, : 229 - 233
  • [32] Learning Semantic Models of Data Sources Using Probabilistic Graphical Models
    Binh Vu
    Knoblock, Craig A.
    Pujara, Jay
    WEB CONFERENCE 2019: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2019), 2019, : 1944 - 1953
  • [33] Inference and Verification of Probabilistic Graphical Models from High-Dimensional Data
    Ma, Yinjiao
    Damazyn, Kevin
    Klinger, Jakob
    Gong, Haijun
    DATA INTEGRATION IN THE LIFE SCIENCES, DILS 2015, 2015, 9162 : 223 - 239
  • [34] aGrUM/pyAgrum : a Toolbox to Build Models and Algorithms for Probabilistic Graphical Models in Python
    Ducamp, Gaspard
    Gonzales, Christophe
    Wuillemin, Pierre-Henri
    INTERNATIONAL CONFERENCE ON PROBABILISTIC GRAPHICAL MODELS, VOL 138, 2020, 138
  • [35] Learning of graphical models and efficient inference for object class recognition
    Bergtholdt, Martin
    Kappes, Joerg H.
    Schnoerr, Christoph
    PATTERN RECOGNITION, PROCEEDINGS, 2006, 4174 : 273 - 283
  • [36] Privacy Preserving Distributed Structure Learning of Probabilistic Graphical Models
    Li, Husheng
    2013 IEEE GLOBECOM WORKSHOPS (GC WKSHPS), 2013, : 188 - 193
  • [37] Tutorial and Survey on Probabilistic Graphical Model and Variational Inference in Deep Reinforcement Learning
    Sun, Xudong
    Bischl, Bernd
    2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 110 - 119
  • [38] Learning to Generate Posters of Scientific Papers by Probabilistic Graphical Models
    Qiang, Yu-Ting
    Fu, Yan-Wei
    Yu, Xiao
    Guo, Yan-Wen
    Zhou, Zhi-Hua
    Sigal, Leonid
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2019, 34 (01) : 155 - 169
  • [40] Probabilistic Graphical Models Parameter Learning with Transferred Prior and Constraints
    Zhou, Yun
    Fenton, Norman
    Hospedales, Timothy M.
    Neil, Martin
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2015, : 972 - 981