Stein Variational Message Passing for Continuous Graphical Models

Cited by: 0
Authors
Wang, Dilin [1 ]
Zeng, Zhe [2 ]
Liu, Qiang [1 ]
Affiliations
[1] Univ Texas Austin, Dept Comp Sci, Austin, TX 78712 USA
[2] Zhejiang Univ, Sch Math Sci, Hangzhou, Zhejiang, Peoples R China
Funding
National Science Foundation (USA);
Keywords
BELIEF PROPAGATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a novel distributed inference algorithm for continuous graphical models, by extending Stein variational gradient descent (SVGD) (Liu & Wang, 2016) to leverage the Markov dependency structure of the distribution of interest. Our approach combines SVGD with a set of structured local kernel functions defined on the Markov blanket of each node, which alleviates the curse of high dimensionality and simultaneously yields a distributed algorithm for decentralized inference tasks. We justify our method with theoretical analysis and show that the use of local kernels can be viewed as a new type of localized approximation that matches the target distribution on the conditional distributions of each node over its Markov blanket. Our empirical results show that our method outperforms a variety of baselines including standard MCMC and particle message passing methods.
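To make the idea described in the abstract concrete, the following is a minimal NumPy sketch of an SVGD-style sweep in which the kernel used to update each node only sees that node and its Markov blanket. It is an illustration under stated assumptions, not the paper's exact algorithm: the helper names (rbf, local_svgd_step), the score and blankets arguments, the median-heuristic bandwidth, and the sequential per-node update schedule are all choices made here for clarity.

import numpy as np


def rbf(Z, h=-1.0):
    # RBF kernel matrix K[j, i] = k(z_j, z_i) on particles Z of shape (n, m),
    # plus sum_j grad_{z_j} k(z_j, z_i) for every i, with a median-heuristic bandwidth.
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    if h < 0:
        h = np.median(sq) / np.log(Z.shape[0] + 1.0) + 1e-8
    K = np.exp(-sq / h)
    grad_K = (2.0 / h) * (Z * K.sum(axis=1, keepdims=True) - K @ Z)
    return K, grad_K


def local_svgd_step(X, score, blankets, stepsize=1e-2):
    # One sweep of SVGD-style updates with node-local kernels (illustrative sketch).
    #   X        : (n, d) array of n particles over d graph nodes.
    #   score    : callable (n, d) -> (n, d); column i holds d/dx_i log p(x) at each particle.
    #   blankets : dict mapping node i to the list of nodes in its Markov blanket (excluding i).
    n, d = X.shape
    S = score(X)
    X_new = X.copy()
    for i in range(d):
        idx = [i] + list(blankets[i])      # the kernel only sees x_i and its Markov blanket
        K, grad_K = rbf(X[:, idx])
        # kernel-smoothed score term plus the repulsive term for coordinate i
        phi_i = (K @ S[:, i] + grad_K[:, 0]) / n
        X_new[:, i] = X[:, i] + stepsize * phi_i
    return X_new

As an illustrative use, for a zero-mean Gaussian Markov random field with precision matrix Q one could take score = lambda X: -X @ Q and let blankets[i] collect the indices j != i with Q[i, j] != 0.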
Pages: 9
Related Papers
50 records in total
  • [1] Message Passing Stein Variational Gradient Descent. Zhuo, Jingwei; Liu, Chang; Shi, Jiaxin; Zhu, Jun; Chen, Ning; Zhang, Bo. International Conference on Machine Learning, Vol. 80, 2018.
  • [2] Graphical stochastic models for tracking applications with variational message passing inference. Trusheim, Felix; Condurache, Alexandru; Mertins, Alfred. 2016 Sixth International Conference on Image Processing Theory, Tools and Applications (IPTA), 2016.
  • [3] Message Passing for Collective Graphical Models. Sun, Tao; Sheldon, Daniel; Kumar, Akshat. International Conference on Machine Learning, Vol. 37, 2015: 853-861.
  • [4] Distributed Inference With Variational Message Passing in Gaussian Graphical Models: Tradeoffs in Message Schedules and Convergence Conditions. Li, Bin; Wu, Nan; Wu, Yik-Chung. IEEE Transactions on Signal Processing, 2024, 72: 2021-2035.
  • [5] Lifted Message Passing as Reparametrization of Graphical Models. Mladenov, Martin; Globerson, Amir; Kersting, Kristian. Uncertainty in Artificial Intelligence, 2014: 603-612.
  • [6] Consensus Message Passing for Layered Graphical Models. Jampani, Varun; Eslami, S. M. Ali; Tarlow, Daniel; Kohli, Pushmeet; Winn, John. Artificial Intelligence and Statistics, Vol. 38, 2015: 425-433.
  • [7] Feedback Message Passing for Inference in Gaussian Graphical Models. Liu, Ying; Chandrasekaran, Venkat; Anandkumar, Animashree; Willsky, Alan S. 2010 IEEE International Symposium on Information Theory, 2010: 1683-1687.
  • [8] Distributed Message Passing for Large Scale Graphical Models. Schwing, Alexander; Hazan, Tamir; Pollefeys, Marc; Urtasun, Raquel. 2011 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011.
  • [9] Feedback Message Passing for Inference in Gaussian Graphical Models. Liu, Ying; Chandrasekaran, Venkat; Anandkumar, Animashree; Willsky, Alan S. IEEE Transactions on Signal Processing, 2012, 60(8): 4135-4150.
  • [10] Variational Message Passing for Elaborate Response Regression Models. McLean, M. W.; Wand, M. P. Bayesian Analysis, 2019, 14(2): 371-398.