Variational Message Passing Neural Network for Maximum-A-Posteriori (MAP) Inference

Times Cited: 0
Authors
Cui, Zijun [1 ]
Wang, Hanjing [1 ]
Gao, Tian [2 ]
Talamadupula, Kartik [2 ]
Ji, Qiang [1 ]
Affiliations
[1] Rensselaer Polytech Inst, ECSE, Troy, NY 12181 USA
[2] IBM Res, Armonk, NY USA
Keywords
BELIEF PROPAGATION; RELAXATIONS
DOI
None available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Maximum-A-Posteriori (MAP) inference is a fundamental task in probabilistic inference, and belief propagation (BP) is a widely used algorithm for it. Although BP has been applied successfully in many fields, it offers no performance guarantees and often performs poorly on loopy graphs. To improve performance on loopy graphs and to scale to large graphs, we propose a variational message passing neural network (V-MPNN) that leverages both the power of neural networks in modeling complex functions and the well-established algorithmic theory of variational belief propagation. Instead of relying on a hand-crafted variational assumption, we propose a neural-augmented free energy in which a general variational distribution is parameterized by a neural network. A message passing neural network (MPNN) is used to minimize the neural-augmented free energy. Training of the MPNN is thus guided by the neural-augmented free energy, without requiring exact MAP configurations as annotations. We empirically demonstrate the effectiveness of the proposed V-MPNN by comparing it against both state-of-the-art training-free and training-based methods.
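The abstract contrasts hand-crafted variational assumptions with a neural-network-parameterized variational distribution. As a minimal illustrative sketch (this is NOT the paper's V-MPNN), the following shows the classical baseline it improves on: mean-field coordinate-ascent MAP decoding on a tiny pairwise MRF. All potentials, variable names, and graph structure here are invented for illustration; the point is that a hand-crafted fully-factorized q can settle in a local optimum that exhaustive search avoids.

```python
import numpy as np
from itertools import product

# Illustrative sketch only (not the paper's method): mean-field variational
# MAP decoding on a tiny 3-node pairwise MRF with binary variables, using a
# hand-crafted fully-factorized q_i(x_i) -- the kind of assumption the
# neural-augmented free energy is designed to replace.

theta = np.array([[0.2, 1.0],      # unary log-potentials theta_i(x_i)
                  [0.5, 0.3],
                  [1.2, 0.1]])
psi = np.array([[1.0, 0.0],        # pairwise log-potential: rewards agreement
                [0.0, 1.0]])
edges = [(0, 1), (1, 2)]           # a simple chain

q = np.full((3, 2), 0.5)           # factorized beliefs, initialized uniform

for _ in range(50):                # coordinate-ascent (mean-field) updates
    for i in range(3):
        logit = theta[i].copy()
        for a, b in edges:
            if a == i:
                logit += psi @ q[b]      # E_{q_b}[psi(x_i, x_b)]
            elif b == i:
                logit += psi.T @ q[a]    # E_{q_a}[psi(x_a, x_i)]
        q[i] = np.exp(logit - logit.max())
        q[i] /= q[i].sum()

map_est = tuple(int(v) for v in q.argmax(axis=1))  # decode from beliefs

def score(x):                      # total log-potential of a configuration
    return (sum(theta[i, x[i]] for i in range(3))
            + sum(psi[x[a], x[b]] for a, b in edges))

exact = max(product([0, 1], repeat=3), key=score)  # brute-force MAP
print("mean-field decode:", map_est, "score", round(score(map_est), 2))
print("exact MAP:        ", exact, "score", round(score(exact), 2))
# On this example, mean-field settles on the local optimum (1, 0, 0)
# (score 3.7), while the exact MAP is (0, 0, 0) (score 3.9) -- the kind
# of failure a richer, neural-parameterized q aims to avoid.
```

Mean-field is only one instantiation of variational free-energy minimization; in the abstract's framing, the factorized q above is replaced by a neural-network-parameterized distribution, and an MPNN minimizes the resulting neural-augmented free energy, so no ground-truth MAP labels are needed during training.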
Pages: 464 - 474
Page count: 11