Theory-guided Message Passing Neural Network for Probabilistic Inference

Cited by: 0
Authors
Cui, Zijun [1 ,2 ]
Wang, Hanjing [1 ]
Gao, Tian [3 ]
Talamadupula, Kartik [3 ,4 ]
Ji, Qiang [1 ]
Affiliations
[1] Rensselaer Polytech Inst, Troy, NY 12181 USA
[2] Univ Southern Calif, Los Angeles, CA 90007 USA
[3] IBM Res, Armonk, NY USA
[4] Symbl Ai, Seattle, WA USA
Source
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024
Keywords
BELIEF PROPAGATION;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Probabilistic inference can be tackled by minimizing a variational free energy through message passing. To improve performance, neural networks have been adopted for message computation; however, such neural message learning is heuristic and requires strong guidance to perform well. In this work, we propose a theory-guided message passing neural network (TMPNN) for probabilistic inference. Inspired by existing work, we consider a generalized Bethe free energy that allows for a learnable variational assumption. Instead of using a black-box neural network for message computation, we employ a general message equation and introduce a symbolic message function with semantically meaningful parameters. The analytically derived symbolic message function is seamlessly integrated into the MPNN framework, giving rise to the proposed TMPNN. TMPNN is trained using algorithmic supervision, without requiring exact inference results. By leveraging the theory-guided symbolic function, TMPNN offers stronger theoretical guarantees than conventional heuristic neural models. It applies to both MAP and marginal inference tasks, outperforming state-of-the-art methods in both cases. Furthermore, TMPNN also provides improved generalizability across graph structures and enhanced data efficiency.
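For context, here is a minimal sketch, in standard factor-graph notation, of the classical objects the abstract refers to; the paper's generalized Bethe free energy and its symbolic message function are not reproduced here. Stationary points of the Bethe free energy below correspond to fixed points of loopy belief propagation (Yedidia, Freeman, and Weiss), whose sum-product message updates are:

    % Bethe free energy over factor beliefs b_a and variable beliefs b_i,
    % with factors f_a and variable degrees d_i
    F_{\mathrm{Bethe}}(b) = \sum_a \sum_{x_a} b_a(x_a) \ln \frac{b_a(x_a)}{f_a(x_a)}
                            - \sum_i (d_i - 1) \sum_{x_i} b_i(x_i) \ln b_i(x_i)

    % Sum-product message updates and resulting beliefs,
    % where N(.) denotes factor-graph neighborhoods
    m_{i \to a}(x_i) \propto \prod_{c \in N(i) \setminus \{a\}} m_{c \to i}(x_i)
    m_{a \to i}(x_i) \propto \sum_{x_a \setminus x_i} f_a(x_a) \prod_{j \in N(a) \setminus \{i\}} m_{j \to a}(x_j)
    b_i(x_i) \propto \prod_{a \in N(i)} m_{a \to i}(x_i)

As the abstract describes, TMPNN generalizes this objective with a learnable variational assumption and replaces a black-box neural message with an analytically derived symbolic message function whose parameters carry semantic meaning.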
Pages: 25