Riemannian Stein Variational Gradient Descent for Bayesian Inference

Citations: 0
Authors
Liu, Chang [1]
Zhu, Jun [1]
Affiliations
[1] Tsinghua Univ, Ctr Bioinspired Comp Res, Dept Comp Sci & Tech, TNList Lab, State Key Lab Intell Tech & Syst, Beijing, Peoples R China
Source
THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2018
Keywords
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We develop Riemannian Stein Variational Gradient Descent (RSVGD), a Bayesian inference method that generalizes Stein Variational Gradient Descent (SVGD) to Riemannian manifolds. The benefits are twofold: (i) for inference tasks in Euclidean spaces, RSVGD has the advantage over SVGD of utilizing information geometry, and (ii) for inference tasks on Riemannian manifolds, RSVGD brings the unique advantages of SVGD to the Riemannian world. To transfer appropriately to Riemannian manifolds, we devise novel and non-trivial techniques for RSVGD, which are required by the intrinsically different characteristics of general Riemannian manifolds compared with Euclidean spaces. We also derive a Riemannian Stein's Identity and a Riemannian Kernelized Stein Discrepancy. Experimental results show the advantage over SVGD of exploiting distribution geometry, and the advantages of particle-efficiency, iteration-effectiveness, and approximation flexibility over other inference methods on Riemannian manifolds.
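
For context, the sketch below is a minimal NumPy illustration of the standard Euclidean SVGD update that, according to the abstract, RSVGD generalizes to Riemannian manifolds. It is not the authors' RSVGD algorithm: the RBF kernel with a median-heuristic bandwidth is a common but assumed choice, and the names rbf_kernel, svgd_step, and grad_log_p are hypothetical; the score function grad_log_p is assumed to be supplied by the user.

import numpy as np

def rbf_kernel(X, bandwidth=None):
    """RBF kernel matrix for particles X (n x d); median-heuristic bandwidth if none given."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    if bandwidth is None:
        bandwidth = np.median(sq_dists) / np.log(X.shape[0] + 1) + 1e-8
    return np.exp(-sq_dists / bandwidth), bandwidth

def svgd_step(X, grad_log_p, stepsize=1e-2):
    """One Euclidean SVGD update:
    x_i <- x_i + eps * (1/n) * sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ].
    """
    n = X.shape[0]
    K, h = rbf_kernel(X)
    scores = grad_log_p(X)          # (n, d) score evaluations grad log p(x_j)
    attract = K @ scores            # kernel-smoothed scores: drives particles toward high density
    # Gradient of the RBF kernel w.r.t. x_j, summed over j: keeps particles spread apart.
    repulse = (K.sum(axis=1, keepdims=True) * X - K @ X) * (2.0 / h)
    return X + stepsize * (attract + repulse) / n

# Toy usage: approximate a 2-D standard Gaussian, whose score is -x.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    particles = rng.normal(size=(100, 2)) * 3.0 + 5.0   # deliberately poor initialization
    for _ in range(500):
        particles = svgd_step(particles, lambda X: -X, stepsize=0.1)
    print(particles.mean(axis=0), particles.var(axis=0))  # should approach [0, 0] and [1, 1]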
Pages: 3627-3634
Number of pages: 8