Particle-based energetic variational inference

Cited by: 0
Authors
Yiwei Wang
Jiuhai Chen
Chun Liu
Lulu Kang
Affiliations
[1] Illinois Institute of Technology, Department of Applied Mathematics
Source
Statistics and Computing | 2021 / Vol. 31
Keywords
KL-divergence; Energetic variational approach; Gaussian mixture model; Kernel function; Implicit-Euler; Variational inference;
DOI
Not available
Abstract
We introduce a new variational inference (VI) framework, called energetic variational inference (EVI). It minimizes the VI objective function based on a prescribed energy-dissipation law. Using the EVI framework, we can derive many existing particle-based variational inference (ParVI) methods, including the popular Stein variational gradient descent (SVGD). More importantly, many new ParVI schemes can be created under this framework. For illustration, we propose a new particle-based EVI scheme, which performs the particle-based approximation of the density first and then uses the approximated density in the variational procedure, or “Approximation-then-Variation” for short. Thanks to this order of approximation and variation, the new scheme can maintain the variational structure at the particle level and can significantly decrease the KL-divergence in each iteration. Numerical experiments show that the proposed method outperforms several existing ParVI methods in terms of fidelity to the target distribution.
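The abstract names SVGD as one ParVI method recoverable under the EVI framework. As a rough illustration of what a particle-based update of this kind looks like (a minimal NumPy sketch of standard SVGD with an RBF kernel, not the authors' code; the step size `eps` and bandwidth `h` are hypothetical defaults):

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    """RBF kernel matrix and its gradient w.r.t. the first argument.

    K[i, j]     = exp(-||x_i - x_j||^2 / (2 h^2))
    gradK[i, j] = d/dx_i K[i, j]
    """
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, d): x_i - x_j
    K = np.exp(-np.sum(diffs**2, axis=-1) / (2.0 * h**2))
    gradK = -K[:, :, None] * diffs / h**2            # gradient w.r.t. x_i
    return K, gradK

def svgd_step(X, score, eps=0.2, h=1.0):
    """One SVGD update on the particle array X of shape (n, d):

        x_i += eps/n * sum_j [ k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i) ]

    where score(x) = grad log p(x). The first term drives particles toward
    high-density regions of the target p; the second is a repulsive term
    that keeps the particles spread out.
    """
    n = X.shape[0]
    K, gradK = rbf_kernel(X, h)
    # By symmetry of the RBF kernel, sum_j grad_{x_j} k(x_j, x_i)
    # equals -sum_j gradK[i, j].
    phi = (K @ score(X) - gradK.sum(axis=1)) / n
    return X + eps * phi
```

For a standard Gaussian target, `score(x) = -x`, and iterating `svgd_step` transports an arbitrary particle cloud toward the target while the repulsive term prevents collapse onto the mode.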
Related papers
50 results in total
  • [1] Particle-based energetic variational inference
    Wang, Yiwei
    Chen, Jiuhai
    Liu, Chun
    Kang, Lulu
    [J]. STATISTICS AND COMPUTING, 2021, 31 (03)
  • [2] Understanding and Accelerating Particle-Based Variational Inference
    Liu, Chang
    Zhuo, Jingwei
    Cheng, Pengyu
    Zhang, Ruiyi
    Zhu, Jun
    Carin, Lawrence
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [3] Particle-based Variational Inference with Generalized Wasserstein Gradient Flow
    Cheng, Ziheng
    Zhang, Shiyue
    Yu, Longlin
    Zhang, Cheng
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [4] GAD-PVI: A General Accelerated Dynamic-Weight Particle-Based Variational Inference Framework
    Wang, Fangyikang
    Zhu, Huminhao
    Zhang, Chao
    Zhao, Hanbin
    Qian, Hui
    [J]. ENTROPY, 2024, 26 (08)
  • [5] GAD-PVI: A General Accelerated Dynamic-Weight Particle-Based Variational Inference Framework
    Wang, Fangyikang
    Zhu, Huminhao
    Zhang, Chao
    Zhao, Hanbin
    Qian, Hui
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 14, 2024, : 15466 - 15473
  • [6] A Two-Stage Multiband Delay Estimation Scheme via Stochastic Particle-Based Variational Bayesian Inference
    Hu, Zhixiang
    Liu, An
    Wan, Yubo
    Han, Tony Xiao
    Zhao, Minjian
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (11): 19632 - 19645
  • [7] Particle-based fast jet simulation at the LHC with variational autoencoders
    Touranakou, Mary
    Chernyavskaya, Nadezda
    Duarte, Javier
    Gunopulos, Dimitrios
    Kansal, Raghav
    Orzari, Breno
    Pierini, Maurizio
    Tomei, Thiago
    Vlimant, Jean-Roch
    [J]. MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (03)
  • [8] Two-stage Multiband Wi-Fi Sensing for ISAC via Stochastic Particle-Based Variational Bayesian Inference
    Hu, Zhixiang
    Liu, An
    Wan, Yubo
    Quek, Tony Q. S.
    Zhao, Min-Jian
    [J]. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 5617 - 5622
  • [9] A particle-based variational approach to Bayesian non-negative matrix factorization
    Harvard John A. Paulson School of Engineering and Applied Science, Cambridge, MA 02138, United States
    [J]. J. Mach. Learn. Res., 2019
  • [10] Mirror variational transport: a particle-based algorithm for distributional optimization on constrained domains
    Dai Hai Nguyen
    Tetsuya Sakurai
    [J]. Machine Learning, 2023, 112 : 2845 - 2869