Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?

Cited: 0
Authors
Kim, Kyurae [1 ]
Ma, Yi-An [2 ]
Gardner, Jacob R. [1 ]
Affiliations
[1] University of Pennsylvania, Philadelphia, PA 19104, USA
[2] University of California San Diego, La Jolla, CA, USA
Keywords
INFORMATION
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We prove that black-box variational inference (BBVI) with control variates, particularly the sticking-the-landing (STL) estimator, converges at a geometric (traditionally called "linear") rate when the variational family is perfectly specified. In particular, we prove a quadratic bound on the gradient variance of the STL estimator, one that also covers misspecified variational families. Combined with previous works on the quadratic variance condition, this directly implies the convergence of BBVI with projected stochastic gradient descent. For the projection operator, we consider a domain with triangular scale matrices, onto which the projection can be computed in Θ(d) time, where d is the dimensionality of the target posterior. We also improve the existing analysis of the regular closed-form entropy gradient estimators, which enables comparison against the STL estimator and yields explicit non-asymptotic complexity guarantees for both.
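To make the objects in the abstract concrete, the following is a minimal NumPy sketch (not the paper's code) of the STL gradient estimator for a location-scale Gaussian variational family with a lower-triangular scale matrix C, together with a Θ(d) projection that bounds the diagonal of C from below. The function names, the user-supplied grad_logp, and the threshold delta are illustrative assumptions, not taken from the paper.

```python
# Sketch: STL (sticking-the-landing) gradient estimator for q(z) = N(m, C C^T),
# C lower triangular, plus a Theta(d) projection of the scale matrix.
import numpy as np
from scipy.linalg import solve_triangular


def stl_gradient(m, C, grad_logp, rng):
    """One-sample STL estimate of the ELBO gradient w.r.t. (m, C).

    STL differentiates only through the reparameterized sample z = m + C @ eps
    and treats the variational parameters inside log q as constants, dropping
    the zero-mean score-function term of the entropy gradient.
    """
    d = m.shape[0]
    eps = rng.standard_normal(d)
    z = m + C @ eps
    # grad_z log q(z) = -(C C^T)^{-1} (z - m) = -C^{-T} eps, one triangular solve.
    grad_logq_z = -solve_triangular(C, eps, trans='T', lower=True)
    diff = grad_logp(z) - grad_logq_z            # grad_z of the integrand
    grad_m = diff                                # dz/dm = I
    grad_C = np.tril(np.outer(diff, eps))        # dz/dC has an outer-product form
    return grad_m, grad_C


def project_scale(C, delta=1e-3):
    """Theta(d) projection: clip the diagonal of the triangular scale matrix to
    be at least delta (an assumed form of the projection domain)."""
    C = C.copy()
    idx = np.arange(C.shape[0])
    C[idx, idx] = np.maximum(C[idx, idx], delta)
    return C


if __name__ == "__main__":
    # Toy usage: standard-normal target, a few thousand projected SGD steps.
    rng = np.random.default_rng(0)
    d, step = 5, 0.05
    m, C = np.ones(d), 2.0 * np.eye(d)
    grad_logp = lambda z: -z                     # target p = N(0, I)
    for _ in range(2000):
        g_m, g_C = stl_gradient(m, C, grad_logp, rng)
        m, C = m + step * g_m, project_scale(C + step * g_C)
    print(np.round(m, 2), np.round(np.diag(C), 2))
```

In the toy run the target is itself Gaussian, so the family is perfectly specified and the path-derivative factor grad_logp(z) - grad_logq_z vanishes at the optimum for every sample; this zero-variance-at-the-optimum behavior is what the abstract's quadratic variance bound formalizes and what drives the geometric convergence rate.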
Pages: 50
Related Papers (showing 10 of 50)
  • [1] On the Convergence of Black-Box Variational Inference
    Kim, Kyurae; Oh, Jisu; Wu, Kaiwen; Ma, Yi-An; Gardner, Jacob R.
    Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
  • [2] Provable convergence guarantees for black-box variational inference
    Domke, Justin; Garrigos, Guillaume; Gower, Robert
    Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
  • [3] Provable Smoothness Guarantees for Black-Box Variational Inference
    Domke, Justin
    International Conference on Machine Learning, Vol. 119, 2020
  • [4] A Framework for Improving the Reliability of Black-box Variational Inference
    Welandawe, Manushi; Andersen, Michael Riis; Vehtari, Aki; Huggins, Jonathan H.
    Journal of Machine Learning Research, Vol. 25, 2024
  • [5] Provable Smoothness Guarantees for Black-Box Variational Inference
    Domke, Justin
    25th Americas Conference on Information Systems (AMCIS 2019), 2019
  • [6] Black-Box Variational Inference as Distilled Langevin Dynamics
    Hoffman, Matthew; Ma, Yi-An
    International Conference on Machine Learning, Vol. 119, 2020
  • [7] Black-Box Variational Inference for Stochastic Differential Equations
    Ryder, Thomas; Golightly, Andrew; McGough, A. Stephen; Prangle, Dennis
    International Conference on Machine Learning, Vol. 80, 2018
  • [8] Marginalized Stochastic Natural Gradients for Black-Box Variational Inference
    Ji, Geng; Sujono, Debora; Sudderth, Erik B.
    International Conference on Machine Learning, Vol. 139, 2021
  • [9] Provable Gradient Variance Guarantees for Black-Box Variational Inference
    Domke, Justin
    Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019
  • [10] Joint control variate for faster black-box variational inference
    Wang, Xi; Geffner, Tomas; Domke, Justin
    International Conference on Artificial Intelligence and Statistics, Vol. 238, 2024