Online Sampling from Log-Concave Distributions

Cited by: 0
Authors
Lee, Holden [1 ]
Mangoubi, Oren [2 ]
Vishnoi, Nisheeth K. [3 ]
Affiliations
[1] Duke Univ, Durham, NC 27706 USA
[2] Worcester Polytech Inst, Worcester, MA 01609 USA
[3] Yale Univ, New Haven, CT 06520 USA
Funding
Swiss National Science Foundation;
Keywords
BINARY;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Given a sequence of convex functions $f_0, f_1, \ldots, f_T$, we study the problem of sampling from the Gibbs distribution $\pi_t \propto e^{-\sum_{k=0}^{t} f_k}$ for each epoch $t$ in an online manner. Interest in this problem derives from applications in machine learning, Bayesian statistics, and optimization where, rather than obtaining all the observations at once, one constantly acquires new data and must continuously update the distribution. Our main result is an algorithm that generates roughly independent samples from $\pi_t$ for every epoch $t$ and, under mild assumptions, makes $\mathrm{polylog}(T)$ gradient evaluations per epoch. All previous results imply a bound on the number of gradient or function evaluations that is at least linear in $T$. Motivated by real-world applications, we assume that the functions are smooth, that their associated distributions have a bounded second moment, and that their minimizers drift in a bounded manner, but we do not assume they are strongly convex. In particular, our assumptions hold for online Bayesian logistic regression when the data satisfy natural regularity properties, giving a sampling algorithm with updates that are poly-logarithmic in $T$. In simulations, our algorithm achieves accuracy comparable to an algorithm specialized to logistic regression. Key to our algorithm is a novel stochastic gradient Langevin dynamics Markov chain with a carefully designed variance-reduction step and constant batch size. Technically, the lack of strong convexity is a significant barrier to analysis and, here, our main contribution is a martingale exit-time argument showing that our Markov chain remains in a ball of radius roughly poly-logarithmic in $T$ for enough time to reach within $\varepsilon$ of $\pi_t$.
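As a rough illustration of the kind of Markov chain the abstract describes, below is a minimal sketch of one epoch of stochastic gradient Langevin dynamics with an SVRG-style variance-reduction step and a constant batch size. The names (vr_sgld_epoch, grad_f), the anchor construction, and all hyperparameters are illustrative assumptions, not the paper's exact procedure.

import numpy as np

def vr_sgld_epoch(theta, t, grad_f, eta=1e-3, batch=8, n_steps=200, rng=None):
    """Approximately sample from pi_t proportional to exp(-sum_{k<=t} f_k).

    theta  -- current iterate (d-dimensional numpy array)
    t      -- epoch index; gradients of f_0, ..., f_t are available
    grad_f -- callable grad_f(k, x) returning the gradient of f_k at x
    """
    rng = rng or np.random.default_rng()
    d = theta.shape[0]
    # Anchor point and its full gradient, used as a control variate.
    anchor = theta.copy()
    g_anchor = sum(grad_f(k, anchor) for k in range(t + 1))
    for _ in range(n_steps):
        # Constant-size minibatch for the variance-reduction correction.
        idx = rng.integers(0, t + 1, size=batch)
        correction = sum(grad_f(k, theta) - grad_f(k, anchor) for k in idx)
        grad_est = g_anchor + (t + 1) / batch * correction
        # Langevin step: a gradient step plus isotropic Gaussian noise.
        theta = theta - eta * grad_est + np.sqrt(2.0 * eta) * rng.standard_normal(d)
    return theta

# Hypothetical usage with Gaussian potentials f_k(x) = ||x - mu_k||^2 / 2:
#   mus = np.random.default_rng(0).standard_normal((11, 2))
#   sample = vr_sgld_epoch(np.zeros(2), t=10, grad_f=lambda k, x: x - mus[k])

Note that this sketch recomputes the anchor gradient from scratch at a cost of t + 1 gradient evaluations; the paper's algorithm instead maintains this quantity across epochs so that, under its assumptions, each epoch makes only polylog(T) gradient evaluations.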
Pages: 12