Langevin Monte Carlo without smoothness

Cited by: 0
Authors
Chatterji, Niladri S. [1 ]
Diakonikolas, Jelena [2 ]
Jordan, Michael I. [1 ]
Bartlett, Peter L. [1 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
[2] UW Madison, Madison, WI USA
Keywords
ALGORITHM;
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Langevin Monte Carlo (LMC) is an iterative algorithm used to generate samples from a distribution that is known only up to a normalizing constant. The nonasymptotic dependence of its mixing time on the dimension and target accuracy is understood mainly in the setting of smooth (gradient-Lipschitz) log-densities, a serious limitation for applications in machine learning. In this paper, we remove this limitation, providing polynomial-time convergence guarantees for a variant of LMC in the setting of nonsmooth log-concave distributions. At a high level, our results follow by leveraging the implicit smoothing of the log-density that comes from a small Gaussian perturbation that we add to the iterates of the algorithm and controlling the bias and variance that are induced by this perturbation.
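The following is a minimal sketch of the kind of perturbed Langevin update the abstract describes: the (sub)gradient of the potential is evaluated at a Gaussian-perturbed point, which acts as an implicit smoothing of a nonsmooth log-density. It is not the paper's exact algorithm or parameter choices; the step size, the perturbation scale "smoothing", the function names, and the ||x||_1 example target are illustrative assumptions.

import numpy as np

def perturbed_lmc(grad_U, dim, n_iters, step=1e-3, smoothing=1e-2, rng=None):
    """Sketch of a Gaussian-perturbed Langevin Monte Carlo chain.

    grad_U    : (sub)gradient oracle for the potential U = -log density
    smoothing : scale of the Gaussian perturbation added before the gradient
                evaluation; this implicitly smooths a nonsmooth U
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.zeros(dim)
    samples = []
    for _ in range(n_iters):
        # Evaluate the (sub)gradient at a slightly perturbed point: an unbiased
        # estimate of the gradient of the Gaussian-smoothed potential.
        xi = rng.standard_normal(dim)
        g = grad_U(x + smoothing * xi)
        # Standard Langevin step with injected noise of scale sqrt(2 * step).
        x = x - step * g + np.sqrt(2.0 * step) * rng.standard_normal(dim)
        samples.append(x.copy())
    return np.array(samples)

# Example: a nonsmooth log-concave target p(x) proportional to exp(-||x||_1),
# whose potential U(x) = ||x||_1 has elementwise subgradient sign(x).
if __name__ == "__main__":
    draws = perturbed_lmc(np.sign, dim=2, n_iters=50_000)
    print(draws[10_000:].mean(axis=0))  # should be close to zero after burn-in

The sketch keeps the bias-variance trade-off visible: a larger smoothing scale makes the effective potential smoother but biases the stationary distribution, which is the tension the paper's analysis controls.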
Pages: 1716-1725
Page count: 10