Functional Central Limit Theorem and Strong Law of Large Numbers for Stochastic Gradient Langevin Dynamics

Cited by: 0
Authors
Lovas, A. [1 ,2 ]
Rasonyi, M. [1 ,3 ]
Affiliations
[1] Alfred Renyi Inst Math, Budapest, Hungary
[2] Budapest Univ Technol & Econ, Budapest, Hungary
[3] Eotvos Lorand Univ, Budapest, Hungary
Source
APPLIED MATHEMATICS AND OPTIMIZATION | 2023, Vol. 88, Issue 3
Keywords
Stochastic gradient descent; Online learning; Functional central limit theorem; Mixing; Markov chains in random environments; DEPENDENT DATA STREAMS; APPROXIMATION;
DOI
10.1007/s00245-023-10052-y
CLC Classification
O29 [Applied Mathematics];
Discipline Code
070104 ;
Abstract
We study the mixing properties of an important optimization algorithm of machine learning: the stochastic gradient Langevin dynamics (SGLD) with a fixed step size. The data stream is not assumed to be independent, hence SGLD is not a Markov chain, merely a Markov chain in a random environment, which complicates the mathematical treatment considerably. We derive a strong law of large numbers and a functional central limit theorem for SGLD.
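To make the setting concrete, below is a minimal sketch of the constant-step-size SGLD recursion the abstract refers to, theta_{k+1} = theta_k - lambda * grad_f(theta_k, X_k) + sqrt(2 * lambda) * xi_k with standard Gaussian xi_k. The quadratic toy loss and the AR(1) data stream (used here only to mimic a dependent, non-i.i.d. stream) are illustrative choices, not taken from the paper.

```python
import numpy as np

def sgld_step(theta, grad, step, rng):
    """One SGLD update with fixed step size:
    theta - step * grad + sqrt(2 * step) * xi, where xi ~ N(0, I)."""
    noise = rng.standard_normal(theta.shape)
    return theta - step * grad + np.sqrt(2.0 * step) * noise

rng = np.random.default_rng(0)
theta = np.zeros(2)
x = np.zeros(2)        # AR(1) state: a dependent data stream, not i.i.d.
step = 0.01            # fixed step size (lambda)

for _ in range(5000):
    x = 0.9 * x + rng.standard_normal(2)       # dependent data X_k
    grad = theta - x                           # gradient of toy loss 0.5*||theta - x||^2
    theta = sgld_step(theta, grad, step, rng)

print(theta)
```

Because the data X_k are autocorrelated, the pair (theta_k, X_k) is a Markov chain but theta_k alone is not; this is exactly the "Markov chain in a random environment" complication the abstract highlights.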
Pages: 22