Stochastic Gradient Descent with Noise of Machine Learning Type Part I: Discrete Time Analysis

Cited: 0
Authors
Wojtowytsch, Stephan [1]
Affiliation
[1] Texas A&M Univ, Dept Math, 155 Ireland St, College Stn, TX 77840 USA
Keywords
Stochastic gradient descent; Almost sure convergence; Łojasiewicz inequality; Non-convex optimization; Machine learning; Deep learning; Overparametrization; Global minimum selection
DOI
10.1007/s00332-023-09903-3
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Stochastic gradient descent (SGD) is one of the most popular algorithms in modern machine learning. The noise encountered in these applications differs from that in many theoretical analyses of stochastic gradient algorithms. In this article, we discuss some of the common properties of energy landscapes and stochastic noise encountered in machine learning problems, and how they affect SGD-based optimization. In particular, we show that the learning rate in SGD with machine learning noise can be chosen to be small, but uniformly positive for all times, if the energy landscape resembles that of overparametrized deep learning problems. If the objective function satisfies a Łojasiewicz inequality, SGD converges to the global minimum exponentially fast; even for functions which may have local minima, we establish almost sure convergence to the global minimum at an exponential rate from any finite-energy initialization. The assumptions we make in this result concern the behavior of the objective function where it is either small or large, and the nature of the gradient noise; the energy landscape is otherwise fairly unconstrained on the domain where the objective takes intermediate values.
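To make the abstract's statement concrete, the sketch below is illustrative only (it is not code from the paper; the objective, step size, and noise model are assumptions chosen for exposition). It runs SGD on the quadratic f(x) = |x|², which satisfies the Polyak–Łojasiewicz-type inequality |∇f(x)|² = 4 f(x), with noise of "machine learning type": its variance scales with f(x) and hence vanishes on the set of global minimizers, as in overparametrized interpolation problems. With a small but constant learning rate, the objective value decays at an exponential (geometric) rate.

```python
import numpy as np

# Minimal sketch (illustrative, not code from the paper): SGD with
# "machine learning type" noise on a quadratic objective.
rng = np.random.default_rng(0)

def f(x):
    # f(x) = |x|^2 satisfies a Lojasiewicz (PL) inequality,
    # |grad f(x)|^2 = 4 f(x), with global minimum value 0.
    return float(np.dot(x, x))

def grad_f(x):
    return 2.0 * x

eta = 0.05                # small but uniformly positive learning rate
x = rng.normal(size=10)   # finite-energy initialization

for k in range(201):
    # ML-type noise: its strength scales like sqrt(f(x)), so it
    # vanishes on the set of global minimizers (overparametrized regime).
    noise = np.sqrt(f(x)) * rng.normal(size=x.shape)
    x = x - eta * (grad_f(x) + noise)
    if k % 50 == 0:
        print(f"step {k:3d}: f(x) = {f(x):.3e}")
# f(x_k) decays at a geometric (exponential) rate despite persistent noise,
# because the noise intensity shrinks together with the objective value.
```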
Pages: 52
Related Articles (50 total)
  • [1] Stochastic Gradient Descent with Noise of Machine Learning Type Part I: Discrete Time Analysis
    Wojtowytsch, Stephan
    [J]. Journal of Nonlinear Science, 2023, 33
  • [2] Stochastic Gradient Descent with Noise of Machine Learning Type Part II: Continuous Time Analysis
    Wojtowytsch, Stephan
    [J]. Journal of Nonlinear Science, 2024, 34 (1)
  • [3] Stochastic Gradient Descent and Its Variants in Machine Learning
    Netrapalli, Praneeth
    [J]. Journal of the Indian Institute of Science, 2019, 99 (02): 201-213
  • [4] Noise and Fluctuation of Finite Learning Rate Stochastic Gradient Descent
    Liu, Kangqiao
    Liu, Ziyin
    Ueda, Masahito
    [J]. International Conference on Machine Learning, Vol 139, 2021, 139
  • [5] Analysis of stochastic gradient descent in continuous time
    Latz, Jonas
    [J]. Statistics and Computing, 2021, 31 (04)
  • [6] Large-Scale Machine Learning with Stochastic Gradient Descent
    Bottou, Leon
    [J]. COMPSTAT'2010: 19th International Conference on Computational Statistics, 2010: 177-186
  • [7] First Exit Time Analysis of Stochastic Gradient Descent Under Heavy-Tailed Gradient Noise
    Nguyen, Thanh Huy
    Simsekli, Umut
    Gurbuzbalaban, Mert
    Richard, Gael
    [J]. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019, 32
  • [8] The effective noise of stochastic gradient descent
    Mignacco, Francesca
    Urbani, Pierfrancesco
    [J]. Journal of Statistical Mechanics: Theory and Experiment, 2022, 2022 (08)