Convergence of Gradient Algorithms for Nonconvex C1+α Cost Functions

Cited: 0
Authors
Zixuan WANG [1]
Shanjian TANG [2]
Affiliations
[1] Department of Finance and Control Sciences, Shanghai Center for Mathematical Sciences, Fudan University
[2] Department of Finance and Control Sciences, School of Mathematical Sciences, Fudan University
Funding
National Key R&D Program of China; National Natural Science Foundation of China
Keywords
DOI
None
CLC Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper is concerned with the convergence of stochastic gradient algorithms with momentum terms in the nonconvex setting. A class of stochastic momentum methods, including stochastic gradient descent, heavy ball, and Nesterov's accelerated gradient, is analyzed in a general framework under mild assumptions. Based on the convergence result for expected gradients, the authors prove almost sure convergence by a detailed discussion of the effects of momentum and the number of upcrossings. It is worth noting that no additional restrictions are imposed on the objective function and stepsize. Another improvement over previous results is that the usual Lipschitz condition on the gradient is relaxed to Hölder continuity. As a byproduct, the authors apply a localization procedure to extend the results to stochastic stepsizes.
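The general momentum framework summarized in the abstract can be illustrated with a toy sketch (not from the paper itself): a stochastic heavy-ball iteration on a one-dimensional nonconvex objective, using a diminishing stepsize of the standard Robbins–Monro type (summable squares, non-summable sum). The objective, constants, and noise model below are all illustrative assumptions.

```python
import numpy as np

def f_grad(x):
    # Toy nonconvex objective f(x) = x^4/4 - x^2/2; its gradient is x^3 - x,
    # with stationary points at x = -1, 0, 1 (minima at +/-1).
    return x**3 - x

rng = np.random.default_rng(0)
x_prev, x = 2.0, 2.0   # start both iterates at the same point
beta = 0.5             # heavy-ball momentum coefficient (illustrative)
for k in range(1, 2001):
    a_k = 0.05 / k**0.6                             # diminishing stepsize: sum a_k = inf, sum a_k^2 < inf
    g = f_grad(x) + 0.01 * rng.standard_normal()    # noisy (stochastic) gradient
    # Stochastic heavy-ball update: x_{k+1} = x_k - a_k g_k + beta (x_k - x_{k-1})
    x_next = x - a_k * g + beta * (x - x_prev)
    x_prev, x = x, x_next

print(x, f_grad(x))  # the iterate settles near a stationary point
```

With stepsize exponent 0.6 the sketch satisfies the classical stepsize conditions, and the iterate drifts into the basin of the minimizer at x = 1; the paper's contribution is precisely that such almost sure convergence holds under only Hölder continuity of the gradient and without extra restrictions on the objective or stepsize.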
Pages: 445-464
Page count: 20
Related Papers
50 records in total
  • [1] Convergence of Gradient Algorithms for Nonconvex C1+α Cost Functions
    Wang, Zixuan
    Tang, Shanjian
    [J]. CHINESE ANNALS OF MATHEMATICS SERIES B, 2023, 44 (03) : 445 - 464
  • [3] Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions
    Yuan, Gonglin
    Wang, Xiaoliang
    Sheng, Zhou
    [J]. NUMERICAL ALGORITHMS, 2020, 84 (03) : 935 - 956
  • [6] A modified scaled conjugate gradient method with global convergence for nonconvex functions
    Babaie-Kafaki, Saman
    Ghanbari, Reza
    [J]. BULLETIN OF THE BELGIAN MATHEMATICAL SOCIETY-SIMON STEVIN, 2014, 21 (03) : 465 - 477
  • [7] AN EXTENDED DAI-LIAO CONJUGATE GRADIENT METHOD WITH GLOBAL CONVERGENCE FOR NONCONVEX FUNCTIONS
    Arazm, Mohammad Reza
    Babaie-Kafaki, Saman
    Ghanbari, Reza
    [J]. GLASNIK MATEMATICKI, 2017, 52 (02) : 361 - 375
  • [8] Preconditioned conjugate gradient algorithms for nonconvex problems
    Pytlak, R
    Tarnawski, T
    [J]. 2004 43RD IEEE CONFERENCE ON DECISION AND CONTROL (CDC), VOLS 1-5, 2004, : 3191 - 3196
  • [9] A gradient-free distributed optimization method for convex sum of nonconvex cost functions
    Pang, Yipeng
    Hu, Guoqiang
    [J]. INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, 2022, 32 (14) : 8086 - 8101
  • [10] A class of stochastic gradient algorithms with exponentiated error cost functions
    Boukis, C.
    Mandic, D. P.
    Constantinides, A. G.
    [J]. DIGITAL SIGNAL PROCESSING, 2009, 19 (02) : 201 - 212