Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization

Cited by: 0
Author
Szilárd Csaba László
Affiliation
[1] Technical University of Cluj-Napoca, Department of Mathematics
Source
Mathematical Programming, 2021, Vol. 190
Keywords
Inertial algorithm; Non-convex optimization; Kurdyka–Łojasiewicz inequality; Convergence rate; Łojasiewicz exponent; MSC: 90C26; 90C30; 65K10
DOI
Not available
Abstract
We investigate an inertial algorithm of gradient type in connection with the minimization of a non-convex differentiable function. The algorithm is formulated in the spirit of Nesterov’s accelerated convex gradient method. We prove some abstract convergence results which, applied to our numerical scheme, allow us to show that the generated sequences converge to a critical point of the objective function, provided a regularization of the objective function satisfies the Kurdyka–Łojasiewicz property. Further, we obtain convergence rates for the generated sequences and the objective function values, formulated in terms of the Łojasiewicz exponent of a regularization of the objective function. Finally, some numerical experiments are presented in order to compare our numerical scheme with some well-known algorithms from the literature.
Pages: 285–329 (44 pages)
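To make the abstract's description concrete, the following is a minimal sketch of a generic inertial gradient scheme in the Nesterov spirit the abstract refers to: an extrapolation y_k = x_k + beta_k (x_k - x_{k-1}) followed by a plain gradient step at y_k. Everything here is an illustrative assumption, not the paper's method: the names inertial_gradient and grad_rastrigin, the inertial schedule beta_k = beta * k / (k + 3), the stepsize, the stopping rule, and the Rastrigin-type test function are placeholders; the admissible parameter ranges and the Kurdyka–Łojasiewicz-based guarantees are specified in the paper itself.

```python
import numpy as np

def inertial_gradient(grad, x0, step=1e-3, beta=0.5, iters=5000, tol=1e-9):
    """Generic inertial (Nesterov-style) gradient scheme:
        y_k     = x_k + beta_k * (x_k - x_{k-1})   # inertial extrapolation
        x_{k+1} = y_k - step * grad(y_k)           # gradient step at y_k
    The bounded schedule beta_k = beta * k / (k + 3) is one common choice;
    the paper's admissible parameter conditions are not reproduced here.
    """
    x_prev = x_cur = np.asarray(x0, dtype=float)
    for k in range(1, iters + 1):
        beta_k = beta * k / (k + 3.0)           # bounded inertial coefficient
        y = x_cur + beta_k * (x_cur - x_prev)   # extrapolated point
        x_next = y - step * grad(y)             # plain gradient step at y
        if np.linalg.norm(x_next - x_cur) < tol:
            return x_next, k
        x_prev, x_cur = x_cur, x_next
    return x_cur, iters

# Illustrative smooth non-convex test function (Rastrigin-type):
# g(x) = sum_i (x_i^2 - 10*cos(2*pi*x_i)) + const, with gradient below.
def grad_rastrigin(x):
    return 2.0 * x + 20.0 * np.pi * np.sin(2.0 * np.pi * x)

x_star, n_iter = inertial_gradient(grad_rastrigin, x0=[0.9, -0.4])
print(f"approximate critical point {x_star} found after {n_iter} iterations")
```

As the abstract indicates, on a non-convex objective such a scheme can only be expected to reach a critical point; starting from a different x0 the iterates may settle in a different local minimizer.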
Related papers
50 items in total
  • [41] A gradient-type algorithm with backward inertial steps associated to a nonconvex minimization problem
    Alecsa, Cristian Daniel
    László, Szilárd Csaba
    Viorel, Adrian
    Numerical Algorithms, 2020, 84: 485–512
  • [42] Global Convergence of a Modified Limited Memory BFGS Method for Non-convex Minimization
    Xiao, Yun-hai
    Li, Ting-feng
    Wei, Zeng-xin
    Acta Mathematicae Applicatae Sinica-English Series, 2013, 29 (03): 555–566
  • [43] The Unconditional Minimization of Non-convex Functions
    Bereznev, V. A.
    Karmanov, V. G.
    Tretyakov, A. A.
    USSR Computational Mathematics and Mathematical Physics, 1987, 27 (11-12): 101–104
  • [44] Unadjusted Langevin Algorithm for Non-convex Weakly Smooth Potentials
    Nguyen, Dao
    Dang, Xin
    Chen, Yixin
    Communications in Mathematics and Statistics, 2023
  • [45] Generalized Bregman distances and convergence rates for non-convex regularization methods
    Grasmair, Markus
    Inverse Problems, 2010, 26 (11)
  • [46] Simple Stochastic Gradient Methods for Non-Smooth Non-Convex Regularized Optimization
    Metel, Michael R.
    Takeda, Akiko
    International Conference on Machine Learning, Vol. 97, 2019
  • [47] Revisiting Optimal Convergence Rate for Smooth and Non-convex Stochastic Decentralized Optimization
    Yuan, Kun
    Huang, Xinmeng
    Chen, Yiming
    Zhang, Xiaohan
    Zhang, Yingya
    Pan, Pan
    Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022
  • [48] Global Convergence of Stochastic Gradient Descent for Some Non-convex Matrix Problems
    De Sa, Christopher
    Olukotun, Kunle
    Ré, Christopher
    International Conference on Machine Learning, Vol. 37, 2015: 2332–2341
  • [49] On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems
    Wen, Bo
    Xue, Xiaoping
    Journal of Global Optimization, 2019, 75 (03): 767–787