Analysis of a Two-Step Gradient Method with Two Momentum Parameters for Strongly Convex Unconstrained Optimization

Cited: 0
Authors
Krivovichev, Gerasim V. [1 ]
Sergeeva, Valentina Yu. [1 ]
Affiliations
[1] St Petersburg State Univ, Fac Appl Math & Control Proc, 7-9 Universitetskaya Nab., St Petersburg 199034, Russia
Keywords
convex optimization; gradient descent; heavy ball method; CONVERGENCE-RATES; ALGORITHMS; SYSTEM;
DOI
10.3390/a17030126
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The paper is devoted to the theoretical and numerical analysis of a two-step method, constructed as a modification of Polyak's heavy ball method with the inclusion of an additional momentum parameter. For the quadratic case, convergence conditions are obtained with the use of the first Lyapunov method. For the non-quadratic case of sufficiently smooth strongly convex functions, conditions guaranteeing local convergence are obtained. An approach to finding optimal parameter values based on the solution of a constrained optimization problem is proposed, and the effect of the additional parameter on the convergence rate is analyzed. With the use of an ordinary differential equation equivalent to the method, the damping effect of this parameter on the oscillations typical of the non-monotonic convergence of the heavy ball method is demonstrated. In numerical examples for non-quadratic convex and non-convex test functions and machine learning problems (regularized smoothed elastic net regression, logistic regression, and recurrent neural network training), the positive influence of the additional parameter on the convergence process is demonstrated.
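The abstract does not state the exact update rule, so the following is only a hedged sketch of a heavy-ball-type iteration extended with a second momentum parameter. The three-term form and the names `alpha`, `beta`, `gamma` are assumptions for illustration, not the paper's notation:

```python
import numpy as np

def two_momentum_gd(grad, x0, alpha, beta, gamma, iters=300):
    """Sketch of a two-step gradient method with two momentum parameters.

    Hypothetical update: classic heavy ball (beta on x_k - x_{k-1}) plus a
    second momentum term (gamma on x_{k-1} - x_{k-2}); the actual update
    rule analyzed in the paper may differ.
    """
    x_prev2 = x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x_next = (x - alpha * grad(x)
                  + beta * (x - x_prev)
                  + gamma * (x_prev - x_prev2))
        x_prev2, x_prev, x = x_prev, x, x_next
    return x

# Strongly convex quadratic f(x) = 0.5 * x^T A x, minimizer at the origin
A = np.diag([1.0, 10.0])
x_star = two_momentum_gd(lambda x: A @ x, [5.0, 5.0],
                         alpha=0.15, beta=0.4, gamma=0.05)
```

For this quadratic and these (hand-picked) parameter values the iteration matrix has spectral radius below one, so the iterates contract toward the minimizer; the paper's contribution is precisely the characterization of such stability conditions and the choice of optimal parameters.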
Pages: 21
Related Papers (50 in total)
  • [1] Two-step conjugate gradient method for unconstrained optimization
    Dehghani, R.
    Bidabadi, N.
    [J]. COMPUTATIONAL & APPLIED MATHEMATICS, 2020, 39 (03)
  • [2] A two-step improved Newton method to solve convex unconstrained optimization problems
    Niri, T. Dehghan
    Fazeli, S. A. Shahzadeh
    Heydari, M.
    [J]. JOURNAL OF APPLIED MATHEMATICS AND COMPUTING, 2020, 62 (1-2) : 37 - 53
  • [3] A new two-step gradient-type method for large-scale unconstrained optimization
    Farid, Mahboubeh
    Leong, Wah June
    Abu Hassan, Malik
    [J]. COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2010, 59 (10) : 3301 - 3307
  • [4] A novel sparse array synthesis method based on two-step convex optimization
    Sun, Bin
    Xing, Shiqi
    Zhang, Jingke
    Li, Fei
    Li, Yongzhen
    Wang, Xuesong
    [J]. 2016 10TH EUROPEAN CONFERENCE ON ANTENNAS AND PROPAGATION (EUCAP), 2016
  • [5] The effect of iteration parameters on the convergence of a two-step method
    Nagaeva, E. I.
    Seregina, O. I.
    [J]. COMPUTATIONAL MATHEMATICS AND MODELING, 2002, 13 (1) : 66 - 74
  • [6] Two-step active contour method based on gradient flow
    Zhu, Linlin
    Fan, Baojie
    Tang, Yandong
    [J]. INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2010, 37 (04) : 364 - 371
  • [7] Delayed Weighted Gradient Method with simultaneous step-sizes for strongly convex optimization
    Lara, Hugo
    Aleixo, Rafael
    Oviedo, Harry
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2024, 89 (01) : 151 - 182
  • [8] Enhanced Two-Step Satisfactory Method for Multi-Objective Optimization with Fuzzy Parameters
    Hu, Chaofang
    Liu, Qizhi
    Liu, Yanwen
    [J]. INTELLIGENT COMPUTING FOR SUSTAINABLE ENERGY AND ENVIRONMENT, 2013, 355 : 122 - 129