New stepsizes for the gradient method

Times cited: 6
Authors
Sun, Cong [1 ]
Liu, Jin-Peng [2 ,3 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Sci, Beijing 100876, Peoples R China
[2] Beihang Univ, Sch Math & Syst Sci, Beijing, Peoples R China
[3] Univ Maryland, Dept Math, College Pk, MD 20742 USA
Keywords
Gradient method; Steepest descent; R-linear convergence rate; Finite termination; STEEPEST DESCENT; BARZILAI; BEHAVIOR
DOI
10.1007/s11590-019-01512-y
CLC classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
Gradient methods are known for their simplicity and low per-iteration cost, which makes them increasingly attractive for large-scale optimization problems. A good stepsize plays an important role in constructing an efficient gradient method. This paper proposes a new framework for generating stepsizes for gradient methods applied to convex quadratic minimization problems. By adopting different criteria, we propose four new gradient methods. For 2-dimensional unconstrained problems with convex quadratic objectives, we prove that the new methods either terminate in finitely many iterations or converge R-superlinearly; for n-dimensional problems, we prove that all the new methods converge R-linearly. Numerical experiments show that the new methods have lower computational cost and outperform existing gradient methods.
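The four stepsize rules proposed in the paper are given only in the full text and are not reproduced here. As a minimal sketch of the setting the abstract describes (a gradient method for minimizing a convex quadratic f(x) = 0.5 x^T A x - b^T x, whose gradient is g(x) = Ax - b), the following Python snippet implements two classical baseline stepsizes: the exact steepest-descent (Cauchy) step and the Barzilai-Borwein BB1 step. The function name gradient_method and its parameters are illustrative choices, not from the paper.

```python
import numpy as np

def gradient_method(A, b, x0, stepsize="bb1", tol=1e-8, max_iter=10_000):
    """Gradient method for min_x 0.5*x^T A x - b^T x, with A symmetric
    positive definite, so the gradient is g(x) = A x - b.

    stepsize:
      "exact" -- steepest-descent (Cauchy) step: g^T g / (g^T A g)
      "bb1"   -- Barzilai-Borwein step:          s^T s / (s^T y)
    These are classical baselines only; the paper's four new
    stepsizes are not reproduced here.
    """
    x = np.asarray(x0, dtype=float).copy()
    x_prev, g_prev = None, None
    for k in range(max_iter):
        g = A @ x - b
        if np.linalg.norm(g) <= tol:
            return x, k
        if k == 0 or stepsize == "exact":
            # Exact line search along -g (first BB step also uses it).
            alpha = (g @ g) / (g @ A @ g)
        else:
            # BB1 step; for quadratics y = A s, so s^T y > 0.
            s, y = x - x_prev, g - g_prev
            alpha = (s @ s) / (s @ y)
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x, max_iter

# Tiny 2-dimensional example, the case for which the paper proves
# finite termination or R-superlinear convergence:
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star, iters = gradient_method(A, b, np.zeros(2))
print(iters, np.allclose(A @ x_star, b))
```

On a quadratic, the BB1 step equals the exact steepest-descent step delayed by one iteration, which is what gives such methods their nonmonotone but typically much faster behavior than pure steepest descent.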
Pages: 1943-1955
Number of pages: 13