Improved Accelerated Gradient Algorithms with Line Search for Smooth Convex Optimization Problems

Cited: 0
Authors
Li, Ting [1 ]
Song, Yongzhong [1 ]
Cai, Xingju [1 ]
Affiliations
[1] Nanjing Normal Univ, Sch Math Sci, Jiangsu Key Lab NSLSCS, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Adaptive step size; Nesterov acceleration algorithm; accelerated gradient algorithm; Lyapunov function; ℓ₂-regularized logistic regression; variational inequalities; convergence rate; projection
DOI
10.1142/S0217595923500306
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research]
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
For smooth convex optimization problems, the theoretically optimal convergence rate of first-order algorithms is O(1/k²). This paper proposes three improved accelerated gradient algorithms that use the gradient information at the latest iterate. For the step size, new adaptive line search strategies are adopted to avoid using the global Lipschitz constant and to make the algorithms converge faster. By constructing a descent Lyapunov function, we prove that the proposed algorithms preserve the O(1/k²) convergence rate. Numerical experiments demonstrate that our algorithms outperform some existing algorithms that attain the optimal convergence rate.
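The abstract's core idea, accelerating a gradient method while replacing the global Lipschitz constant with an adaptive line search, can be illustrated with a generic sketch. This is not the paper's specific algorithms: it is standard Nesterov acceleration where a backtracking search estimates a local smoothness constant L on the fly (the function `accelerated_gradient` and its parameters are illustrative names, not from the paper).

```python
import numpy as np

def accelerated_gradient(grad, f, x0, iters=100, L0=1.0, eta=2.0):
    """Nesterov-style accelerated gradient with backtracking line search.

    Generic sketch (not the paper's exact methods): the step size 1/L is
    found by backtracking on a local quadratic upper bound instead of
    using the global Lipschitz constant.
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0   # momentum parameter
    L = L0    # current estimate of the local Lipschitz constant
    for _ in range(iters):
        g = grad(y)
        # Backtracking: increase L until the quadratic upper bound
        # f(y) + <g, x+ - y> + (L/2)||x+ - y||^2 holds at the trial point.
        while True:
            x_new = y - g / L
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d):
                break
            L *= eta
        # Standard Nesterov momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

For a simple quadratic, e.g. f(x) = ½‖x‖², the iterates converge rapidly to the minimizer at the origin; the line search keeps the effective step size large whenever the local curvature allows it.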
Pages: 24