Smoothing Accelerated Proximal Gradient Method with Fast Convergence Rate for Nonsmooth Convex Optimization Beyond Differentiability

Cited by: 1
Authors
Wu, Fan [1]
Bian, Wei [1,2]
Affiliations
[1] Harbin Inst Technol, Sch Math, Harbin 150001, Peoples R China
[2] Harbin Inst Technol, Inst Adv Study Math, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Nonsmooth optimization; Smoothing method; Accelerated algorithm with extrapolation; Convergence rate; Sequential convergence; MONOTONE-OPERATORS; WEAK-CONVERGENCE; ALGORITHM; MINIMIZATION
DOI
10.1007/s10957-023-02176-6
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
We propose a smoothing accelerated proximal gradient (SAPG) method with a fast convergence rate for finding a minimizer of a decomposable nonsmooth convex function over a closed convex set. The proposed algorithm combines the smoothing method with the proximal gradient algorithm, using the extrapolation coefficient $(k-1)/(k+\alpha-1)$ with $\alpha > 3$. The updating rule for the smoothing parameter $\mu_k$ is carefully designed and guarantees a global convergence rate of $o(\ln^{\sigma}k/k)$, with $\sigma \in (\frac{1}{2}, 1]$, on the objective function values. Moreover, we prove that the sequence of iterates converges to an optimal solution of the problem. We then introduce an error term into the SAPG algorithm to obtain an inexact smoothing accelerated proximal gradient algorithm, which attains the same convergence results as SAPG under a summability condition on the errors. Finally, numerical experiments demonstrate the effectiveness and efficiency of the proposed algorithm.
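To make the method described in the abstract concrete, here is a minimal Python sketch of one smoothing accelerated proximal gradient loop, applied to the illustrative problem of minimizing $\|Ax - b\|_1$ over a box. Only the extrapolation weight $(k-1)/(k+\alpha-1)$ is taken from the abstract; the Huber-type smoothing, the step size $1/L_k$, and the simple decay $\mu_k = \mu_0/k$ are assumptions made here for concreteness, since the paper's own updating rule for $\mu_k$ is what yields the $o(\ln^{\sigma}k/k)$ rate. This is a sketch, not the authors' exact algorithm.

```python
import numpy as np

def grad_smoothed_l1(A, b, x, mu):
    """Gradient of the Huber-type smoothing of ||Ax - b||_1 with parameter mu."""
    r = A @ x - b
    return A.T @ np.clip(r / mu, -1.0, 1.0)

def sapg(A, b, lo, hi, alpha=4.0, mu0=1.0, iters=2000):
    """Smoothing APG sketch: minimize a smoothed ||Ax - b||_1 over the box [lo, hi]."""
    n = A.shape[1]
    x_prev = x = np.zeros(n)
    normA2 = np.linalg.norm(A, 2) ** 2        # squared spectral norm of A
    for k in range(1, iters + 1):
        mu = mu0 / k                          # illustrative decay (assumption);
                                              # the paper's mu_k rule is more refined
        beta = (k - 1.0) / (k + alpha - 1.0)  # extrapolation weight from the abstract
        y = x + beta * (x - x_prev)           # extrapolated point
        L = normA2 / mu                       # Lipschitz constant of the smoothed gradient
        z = y - grad_smoothed_l1(A, b, y, mu) / L
        x_prev, x = x, np.clip(z, lo, hi)     # proximal step = projection onto the box
    return x

# Tiny usage example with synthetic data
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.5, 0.0, -0.3, 0.0, 0.2]) + 0.01 * rng.standard_normal(20)
print(sapg(A, b, lo=-1.0, hi=1.0))
```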
Pages: 539-572
Number of pages: 34
Related Papers
50 in total
  • [31] Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
    Hanzely, Filip
    Richtárik, Peter
    Xiao, Lin
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 79 (02) : 405 - 440
  • [32] ON CONVERGENCE ANALYSIS OF DUAL PROXIMAL-GRADIENT METHODS WITH APPROXIMATE GRADIENT FOR A CLASS OF NONSMOOTH CONVEX MINIMIZATION PROBLEMS
    Liu, Sanming
    Wang, Zhijie
    Liu, Chongyang
    [J]. JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2016, 12 (01) : 389 - 402
  • [34] Nesterov Accelerated Shuffling Gradient Method for Convex Optimization
    Tran, Trang H.
    Scheinberg, Katya
    Nguyen, Lam M.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [35] On the convergence properties of the projected gradient method for convex optimization
    Iusem, A. N.
    [J]. COMPUTATIONAL & APPLIED MATHEMATICS, 2003, 22 (01) : 37 - 52
  • [36] A note on the accelerated proximal gradient method for nonconvex optimization
    Wang, Huijuan
    Xu, Hong-Kun
    [J]. CARPATHIAN JOURNAL OF MATHEMATICS, 2018, 34 (03) : 449 - 457
  • [37] A DELAYED PROXIMAL GRADIENT METHOD WITH LINEAR CONVERGENCE RATE
    Feyzmahdavian, Hamid Reza
    Aytekin, Arda
    Johansson, Mikael
    [J]. 2014 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2014
  • [38] On the linear convergence rate of Riemannian proximal gradient method
    Choi, Woocheol
    Chun, Changbum
    Jung, Yoon Mo
    Yun, Sangwoon
    [J]. OPTIMIZATION LETTERS, 2024
  • [39] A new inexact gradient descent method with applications to nonsmooth convex optimization
    Khanh, Pham Duy
    Mordukhovich, Boris S.
    Tran, Dat Ba
    [J]. OPTIMIZATION METHODS & SOFTWARE, 2024