Proximal Gradient Method with Extrapolation and Line Search for a Class of Non-convex and Non-smooth Problems

Cited by: 3
Authors
Yang, Lei [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangdong Prov Key Lab Computat Sci, Guangzhou, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Proximal gradient method; Extrapolation; Non-monotone; Line search; Stationary point; KL property; VARIABLE SELECTION; ALGORITHM; CONVEX; MINIMIZATION; OPTIMIZATION; CONVERGENCE; FACTORIZATION; SHRINKAGE;
DOI
10.1007/s10957-023-02348-4
Chinese Library Classification
C93 [Management Science]; O22 [Operations Research];
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
In this paper, we consider a class of possibly non-convex and non-smooth optimization problems arising in many contemporary applications such as machine learning, variable selection and image processing. To solve this class of problems, we propose a proximal gradient method with extrapolation and line search (PGels). This method is built on a special potential function and successfully incorporates both extrapolation and non-monotone line search, two simple and efficient acceleration techniques for the proximal gradient method. Thanks to the non-monotone line search, the method allows more flexibility in choosing the extrapolation parameters and updates them adaptively at each iteration whenever a certain criterion is not satisfied. Moreover, with proper choices of parameters, PGels reduces to many existing algorithms. We also show that, under some mild conditions, our line search criterion is well defined and any cluster point of the sequence generated by PGels is a stationary point of our problem. In addition, by making assumptions on the Kurdyka-Lojasiewicz exponent of the objective in our problem, we further analyze the local convergence rate of two special cases of PGels, including the widely used non-monotone proximal gradient method as one case. Finally, we conduct some preliminary numerical experiments on the ℓ1-regularized logistic regression problem and the ℓ1-2-regularized least squares problem. The obtained numerical results show the promising performance of PGels and validate the potential advantage of combining the two acceleration techniques.
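To make the two acceleration ingredients concrete, the sketch below combines a FISTA-style extrapolation step with a nonmonotone acceptance test for the ℓ1-regularized least squares model min_x 0.5‖Ax − b‖² + λ‖x‖₁. It is a minimal illustration of the general idea only: the acceptance criterion, the adaptive damping of the extrapolation weight, and all parameter names are simplified stand-ins, not the paper's exact PGels scheme.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pg_extrapolation_ls(A, b, lam, max_iter=500, mem=5, sigma=1e-4):
    """Proximal gradient with extrapolation and a nonmonotone line search
    for min_x 0.5*||A x - b||^2 + lam*||x||_1.
    Illustrative sketch only, not a faithful reproduction of PGels."""
    n = A.shape[1]
    x_prev = x = np.zeros(n)
    t_mom = 1.0
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    F = lambda z: 0.5 * np.sum((A @ z - b) ** 2) + lam * np.sum(np.abs(z))
    hist = [F(x)]                        # objective history for the nonmonotone test
    for _ in range(max_iter):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_mom ** 2))
        beta = (t_mom - 1.0) / t_next    # FISTA-style extrapolation weight
        for _ in range(100):             # nonmonotone line-search loop
            y = x + beta * (x - x_prev)  # extrapolated point
            g = A.T @ (A @ y - b)        # gradient of the smooth part at y
            x_new = soft_threshold(y - step * g, step * lam)
            # accept if the objective beats the max of the last `mem` values
            if F(x_new) <= max(hist[-mem:]) - sigma * np.sum((x_new - x) ** 2):
                break
            # criterion failed: first damp the extrapolation, then the step
            if beta > 1e-12:
                beta *= 0.5
            else:
                step *= 0.5
        x_prev, x = x, x_new
        t_mom = t_next
        hist.append(F(x))
    return x
```

Because the acceptance test compares against the maximum of the last few objective values rather than the previous one, the method may accept a temporary increase caused by an aggressive extrapolation step; only when even that fails does it damp the extrapolation weight, mirroring the adaptive updates described in the abstract.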
Pages: 68-103
Page count: 36
Related Papers
50 records in total
  • [1] Proximal Gradient Method with Extrapolation and Line Search for a Class of Non-convex and Non-smooth Problems
    Lei Yang
    Journal of Optimization Theory and Applications, 2024, 200 : 68 - 103
  • [2] A proximal gradient method for control problems with non-smooth and non-convex control cost
    Natemeyer, Carolin
    Wachsmuth, Daniel
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 80 (02) : 639 - 677
  • [4] Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems
    Yang, Yingzhen
    Yu, Jiahui
    35TH UNCERTAINTY IN ARTIFICIAL INTELLIGENCE CONFERENCE (UAI 2019), 2020, 115 : 1253 - 1262
  • [5] Inexact Proximal Gradient Methods for Non-Convex and Non-Smooth Optimization
    Gu, Bin
    Wang, De
    Huo, Zhouyuan
    Huang, Heng
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 3093 - 3100
  • [6] Convergence guarantees for a class of non-convex and non-smooth optimization problems
    Khamaru, Koulik
    Wainwright, Martin J.
    Journal of Machine Learning Research, 2019, 20
  • [7] Convergence guarantees for a class of non-convex and non-smooth optimization problems
    Khamaru, Koulik
    Wainwright, Martin J.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [9] Analysis of the gradient method with an Armijo-Wolfe line search on a class of non-smooth convex functions
    Asl, Azam
    Overton, Michael L.
    OPTIMIZATION METHODS & SOFTWARE, 2020, 35 (02): : 223 - 242
  • [10] Effective Proximal Methods for Non-convex Non-smooth Regularized Learning
    Liang, Guannan
    Tong, Qianqian
    Ding, Jiahao
    Pan, Miao
    Bi, Jinbo
    20TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2020), 2020, : 342 - 351