Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Cited by: 0
Authors
Hu, Mengqi [1 ]
Lou, Yifei [2 ]
Wang, Bao [3 ]
Yan, Ming [4 ,5 ,6 ]
Yang, Xiu [1 ]
Ye, Qiang [7 ]
Affiliations
[1] Lehigh Univ, Dept Ind & Syst Engn, 200 West Packer Ave, Bethlehem, PA 18015 USA
[2] Univ Texas Dallas, Dept Math Sci, 800 W Campbell Rd, Richardson, TX 75080 USA
[3] Univ Utah, Imaging Inst, Dept Math & Sci Comp, 72 Cent Campus Dr, Salt Lake City, UT 84102 USA
[4] Chinese Univ Hong Kong Shenzhen, Sch Data Sci, 2001 Longxiang Blvd, Shenzhen, Guangdong, Peoples R China
[5] Michigan State Univ, Dept Computat Math Sci & Engn, 428 South Shaw Lane, E Lansing, MI 48824 USA
[6] Michigan State Univ, Dept Math, 428 South Shaw Lane, E Lansing, MI 48824 USA
[7] Univ Kentucky, Dept Math, Lexington, KY 40513 USA
Keywords
Accelerated gradient momentum; Operator splitting; Fixed step size; Convergence rate; Global convergence; Thresholding algorithm; Minimization; ℓ1; Representation; Difference; Selection; Restart
DOI
10.1007/s10915-023-02148-y
CLC number
O29 [Applied Mathematics]
Discipline code
070104
Abstract
This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function, and minimizing the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is incorporated with an operator-splitting technique to handle the non-smooth function. We use the convex ℓ1 and the nonconvex ℓ1 − ℓ2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
Pages: 21
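To make the second setting of the abstract concrete, below is a minimal Python sketch of proximal gradient descent for the ℓ1-regularized least-squares problem min_x 0.5‖Ax − b‖² + λ‖x‖₁, where the momentum weight is chosen in the spirit of nonlinear conjugate gradient (a Fletcher-Reeves-style ratio of successive gradient norms) and the step size is fixed at 1/L with L = ‖A‖². The specific momentum formula, the cap on the weight, and all function names are illustrative assumptions, not the paper's exact update rule.

```python
import numpy as np


def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def prox_grad_ncg_momentum(A, b, lam, n_iter=500):
    """Proximal gradient descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1
    with a Fletcher-Reeves-style momentum weight and a fixed step size.
    Illustrative sketch only; not the paper's exact algorithm."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    step = 1.0 / L                     # fixed step size: no line search needed
    x = np.zeros(n)
    x_prev = x.copy()
    g_norm2_prev = np.inf              # forces zero momentum on the first iteration
    for _ in range(n_iter):
        g = A.T @ (A @ x - b)          # gradient of the smooth part at x
        g_norm2 = g @ g
        # Momentum weight in the spirit of nonlinear CG (Fletcher-Reeves ratio),
        # capped at 1 for stability (an assumed safeguard, acting like a restart).
        beta = min(g_norm2 / max(g_norm2_prev, 1e-30), 1.0)
        y = x + beta * (x - x_prev)    # extrapolation (momentum) step
        x_prev = x
        # Gradient step at the extrapolated point, then the l1 proximal step.
        x = soft_threshold(y - step * (A.T @ (A @ y - b)), step * lam)
        g_norm2_prev = g_norm2
    return x
```

A typical call would be x_hat = prox_grad_ncg_momentum(A, b, lam=0.1) for measurements b = A @ x_true + noise. Capping the momentum weight at 1 when the gradient norm grows plays a role similar to the restart strategies listed among the keywords above.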