Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

Times Cited: 0
Authors
Hu, Mengqi [1 ]
Lou, Yifei [2 ]
Wang, Bao [3 ]
Yan, Ming [4 ,5 ,6 ]
Yang, Xiu [1 ]
Ye, Qiang [7 ]
Affiliations
[1] Lehigh Univ, Dept Ind & Syst Engn, 200 West Packer Ave, Bethlehem, PA 18015 USA
[2] Univ Texas Dallas, Dept Math Sci, 800 W Campbell Rd, Richardson, TX 75080 USA
[3] Univ Utah, Imaging Inst, Dept Math & Sci Comp, 72 Cent Campus Dr, Salt Lake City, UT 84102 USA
[4] Chinese Univ Hong Kong (Shenzhen), Sch Data Sci, 2001 Longxiang Blvd, Shenzhen, Guangdong, Peoples R China
[5] Michigan State Univ, Dept Computat Math Sci & Engn, 428 South Shaw Lane, E Lansing, MI 48824 USA
[6] Michigan State Univ, Dept Math, 428 South Shaw Lane, E Lansing, MI 48824 USA
[7] Univ Kentucky, Dept Math, Lexington, KY 40513 USA
Keywords
Accelerated gradient momentum; Operator splitting; Fixed step size; Convergence rate; Global convergence; Thresholding algorithm; Minimization; ℓ1; Representation; Difference; Selection; Restart
DOI
10.1007/s10915-023-02148-y
CLC Number
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: minimizing a (single) differentiable function, and minimizing the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid the traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is further combined with an operator-splitting technique to handle the non-smooth function. We use the convex ℓ1 and the nonconvex ℓ1 − ℓ2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.
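To make the idea in the abstract concrete, the Python sketch below combines a proximal-gradient (operator-splitting) step for the ℓ1-regularized least-squares problem min_x 0.5‖Ax − b‖² + λ‖x‖₁ with a Fletcher-Reeves-style conjugate-gradient momentum on the search direction, a fixed step size 1/L in place of a line search, and a simple restart safeguard. This is a minimal illustration under those assumptions, not the authors' algorithm; the function name prox_grad_ncg_momentum, the choice of the Fletcher-Reeves weight, and the restart test are all hypothetical.

# Minimal sketch (assumption-laden, NOT the authors' exact algorithm): a proximal-gradient
# / operator-splitting iteration for  min_x 0.5*||Ax - b||^2 + lam*||x||_1, where the
# gradient step follows a direction with a Fletcher-Reeves-style nonlinear-conjugate-gradient
# momentum, a fixed step size 1/L (no line search), and a restart safeguard.
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad_ncg_momentum(A, b, lam, n_iter=500):
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient of 0.5*||Ax - b||^2
    step = 1.0 / L                        # fixed step size instead of a line search
    x = np.zeros(n)
    g_old = A.T @ (A @ x - b)             # gradient of the smooth part at x
    d = -g_old                            # start with the steepest-descent direction
    for _ in range(n_iter):
        # gradient-type step along d, then the proximal step for lam * ||.||_1
        x = soft_threshold(x + step * d, step * lam)
        g = A.T @ (A @ x - b)
        beta = (g @ g) / (g_old @ g_old)  # Fletcher-Reeves-type momentum weight
        d = -g + beta * d                 # conjugate-gradient-style direction update
        if g @ d >= 0:                    # restart safeguard: fall back to steepest descent
            d = -g
        g_old = g
    return x

# Small usage example on a synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[3, 17, 60]] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = prox_grad_ncg_momentum(A, b, lam=0.05)
print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.05 * np.abs(x_hat).sum())

Setting beta to zero in every iteration recovers the classical ISTA/proximal-gradient update; the momentum term and the restart test are one plausible way to inject conjugate-gradient-style acceleration while keeping the fixed step size.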
Pages: 21
Related Articles
50 records in total (entries [31]-[40] shown)
  • [31] A descent Dai-Liao conjugate gradient method for nonlinear equations
    Abubakar, Auwal Bala
    Kumam, Poom
    NUMERICAL ALGORITHMS, 2019, 81 : 197 - 210
  • [32] Sparse GCA and Thresholded Gradient Descent
    Gao, Sheng
    Ma, Zongming
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [33] ON THE UNIFIED DESIGN OF ACCELERATED GRADIENT DESCENT
    Chen, Yuquan
    Wei, Yiheng
    Wang, Yong
    Chen, YangQuan
    PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, 2019, VOL 9
  • [34] Accelerated Distributed Nesterov Gradient Descent
    Qu, Guannan
    Li, Na
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2020, 65 (06) : 2566 - 2581
  • [35] Hybridization of accelerated gradient descent method
    Petrović, Milena
    Rakočević, Vladimir
    Kontrec, Nataša
    Panić, Stefan
    Ilić, Dejan
    NUMERICAL ALGORITHMS, 2018, 79 (03) : 769 - 786
  • [36] Algorithmic Instabilities of Accelerated Gradient Descent
    Attia, Amit
    Koren, Tomer
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [38] Conjugate Gradient Hard Thresholding Pursuit Algorithm for Sparse Signal Recovery
    Zhang, Yanfeng
    Huang, Yunbao
    Li, Haiyan
    Li, Pu
    Fan, Xi'an
    ALGORITHMS, 2019, 12 (02)
  • [39] Accelerated Singular Value Decomposition (ASVD) using momentum based Gradient Descent Optimization
    Raghuwanshi, Sandeep Kumar
    Pateriya, Rajesh Kumar
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2021, 33 (04) : 447 - 452
  • [40] A Sparse Conjugate Gradient Adaptive Filter
    Lee, Ching-Hua
    Rao, Bhaskar D.
    Garudadri, Harinath
    IEEE SIGNAL PROCESSING LETTERS, 2020, 27 : 1000 - 1004