Newton-Type Optimal Thresholding Algorithms for Sparse Optimization Problems

Cited: 3
|
Authors
Meng, Nan [1 ]
Zhao, Yun-Bin [2 ]
Affiliations
[1] Univ Birmingham, Sch Math, Birmingham B15 2TT, W Midlands, England
[2] Chinese Univ Hong Kong, Shenzhen Res Inst Big Data, Shenzhen 518172, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Compressed sensing; Sparse optimization; Newton-type methods; Optimal k-thresholding; Restricted isometry property; SUBSPACE PURSUIT; SIGNAL RECOVERY; COSAMP; NUMBER;
DOI
10.1007/s40305-021-00370-9
Chinese Library Classification (CLC) codes
C93 [Management Science]; O22 [Operations Research];
Subject classification codes
070105; 12; 1201; 1202; 120202;
Abstract
Sparse signals can be reconstructed by algorithms that merge a traditional nonlinear optimization method with a thresholding technique. Unlike existing thresholding methods, the optimal k-thresholding technique recently proposed by Zhao (SIAM J Optim 30(1):31-55, 2020) simultaneously performs minimization of an error metric for the problem and thresholding of the iterates generated by the classic gradient method. In this paper, we propose the Newton-type optimal k-thresholding (NTOT) algorithm, motivated by the appreciable performance of both Newton-type methods and the optimal k-thresholding technique for signal recovery. The guaranteed performance (including convergence) of the proposed algorithms is established under suitable choices of the algorithmic parameters and the restricted isometry property (RIP) of the sensing matrix, which is widely used in the analysis of compressed sensing algorithms. Simulation results on synthetic signals indicate that the proposed algorithms are stable and efficient for signal recovery.
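The abstract describes the general "gradient step followed by k-thresholding" framework that the paper builds on. Below is a minimal, illustrative sketch of that framework. It is NOT the paper's NTOT algorithm: it uses plain hard thresholding (keeping the k largest-magnitude entries) as a simplified stand-in for the optimal k-thresholding step, which instead selects the k entries to retain by minimizing the residual error over binary weight vectors. The function name and parameters are assumptions for illustration only.

```python
import numpy as np

def gradient_thresholding_sketch(A, y, k, iters=300, step=None):
    """Illustrative gradient-plus-thresholding loop for sparse recovery.

    Simplified stand-in for the thresholding framework described in the
    abstract: a gradient step on the error metric 0.5 * ||y - A x||^2,
    followed by hard thresholding to keep the iterate k-sparse.
    (The optimal k-thresholding of Zhao (2020) would replace the
    hard-thresholding step with a small optimization problem.)
    """
    m, n = A.shape
    if step is None:
        # conservative step size based on the spectral norm of A
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        # gradient step on the residual error
        u = x + step * (A.T @ (y - A @ x))
        # hard thresholding: keep the k largest-magnitude entries
        idx = np.argsort(np.abs(u))[-k:]
        x = np.zeros(n)
        x[idx] = u[idx]
    return x
```

Under the RIP conditions mentioned in the abstract, iterations of this type can be shown to converge to (a neighborhood of) the true k-sparse signal; the paper's contribution is to replace the plain gradient step with a Newton-type step combined with the optimal k-thresholding selection.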
Pages: 447-469
Page count: 23
Related papers
50 records total
  • [21] INEXACT NEWTON-TYPE OPTIMIZATION WITH ITERATED SENSITIVITIES
    Quirynen, Rien
    Gros, Sebastien
    Diehl, Moritz
    SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (01) : 74 - 95
  • [22] Examples of dual behaviour of Newton-type methods on optimization problems with degenerate constraints
    Izmailov, A. F.
    Solodov, M. V.
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2009, 42 (02) : 231 - 264
  • [23] A proximal Newton-type method for equilibrium problems
    P. J. S. Santos
    P. S. M. Santos
    S. Scheimberg
    Optimization Letters, 2018, 12 : 997 - 1009
  • [24] Gauss–Newton-type methods for bilevel optimization
    Jörg Fliege
    Andrey Tin
    Alain Zemkoho
    Computational Optimization and Applications, 2021, 78 : 793 - 824
  • [25] DINO: Distributed Newton-Type Optimization Method
    Crane, Rixon
    Roosta, Fred
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [26] Douglas–Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms
    Andreas Themelis
    Lorenzo Stella
    Panagiotis Patrinos
    Computational Optimization and Applications, 2022, 82 : 395 - 440
  • [27] NEWTON-TYPE CURVILINEAR SEARCH METHOD FOR OPTIMIZATION
    BOTSARIS, CA
    JACOBSON, DH
    JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 1976, 54 (01) : 217 - 229
  • [28] NEWTON-TYPE ALGORITHMS WITH NONMONOTONE LINE SEARCH FOR LARGE-SCALE UNCONSTRAINED OPTIMIZATION
    GRIPPO, L
    LAMPARIELLO, F
    LUCIDI, S
    LECTURE NOTES IN CONTROL AND INFORMATION SCIENCES, 1988, 113 : 187 - 196
  • [29] A Greedy Newton-Type Method for Multiple Sparse Constraint Problem
    Sun, Jun
    Kong, Lingchen
    Qu, Biao
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2023, 196 (03) : 829 - 854
  • [30] Newton-Type Methods with the Proximal Gradient Step for Sparse Estimation
    Shimmura R.
    Suzuki J.
    Operations Research Forum, 5 (2)