Lagrange Programming Neural Network for Nondifferentiable Optimization Problems in Sparse Approximation

Cited by: 50
Authors
Feng, Ruibin [1 ]
Leung, Chi-Sing [1 ]
Constantinides, Anthony G. [2 ]
Zeng, Wen-Jun [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Hong Kong, Peoples R China
[2] Imperial Coll London, Commun & Signal Proc, London SW7 2AZ, England
Keywords
Lagrange programming neural networks (LPNNs); locally competitive algorithm (LCA); optimization; VARIATIONAL-INEQUALITIES; CONSTRAINED OPTIMIZATION; QUADRATIC OPTIMIZATION; ATOMIC DECOMPOSITION; BOUND CONSTRAINTS; CONVERGENCE; SYSTEMS; MINIMIZATION; STABILITY; REAL;
DOI
10.1109/TNNLS.2016.2575860
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The major limitation of the Lagrange programming neural network (LPNN) approach is that the objective function and the constraints must be twice differentiable. Since sparse approximation involves nondifferentiable functions, the original LPNN approach is not suitable for recovering sparse signals. This paper proposes a new formulation of the LPNN approach based on the concept of the locally competitive algorithm (LCA). Unlike the classical LCA approach, which can solve only unconstrained optimization problems, the proposed LPNN approach can solve constrained optimization problems. Two problems in sparse approximation are considered: basis pursuit (BP) and constrained BP denoising (CBPDN). We propose two LPNN models, namely, BP-LPNN and CBPDN-LPNN, to solve these two problems. For both models, we show that the equilibrium points of the models are the optimal solutions of the corresponding problems, and vice versa. In addition, the equilibrium points are stable. Simulations verify the effectiveness of the two LPNN models.
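The abstract does not state the network equations, but the general idea of combining LCA-style soft-thresholding neurons with Lagrange neurons for basis pursuit (min ||x||_1 subject to Ax = b) can be sketched numerically. The specific dynamics below, the threshold `alpha`, the Euler step size, and the augmented damping term `c*(Ax - b)` are illustrative assumptions for this sketch, not the exact BP-LPNN model of the paper.

```python
import numpy as np

def soft(u, alpha):
    """Soft-thresholding: the LCA activation linking internal state u to output x."""
    return np.sign(u) * np.maximum(np.abs(u) - alpha, 0.0)

def bp_lpnn_sketch(A, b, alpha=0.05, c=1.0, dt=0.01, iters=60000):
    """Euler-discretized LPNN/LCA-style dynamics for basis pursuit:
        min ||x||_1  s.t.  Ax = b.
    u  : internal states of the variable neurons, with x = soft(u, alpha)
    lam: states of the Lagrange neurons (one per constraint).
    The term c*(Ax - b) is an augmented-Lagrangian damping term added here
    only for numerical robustness of the forward-Euler simulation."""
    m, n = A.shape
    u = np.zeros(n)
    lam = np.zeros(m)
    for _ in range(iters):
        x = soft(u, alpha)
        r = A @ x - b                                # constraint residual
        u += dt * (x - u - A.T @ (lam + c * r))      # variable-neuron dynamics
        lam += dt * r                                # Lagrange-neuron dynamics
    return soft(u, alpha), lam

# Small synthetic BP instance: recover a 3-sparse signal from 20 measurements.
rng = np.random.default_rng(0)
m, n, k = 20, 40, 3
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)                       # unit-norm dictionary atoms
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
b = A @ x0

x, lam = bp_lpnn_sketch(A, b)
```

At an equilibrium of these dynamics, u - x = -A^T(lam) lies in the subdifferential of alpha*||x||_1 and Ax = b, which is exactly the KKT condition of the BP problem, mirroring the abstract's claim that equilibrium points coincide with optimal solutions.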
Pages: 2395-2407
Page count: 13
Related Papers
50 records
  • [41] On Nondifferentiable and Nonconvex Vector Optimization Problems
    Q. H. Ansari
    J. C. Yao
    Journal of Optimization Theory and Applications, 2000, 106 : 475 - 488
  • [42] Optimal multi-reservoir network control by augmented Lagrange programming neural network
    Sharma, V.
    Jha, R.
    Naresh, R.
    APPLIED SOFT COMPUTING, 2007, 7 (03) : 783 - 790
  • [43] SparsePOP - A sparse semidefinite programming relaxation of polynomial optimization problems
    Waki, Hayato
    Kim, Sunyoung
    Kojima, Masakazu
    Muramatsu, Masakazu
    Sugimoto, Hiroshi
    ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 2008, 35 (02): : 1 - 13
  • [44] Hardness results for neural network approximation problems
    Bartlett, P
    Ben-David, S
    COMPUTATIONAL LEARNING THEORY, 1999, 1572 : 50 - 62
  • [45] Hardness results for neural network approximation problems
    Bartlett, PL
    Ben-David, S
    THEORETICAL COMPUTER SCIENCE, 2002, 284 (01) : 53 - 66
  • [46] An outer-approximation guided optimization approach for constrained neural network inverse problems
    Myun-Seok Cheon
    Mathematical Programming, 2022, 196 : 173 - 202
  • [47] An outer-approximation guided optimization approach for constrained neural network inverse problems
    Cheon, Myun-Seok
    MATHEMATICAL PROGRAMMING, 2022, 196 (1-2) : 173 - 202
  • [48] A smooth gradient approximation neural network for general constrained nonsmooth nonconvex optimization problems
    Liu, Na
    Jia, Wenwen
    Qin, Sitian
    NEURAL NETWORKS, 2025, 184
  • [49] A neural network for solving nonlinear programming problems
    Chen, KZ
    Leung, Y
    Leung, KS
    Gao, XB
    NEURAL COMPUTING & APPLICATIONS, 2002, 11 (02): : 103 - 111
  • [50] Delayed Lagrange neural network for sparse signal reconstruction under compressive sampling
    Li, Yuan-Min
    Wei, Deyun
    OPTIK, 2016, 127 (18): : 7077 - 7082