Neurodynamic Algorithms With Finite/Fixed-Time Convergence for Sparse Optimization via l1 Regularization

Cited by: 5
Authors
Wen, Hongsong [1 ]
He, Xing [1 ]
Huang, Tingwen [2 ]
Yu, Junzhi [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligent, Chongqing 400715, Peoples R China
[2] Texas A&M Univ Qatar, Dept Math, Doha, Qatar
[3] Peking Univ, Coll Engn, Dept Adv Mfg & Robot, State Key Lab Turbulence & Complex Syst, Beijing 100871, Peoples R China
[4] Peking Univ, Nanchang Innovat Inst, Nanchang 330224, Peoples R China
Keywords
l1 regularization; finite-time stability (FTS); fixed-time stability (FxTS); locally competitive algorithm (LCA); sparse optimization; signal reconstruction; recovery; projection
DOI
10.1109/TSMC.2023.3304850
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
Sparse optimization problems have been successfully applied in a wide range of research areas, and previous work has yielded useful insights and elegant methods for proving the stability and convergence of neurodynamic algorithms. This article develops several neurodynamic algorithms for sparse signal recovery by solving the l1 regularization problem. First, within the framework of the locally competitive algorithm (LCA), a modified LCA (MLCA) with finite-time convergence and an MLCA with fixed-time convergence are designed. Then, the sliding-mode control (SMC) technique is introduced and modified into a modified SMC (MSMC), which is combined with the LCA to design an MSMC-LCA with finite-time convergence and an MSMC-LCA with fixed-time convergence. It is shown that the solutions of the proposed neurodynamic algorithms exist and are unique when the observation matrix satisfies the restricted isometry property (RIP) condition, while finite-time or fixed-time convergence to the optimal points is established via Lyapunov-based analysis. In addition, combining the notions of finite-time stability (FTS) and fixed-time stability (FxTS), upper bounds on the convergence time of the proposed neurodynamic algorithms are given, and the convergence results obtained for the MLCA and MSMC-LCA with fixed-time convergence are shown to be independent of the initial conditions. Finally, simulation experiments on signal recovery and image recovery demonstrate the superior performance of the proposed neurodynamic algorithms.
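For context, the baseline that the proposed MLCA and MSMC-LCA variants build on is the classical LCA, a neurodynamic ODE whose equilibria solve the l1-regularized least-squares (lasso) problem min_a (1/2)||y - Phi a||^2 + lam*||a||_1. The sketch below is NOT the paper's modified algorithm; it is a minimal Euler-integrated version of the standard LCA dynamics, with illustrative names (`lca_l1`, `soft_threshold`) and parameter values chosen by the editor for the demo.

```python
import numpy as np

def soft_threshold(u, lam):
    # LCA activation: the proximal operator of lam*||.||_1
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_l1(Phi, y, lam=0.02, tau=0.05, steps=2000):
    """Forward-Euler integration of the classical LCA ODE
         du/dt = Phi^T y - u - (Phi^T Phi - I) a,   a = soft_threshold(u, lam),
       whose fixed points are minimizers of the l1-regularized problem."""
    n = Phi.shape[1]
    u = np.zeros(n)                # internal membrane state
    G = Phi.T @ Phi - np.eye(n)    # lateral-inhibition matrix
    b = Phi.T @ y                  # constant driving input
    for _ in range(steps):
        a = soft_threshold(u, lam)
        u += tau * (b - u - G @ a)
    return soft_threshold(u, lam)

# demo: recover a 3-sparse signal from m = 40 random measurements
rng = np.random.default_rng(0)
n, m = 100, 40
Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # approx. unit-norm columns
x = np.zeros(n)
x[[5, 37, 81]] = [1.0, -0.8, 1.3]
y = Phi @ x
x_hat = lca_l1(Phi, y)
```

With a small enough step `tau`, the state converges to a lasso solution, so the recovered coefficients carry the usual shrinkage bias of order `lam`; the finite-/fixed-time designs in the article replace this asymptotic convergence with explicit convergence-time bounds.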
Pages: 131-142 (12 pages)
Related Papers (50 total)
  • [21] Dias, Madson L. D.; Freire, Ananda L.; Souza Junior, Amauri H.; da Rocha Neto, Ajalmar R.; Gomes, Joao P. P. Sparse minimal learning machines via l1/2 norm regularization. 2018 7th Brazilian Conference on Intelligent Systems (BRACIS), 2018: 206-211.
  • [22] Xu, Jing; Li, Chuandong; He, Xing; Wen, Hongsong; Zhang, Xiaoyu. A fixed-time converging neurodynamic approach with time-varying coefficients for l1-minimization problem. Information Sciences, 2024, 654.
  • [23] Hong, Huifen; Yu, Wenwu; Jiang, Guo-Ping; Wang, He. Fixed-time algorithms for time-varying convex optimization. IEEE Transactions on Circuits and Systems II: Express Briefs, 2023, 70(2): 616-620.
  • [24] Ding, Liang; Han, Weimin. αl1 - βl2 regularization for sparse recovery. Inverse Problems, 2019, 35(12).
  • [25] Liu, Qianru; Wang, Rui; Xu, Yuesheng; Yan, Mingsong. Parameter choices for sparse regularization with the l1 norm. Inverse Problems, 2023, 39(2).
  • [26] Huang, Haiping. Sparse Hopfield network reconstruction with l1 regularization. European Physical Journal B, 2013, 86(11).
  • [27] Xu, Jing; Li, Chuandong; He, Xing; Wen, Hongsong; Ju, Xingxing. A fixed-time proximal gradient neurodynamic network with time-varying coefficients for composite optimization problems and sparse optimization problems with log-sum function. IEEE Transactions on Neural Networks and Learning Systems, 2024.
  • [28] Chen, Yuquan; Wang, Fumian; Wang, Bing. Fixed-time convergence in continuous-time optimization: a fractional approach. IEEE Control Systems Letters, 2023, 7: 631-636.
  • [29] Yang, Dakun; Liu, Yan. L1/2 regularization learning for smoothing interval neural networks: algorithms and convergence analysis. Neurocomputing, 2018, 272: 122-129.
  • [30] Zheng, Li; Jiang, Aimin; Kwan, Hon Keung. Sparse FIR filter design via partial L1 optimization. 2017 IEEE International Symposium on Circuits and Systems (ISCAS), 2017.