Neurodynamic Algorithms With Finite/Fixed-Time Convergence for Sparse Optimization via l1 Regularization

Cited by: 5
Authors
Wen, Hongsong [1 ]
He, Xing [1 ]
Huang, Tingwen [2 ]
Yu, Junzhi [3 ,4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligent, Chongqing 400715, Peoples R China
[2] Texas A&M Univ Qatar, Dept Math, Doha, Qatar
[3] Peking Univ, Coll Engn, Dept Adv Mfg & Robot, State Key Lab Turbulence & Complex Syst, Beijing 100871, Peoples R China
[4] Peking Univ, Nanchang Innovat Inst, Nanchang 330224, Peoples R China
Keywords
l1 regularization; finite-time stability (FTS); fixed-time stability (FxTS); locally competitive algorithm (LCA); sparse optimization; signal reconstruction; recovery; projection
DOI
10.1109/TSMC.2023.3304850
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Sparse optimization problems arise in a wide range of research areas, and previous work has yielded useful insights and elegant methods for proving the stability and convergence of neurodynamic algorithms. This article develops several neurodynamic algorithms for sparse signal recovery by solving the l1 regularization problem. First, within the framework of the locally competitive algorithm (LCA), a modified LCA (MLCA) with finite-time convergence and an MLCA with fixed-time convergence are designed. Then, the sliding-mode control (SMC) technique is introduced and modified, yielding a modified SMC (MSMC) that is combined with the LCA to design an MSMC-LCA with finite-time convergence and an MSMC-LCA with fixed-time convergence. The solutions of the proposed neurodynamic algorithms are shown to exist and to be unique provided the observation matrix satisfies the restricted isometry property (RIP), and finite-time or fixed-time convergence to the optimal points is established via Lyapunov-based analysis. In addition, using the notions of finite-time stability (FTS) and fixed-time stability (FxTS), upper bounds on the convergence time of the proposed neurodynamic algorithms are given, and the convergence results for the MLCA and MSMC-LCA with fixed-time convergence are shown to be independent of the initial conditions. Finally, simulation experiments on signal recovery and image recovery demonstrate the superior performance of the proposed neurodynamic algorithms.
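For context, the l1 regularization problem underlying these algorithms is min_x (1/2)||Ax - b||_2^2 + lambda*||x||_1, and the baseline LCA drives an internal state through a soft-thresholding activation toward its minimizer. The sketch below (Python/NumPy) illustrates only this standard LCA with a forward-Euler discretization; the step size, regularization weight, and toy problem sizes are illustrative assumptions, and it does not implement the finite-/fixed-time MLCA or MSMC-LCA modifications proposed in the article.

```python
import numpy as np

def soft_threshold(u, lam):
    # Soft-thresholding activation: the proximal operator of lam * ||.||_1.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_l1(A, b, lam=0.05, tau=0.01, steps=3000):
    # Forward-Euler integration of the baseline LCA dynamics for
    #   min_x 0.5 * ||A x - b||_2^2 + lam * ||x||_1.
    # tau (step size) and steps are illustrative choices, not tuned values.
    n = A.shape[1]
    u = np.zeros(n)                  # internal (membrane) state
    G = A.T @ A - np.eye(n)          # lateral-inhibition matrix
    drive = A.T @ b                  # constant input drive
    for _ in range(steps):
        x = soft_threshold(u, lam)   # sparse output state
        u += tau * (drive - u - G @ x)
    return soft_threshold(u, lam)

# Toy recovery of a 5-sparse signal from 50 Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 50, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x_true
x_hat = lca_l1(A, b)
print("relative recovery error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The MLCA and MSMC-LCA variants described in the abstract modify the right-hand side of this state update so that convergence to the optimal point occurs in finite or fixed time rather than only asymptotically.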
Pages: 131-142
Page count: 12