A pruning feedforward small-world neural network by dynamic sparse regularization with smoothing l1/2 norm for nonlinear system modeling

Cited by: 4
Authors
Li, Wenjing [1 ]
Chu, Minghui
Institution
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feedforward small-world neural network; Weight pruning; Dynamic regularization; Smoothing l1/2 norm; CONVERGENCE; APPROXIMATION; ALGORITHM
DOI
10.1016/j.asoc.2023.110133
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Aiming to reduce the relatively large architecture of the small-world neural network and to improve its generalization ability, we propose a pruning feedforward small-world neural network based on a dynamic regularization method with the smoothing l1/2 norm (PFSWNN-DSRL1/2) and apply it to nonlinear system modeling. A feedforward small-world neural network is first constructed by the Watts-Strogatz rewiring rule. By minimizing an error function augmented with a smoothing l1/2 norm penalty, redundant weights are pruned to generate a sparse architecture. A dynamic adjustment strategy for the regularization strength is further designed to balance the tradeoff between training accuracy and sparsity. Several experiments are carried out to evaluate the performance of the proposed PFSWNN-DSRL1/2 on nonlinear system modeling. The results show that the PFSWNN-DSRL1/2 achieves satisfactory modeling accuracy with an average of 17% of the weights pruned. The comparative results demonstrate that the generalization performance of the proposed model is improved by 8.1% relative to the baseline feedforward small-world neural network (FSWNN) while using a sparser structure, and that pruning does not degrade the network's small-world property. (c) 2023 Elsevier B.V. All rights reserved.
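The abstract describes three ingredients: a smoothed l1/2 penalty added to the error function, a dynamically adjusted regularization strength, and magnitude-based weight pruning. The paper's exact smoothing function and adjustment rule are not given in this record, so the sketch below is a generic illustration only: it uses the smooth surrogate (w² + ε)^(1/4) for |w|^(1/2), a simple ramp schedule for the regularization strength, and a toy quadratic error. All function names, constants, and the schedule are assumptions, not the authors' implementation.

```python
import numpy as np

def smooth_l12(w, eps=1e-4):
    """Smooth surrogate of the l1/2 norm: (w^2 + eps)^(1/4) ~ |w|^(1/2).

    Unlike the raw l1/2 norm, this is differentiable at w = 0, so
    gradient-based training remains stable (illustrative choice of smoothing).
    """
    return np.sum((w**2 + eps) ** 0.25)

def smooth_l12_grad(w, eps=1e-4):
    # d/dw (w^2 + eps)^(1/4) = 0.5 * w * (w^2 + eps)^(-3/4)
    return 0.5 * w * (w**2 + eps) ** (-0.75)

def dynamic_lambda(epoch, lam_max=1e-2, ramp_epochs=50):
    """Illustrative dynamic regularization strength: ramp lambda up over
    training so early epochs favor accuracy and later epochs favor sparsity."""
    return lam_max * min(1.0, epoch / ramp_epochs)

# Toy demonstration on a quadratic error E(w) = 0.5 * ||w - target||^2:
# the penalty drives the small target components toward zero, after which
# magnitude-threshold pruning removes them and keeps the large ones.
target = np.array([1.0, 0.02, -0.8, 0.01])
w = np.zeros_like(target)
for epoch in range(200):
    lam = dynamic_lambda(epoch)
    grad = (w - target) + lam * smooth_l12_grad(w)  # error grad + penalty grad
    w -= 0.1 * grad

pruned = np.where(np.abs(w) < 0.05, 0.0, w)  # prune weights driven near zero
```

The ramp schedule stands in for the paper's dynamic adjustment: a fixed large strength from the start would bias all weights toward zero and hurt accuracy, while ramping lets the network fit first and sparsify second.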
Pages: 14
Related Papers
39 records
  • [21] A Smoothing Inertial Neural Network for Sparse Signal Reconstruction with Noise Measurements via Lp-L1 Minimization
    Jiang, Lieqin
    He, Xing
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2022, 41 (11) : 6295 - 6313
  • [22] RECURRENT NEURAL NETWORK WITH L1/2 REGULARIZATION FOR REGRESSION AND MULTICLASS CLASSIFICATION PROBLEMS
    Li, Lin
    Fan, Qinwei
    Zhou, Li
    JOURNAL OF NONLINEAR FUNCTIONAL ANALYSIS, 2022, 2022
  • [24] A novel method for financial distress prediction based on sparse neural networks with L1/2 regularization
    Chen, Ying
    Guo, Jifeng
    Huang, Junqin
    Lin, Bin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2022, 13 (07) : 2089 - 2103
  • [25] Double parallel feedforward neural network based on extreme learning machine with L1/2 regularizer
    Khan, Atlas
    Yang, Jie
    Wu, Wei
    NEUROCOMPUTING, 2014, 128 : 113 - 118
  • [26] Molecular Sparse Representation by a 3D Ellipsoid Radial Basis Function Neural Network via L1 Regularization
    Gui, Sheng
    Chen, Zhaodi
    Lu, Benzhuo
    Chen, Minxin
    JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2020, 60 (12) : 6054 - 6064
  • [27] Blade dynamic strain non-intrusive measurement using L1/2-norm regularization and transmissibility
    Ao, Chunyan
    Qiao, Baijie
    Chen, Lei
    Xu, Jinghui
    Liu, Meiru
    Chen, Xuefeng
    MEASUREMENT, 2022, 190
  • [28] Method for Sparse Representation of Complex Data Based on Overcomplete Basis, l1 Norm, and Neural MFNN-like Network
    Panokin, Nikolay V.
    Averin, Artem V.
    Kostin, Ivan A.
    Karlovskiy, Alexander V.
    Orelkina, Daria I.
    Nalivaiko, Anton Yu.
    APPLIED SCIENCES-BASEL, 2024, 14 (05):
  • [29] Neural Network Optimization Algorithm Model Combining L1/2 Regularization and Extreme Learning Machine
    Qi, Anzhi
    PROCEEDINGS OF THE 2018 3RD INTERNATIONAL WORKSHOP ON MATERIALS ENGINEERING AND COMPUTER SCIENCES (IWMECS 2018), 2018, 78 : 106 - 109
  • [30] A new Sigma-Pi-Sigma neural network based on L1 and L2 regularization and applications
    Jiao, Jianwei
    Su, Keqin
    AIMS MATHEMATICS, 2024, 9 (03): : 5995 - 6012