Smoothing L1 regularization for stochastic configuration networks

Cited by: 0
Authors
Liu J.-J. [1 ,2 ]
Liu Y.-F. [2 ]
Ma Y.-H. [3 ]
Fu Y. [4 ]
Affiliations
[1] Department of Basic Courses, Shenyang Institute of Technology, Shenfu New District
[2] Liaoning Key Laboratory of Information Physics Fusion and Intelligent Manufacturing for CNC Machine, Shenyang Institute of Technology, Shenfu New District
[3] School of Mechanical Engineering and Automation, Shenyang Institute of Technology, Shenfu New District
[4] State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, Shenyang
Source
Kongzhi yu Juece/Control and Decision | 2024, Vol. 39, No. 03
Keywords
alternating direction method of multipliers; convergence analysis; data feature; generalization capability; smoothing regularization; stochastic configuration networks;
DOI
10.13195/j.kzyjc.2022.1612
Abstract
To improve the generalization capability of stochastic configuration networks (SCNs), a smoothed L1 regularization method for SCNs is proposed. To address the local non-differentiability of the L1 regularization operator, the penalty curve is smoothed in a neighborhood of its non-smooth points. On this basis, a convex error function for the SCN is constructed, and an algorithm for incrementally computing the network weights is proposed. Furthermore, a global optimization algorithm based on the alternating direction method of multipliers (ADMM) is proposed, and its convergence is analyzed theoretically. In contrast to the sparsity induced by L1 regularization and the uniform shrinkage of parameters under L2 regularization, the proposed method retains all data features according to their importance: the parameters are not only kept within a small range but also follow a hierarchical distribution, giving the network better generalization ability. Finally, the feasibility and effectiveness of the proposed method are verified by numerical simulations. © 2024 Northeast University. All rights reserved.
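The abstract's key idea is to replace the L1 penalty, which is non-differentiable at zero, with a function that is smooth in a neighborhood of that point. As an illustrative sketch only (the paper's exact smoothing formula is not given in this record), a common Huber-style construction replaces |w| by a quadratic branch on |w| ≤ ε and leaves it linear elsewhere, which makes the penalty continuously differentiable and hence usable in gradient-based or ADMM-type solvers:

```python
import numpy as np

def smoothed_l1(w, eps=1e-3):
    """Huber-style smoothing of |w| (an assumed construction, not
    necessarily the paper's exact formula): quadratic for |w| <= eps,
    exactly |w| outside. The two branches meet with matching value
    and slope at |w| = eps, so the penalty is C^1 everywhere."""
    w = np.asarray(w, dtype=float)
    quad = w ** 2 / (2 * eps) + eps / 2  # smooth branch near 0
    return np.where(np.abs(w) <= eps, quad, np.abs(w))

def smoothed_l1_grad(w, eps=1e-3):
    """Gradient of the smoothed penalty; continuous everywhere,
    unlike the subgradient of |w|, which jumps at 0."""
    w = np.asarray(w, dtype=float)
    return np.where(np.abs(w) <= eps, w / eps, np.sign(w))
```

At the boundary |w| = ε the quadratic branch evaluates to ε²/(2ε) + ε/2 = ε, matching |w| exactly, and its derivative is 1, matching sign(w), which is what makes the smoothed penalty differentiable at the junction.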
Pages: 813-818
Number of pages: 5