Research Progress on Stochastic Configuration Network

Cited: 0
Authors
Zhang C.-L. [1 ]
Ding S.-F. [1 ,2 ]
Guo L.-L. [1 ,2 ]
Zhang J. [1 ,2 ]
Institutions
[1] School of Computer Science and Technology, China University of Mining and Technology, Xuzhou
[2] Engineering Research Center of Mine Digitization, China University of Mining and Technology, Ministry of Education, Xuzhou
Source
Ruan Jian Xue Bao/Journal of Software | 2024, Vol. 35, No. 05
Keywords
deep learning; neural network; randomized learning; research progress; stochastic configuration network (SCN)
DOI
10.13328/j.cnki.jos.006804
Abstract
Stochastic configuration network (SCN), an emerging incremental neural network model, differs from other randomized neural network methods in that it configures the parameters of hidden-layer nodes through a supervisory mechanism, thereby guaranteeing fast convergence. Owing to its high learning efficiency, minimal human intervention, and strong generalization ability, SCN has attracted considerable attention from scholars in China and abroad and has developed rapidly since it was proposed in 2017. This study surveys SCN research in terms of its basic theory, typical algorithmic variants, application fields, and future research directions. Firstly, the algorithmic principle, universal approximation property, and advantages of SCN are analyzed theoretically. Secondly, typical variants of SCN are reviewed, including DeepSCN, 2DSCN, Robust SCN, Ensemble SCN, Distributed SCN, Parallel SCN, and Regularized SCN. Then, applications of SCN in different fields are introduced, including hardware implementation, computer vision, medical data analysis, fault detection and diagnosis, and system modeling and prediction. Finally, the development potential of SCN in convolutional neural network architectures, semi-supervised learning, unsupervised learning, multi-view learning, fuzzy neural networks, and recurrent neural networks is pointed out. © 2024 Chinese Academy of Sciences. All rights reserved.
Pages: 2379-2399 (20 pages)