Stochastic configuration networks with group lasso regularization

Cited by: 0
Authors
Wang, Yang [1 ]
Yang, Guanci [2 ]
Zhang, Chenglong [3 ]
Wu, Yongming [1 ]
Affiliations
[1] Guizhou Univ, State Key Lab Publ Big Data, Guiyang 550025, Guizhou, Peoples R China
[2] Guizhou Univ, Key Lab Adv Mfg Technol, Minist Educ, Guiyang 550025, Guizhou, Peoples R China
[3] Chinese Univ Hong Kong Shenzhen, Sch Data Sci, Shenzhen 518172, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Stochastic configuration networks; Block incremental method; Moore-Penrose generalized inverse; Group lasso; Hidden node pruning;
DOI
10.1016/j.ins.2024.120860
CLC classification
TP [Automation technology; computer technology];
Discipline code
0812
Abstract
Stochastic configuration networks (SCNs) construct randomized learner models incrementally in a node-by-node manner under the guidance of a supervisory mechanism. Block-incremental SCNs (BSCN) extend the original SCNs with block increments to effectively reduce the number of iterations required during model building. Yet, two new issues emerge: the computationally expensive Moore-Penrose generalized inverse in the inequality constraints, and potentially redundant hidden nodes. To address these limitations, this study presents efficient block-incremental SCNs (EBSCN) with group lasso regularization, termed EBSCN-GL. The hidden block is treated as a specialized form of hidden node, and the output vector is directly replaced with the output matrix in the output-weights formula of SC-I (the first algorithmic implementation of SCNs) to evaluate the output weights of the newly added hidden block. Subsequently, a new set of inequalities without the matrix generalized inverse is presented to ensure the universal approximation capability of EBSCN. Moreover, group lasso regularization is introduced to prune redundant nodes from the hidden layer. We further transform its regularized least-squares solution into an efficient form with proven convergence, based on the Woodbury matrix identity. Empirical results on function approximation, benchmark classification, and a practical industrial application verify the efficiency and sparsity of the proposed method.
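The abstract's two key ingredients can be sketched numerically: updating a regularized least-squares (ridge) solution when a block of hidden nodes is appended, via a Woodbury-style bordered-matrix inverse, and pruning whole hidden nodes with a group-lasso proximal step. All variable names, sizes, and the specific update variant below are illustrative assumptions for exposition, not the paper's exact EBSCN-GL formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n samples, current hidden-output matrix H (n x L),
# newly added hidden block G (n x k), targets T. Sizes are arbitrary.
n, L, k, lam = 200, 30, 5, 1e-2
H = rng.standard_normal((n, L))
G = rng.standard_normal((n, k))
T = rng.standard_normal((n, 1))

# Ridge inverse for the current layer: (H^T H + lam*I)^{-1}.
A_inv = np.linalg.inv(H.T @ H + lam * np.eye(L))

# Appending block G borders the Gram matrix with k rows/columns.
# A block (Woodbury-style) inversion reuses A_inv instead of
# re-inverting the enlarged (L+k) x (L+k) matrix from scratch.
B = H.T @ G                                  # (L x k) cross term
D = G.T @ G + lam * np.eye(k)                # (k x k) new diagonal block
S_inv = np.linalg.inv(D - B.T @ A_inv @ B)   # inverse Schur complement
top_left = A_inv + A_inv @ B @ S_inv @ B.T @ A_inv
top_right = -A_inv @ B @ S_inv
A_new_inv = np.block([[top_left, top_right],
                      [top_right.T, S_inv]])

# Sanity check against the direct inverse of the enlarged Gram matrix.
H_full = np.hstack([H, G])
direct = np.linalg.inv(H_full.T @ H_full + lam * np.eye(L + k))
assert np.allclose(A_new_inv, direct)

# Group-lasso proximal step: each hidden node's outgoing weight row is
# shrunk as a group; groups driven exactly to zero can be pruned.
def group_soft_threshold(W, tau):
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return W * scale

W = A_new_inv @ H_full.T @ T                 # ridge output weights
W_sparse = group_soft_threshold(W, tau=0.05)
pruned = np.where(np.linalg.norm(W_sparse, axis=1) == 0)[0]
```

The bordered inverse costs O(L^2 k + k^3) per block rather than O((L+k)^3), which is the kind of saving the abstract attributes to the Woodbury identity; the row-wise soft threshold is the standard proximal operator of the group-lasso penalty.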
Pages: 18