A Gradient-Based Particle-Bat Algorithm for Stochastic Configuration Network

Cited by: 4
Authors
Liu, Jingjing [1 ,2 ]
Liu, Yefeng [2 ]
Zhang, Qichun [3 ]
Affiliations
[1] Shenyang Inst Technol, Dept Basic Courses, Shenfu Demonstrat Area, Shenyang 113122, Peoples R China
[2] Shenyang Inst Technol, Liaoning Key Lab Informat Phys Fus & Intelligent M, Shenfu Demonstrat Area, Shenyang 113122, Peoples R China
[3] Univ Bradford, Dept Comp Sci, Bradford BD7 1DP, England
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 05
Funding
National Science Foundation (USA);
Keywords
bat algorithm; gradient; PSO algorithm; stochastic configuration networks; EXCITATION-CURRENT; APPROXIMATION;
DOI
10.3390/app13052878
CLC number (Chinese Library Classification)
O6 [Chemistry];
Subject classification code
0703;
Abstract
The stochastic configuration network (SCN) is a mathematical model built by incremental generation under a supervisory mechanism; it possesses the universal approximation property and offers advantages in data modeling. However, the efficiency of an SCN is affected by several network parameters. This paper proposes an optimized search algorithm for the input weights and biases. First, a constrained optimization model is established based on the convergence theory and the inequality-based supervisory mechanism of the SCN. Then, a hybrid bat-particle swarm optimization algorithm based on gradient information (G-BAPSO) is proposed within the PSO framework; it mainly uses gradient information together with a local adaptive adjustment mechanism characterized by the pulse emission frequency to improve the search ability. The algorithm optimizes the input weights and biases to improve the convergence rate of the network. Simulation results on several datasets demonstrate the feasibility and validity of the proposed algorithm. The training RMSE of G-BAPSO-SCN increased by 5.57 × 10^-5 and 3.2 × 10^-3 compared with that of SCN in the two regression experiments, and the recognition accuracy of G-BAPSO-SCN increased by 0.07% on average in the classification experiments.
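To make the abstract's description concrete, the Python sketch below illustrates the general idea rather than the authors' implementation: a swarm of candidate (w, b) pairs for one hidden node is updated with PSO-style velocities modulated by a bat-style pulse frequency and nudged by gradient information, and the fitness is a common form of the SCN supervisory value from the SCN literature (Wang and Li, 2017). The function names, hyper-parameters, sigmoid activation, the way the pulse frequency scales the velocity, and the finite-difference gradient are all illustrative assumptions, not details taken from the paper.

# Minimal, illustrative sketch of a gradient-nudged bat/PSO search for one SCN
# candidate node's input weights w and bias b. Assumptions are noted in comments.
import numpy as np

rng = np.random.default_rng(0)

def xi_fitness(params, X, e, r=0.99):
    """Supervisory value of one candidate node: larger is better, xi > 0 is admissible
    (a common form from the SCN literature; assumed here as the search fitness)."""
    w, b = params[:-1], params[-1]
    g = 1.0 / (1.0 + np.exp(-(X @ w + b)))            # sigmoid activation (assumed)
    return (e @ g) ** 2 / (g @ g) - (1.0 - r) * (e @ e)

def num_grad(f, p, h=1e-5):
    """Central-difference gradient, used in place of any analytic gradient."""
    grad = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p); d[i] = h
        grad[i] = (f(p + d) - f(p - d)) / (2 * h)
    return grad

def g_bapso(X, e, n_particles=20, iters=50, lam=0.25, w_in=0.7, c1=1.5, c2=1.5):
    """Hybrid PSO/bat search with a gradient-ascent nudge (illustrative hyper-parameters)."""
    dim = X.shape[1] + 1                               # input weights plus one bias
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    fit = lambda p: xi_fitness(p, X, e)
    pbest = pos.copy(); pbest_f = np.array([fit(p) for p in pos])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        freq = rng.uniform(0, 1, (n_particles, 1))     # bat-style pulse frequency per particle
        for i in range(n_particles):
            # PSO velocity update scaled by the pulse frequency, plus a gradient-ascent nudge
            vel[i] = (w_in * vel[i]
                      + c1 * rng.random() * (pbest[i] - pos[i])
                      + c2 * rng.random() * (gbest - pos[i])) * freq[i]
            pos[i] = pos[i] + vel[i] + lam * num_grad(fit, pos[i])
            f = fit(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i].copy(), f
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest[:-1], gbest[-1]                       # optimised input weights and bias

# Toy usage: first node of an SCN, where the residual e is initially the target itself.
X = rng.uniform(-1, 1, (100, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
w_opt, b_opt = g_bapso(X, y)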
Pages: 17