Convergence analysis of a segmentation algorithm for the evolutionary training of neural networks

Cited: 0
Author
Hüning, H [1 ]
Affiliation
[1] Univ London Imperial Coll Sci Technol & Med, Dept Elect & Elect Engn, London SW7 2BT, England
Keywords
DOI
10.1109/ECNN.2000.886222
CLC Classification Number
TP18 [Theory of artificial intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
In contrast to standard genetic algorithms with generational reproduction, we adopt the viewpoint of the reactor algorithm (Dittrich & Banzhaf, 1998), which is similar to steady-state genetic algorithms, but without ranking. This permits an analysis similar to Eigen's (1971) molecular evolution model. From this viewpoint, we consider combining segments from different populations into one genotype at every time-step, which can be regarded as many-parent combinations with fixed crossover points, and is comparable to cooperative evolution (Potter & De Jong, 2000). We present fixed-point analysis and phase portraits of the competitive dynamics, with the result that only the first-order (single-parent) replicators exhibit global optimisation. A segmentation algorithm is developed that theoretically ensures convergence to the global optimum while keeping the cooperative or reactor aspect for a better exploration of the search space. The algorithm creates different population islands for those cases of competition that otherwise cannot be solved correctly by the population dynamics. The population islands have different segmentation boundaries, which are generated by combining well-converged components into new segments. This gives first-order replicators that have the appropriate dynamical properties to compete with new solutions. Furthermore, the population islands communicate information about which solution strings have already been found, so new ones can be favoured. Strings that have converged on any island are taken as the characterisation of a fitness peak and disallowed afterwards to enforce diversity. This has been found to be successful in the simulation of an evolving neural network. We also present a simulation example where the genotypes on different islands have converged differently. A perspective from this research is to recombine building blocks explicitly from the different islands, each time the populations converge.
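The abstract's claim that only first-order (single-parent) replicators exhibit global optimisation can be illustrated with the standard replicator equation from Eigen-style molecular evolution models. The following is a minimal sketch, not the paper's code; the fitness values and step sizes are illustrative assumptions.

```python
import numpy as np

def replicate(x, f, dt=0.01, steps=5000):
    """Evolve population shares x under the first-order replicator
    dynamics dx_i/dt = x_i * (f_i - mean fitness), via Euler steps."""
    x = np.asarray(x, dtype=float)
    f = np.asarray(f, dtype=float)
    for _ in range(steps):
        avg = x @ f                     # population-average fitness
        x = x + dt * x * (f - avg)      # replicator update
        x = np.clip(x, 0.0, None)
        x /= x.sum()                    # renormalise onto the simplex
    return x

# Four competing genotypes with assumed fitnesses; the fittest
# (index 3) comes to dominate the population.
shares = replicate([0.25, 0.25, 0.25, 0.25], [1.0, 1.2, 0.9, 1.5])
print(shares.argmax())  # -> 3
```

Under these dynamics the globally fittest replicator always wins the competition, which is the dynamical property the paper's segmentation algorithm aims to preserve when it regroups well-converged components into new first-order segments.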
Pages: 70-81
Page count: 12