Convergence analysis of a segmentation algorithm for the evolutionary training of neural networks

Cited by: 0
Author(s): Hüning, H [1]
Affiliation: [1] Univ London Imperial Coll Sci Technol & Med, Dept Elect & Elect Engn, London SW7 2BT, England
DOI: 10.1109/ECNN.2000.886222
CLC classification: TP18 [Artificial intelligence theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
In contrast to standard genetic algorithms with generational reproduction, we adopt the viewpoint of the reactor algorithm (Dittrich & Banzhaf, 1998), which is similar to steady-state genetic algorithms, but without ranking. This permits an analysis similar to Eigen's (1971) molecular evolution model. From this viewpoint, we consider combining segments from different populations into one genotype at every time-step, which can be regarded as many-parent combination with fixed crossover points, and is comparable to cooperative evolution (Potter & De Jong, 2000). We present fixed-point analysis and phase portraits of the competitive dynamics, with the result that only the first-order (single-parent) replicators exhibit global optimisation. A segmentation algorithm is developed that theoretically ensures convergence to the global optimum while keeping the cooperative or reactor aspect for a better exploration of the search space. The algorithm creates different population islands for those cases of competition that otherwise cannot be solved correctly by the population dynamics. The population islands have different segmentation boundaries, which are generated by combining well-converged components into new segments. This gives first-order replicators that have the appropriate dynamical properties to compete with new solutions. Furthermore, the population islands communicate information about which solution strings have already been found, so that new ones can be favoured. Strings that have converged on any island are taken as the characterisation of a fitness peak and are disallowed afterwards to enforce diversity. This has been found to be successful in the simulation of an evolving neural network. We also present a simulation example where the genotypes on different islands have converged differently. A perspective from this research is to recombine building blocks explicitly from the different islands each time the populations converge.
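The fixed-point analysis mentioned in the abstract concerns competitive replicator dynamics. As a minimal sketch, assuming the standard first-order (single-parent) selection equation used in Eigen-style molecular evolution models (the paper's exact formulation may differ):

```latex
% Illustrative first-order replicator (selection-only) dynamics on the simplex;
% requires amsmath. This is an assumed standard form, not the paper's notation.
\[
\begin{aligned}
  \dot{x}_i &= x_i \bigl( f_i - \bar{f} \bigr), \qquad i = 1, \dots, n, \\
  \bar{f}   &= \sum_{j=1}^{n} f_j \, x_j .
\end{aligned}
\]
```

For distinct constant fitness values, the only asymptotically stable fixed point of this first-order system is the simplex corner belonging to the fittest type, which is the sense in which single-parent replicators perform global optimisation; higher-order (many-parent) interaction terms can instead stabilise mixed states that are not globally optimal, consistent with the abstract's conclusion.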
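As a rough illustration of the island mechanism described in the abstract (strings that have converged on one island are broadcast and disallowed elsewhere), the following Python sketch uses a toy two-peak fitness function and steady-state, fitness-proportional (first-order) replication. All names, the fitness function, and the replacement rule are illustrative assumptions, not the paper's algorithm.

```python
# Hypothetical sketch of diversity enforcement across population islands;
# the fitness function and update rule are toy stand-ins.
import random

GENOTYPE_LEN = 8

def fitness(genotype):
    # Toy fitness with two competing peaks (all zeros vs. all ones);
    # stands in for the neural-network evaluation used in the paper.
    ones = sum(genotype)
    return max(ones, GENOTYPE_LEN - ones)

def random_genotype():
    return tuple(random.randint(0, 1) for _ in range(GENOTYPE_LEN))

def mutate(genotype, rate=0.05):
    return tuple(1 - b if random.random() < rate else b for b in genotype)

def evolve_island(population, disallowed, steps=2000):
    """Steady-state, fitness-proportional (first-order) replication,
    loosely in the spirit of a reactor algorithm without ranking.
    Strings already reported by other islands ('disallowed') are rejected,
    pushing each island towards a different fitness peak."""
    pop = list(population)
    for _ in range(steps):
        weights = [fitness(g) for g in pop]
        parent = random.choices(pop, weights=weights, k=1)[0]
        child = mutate(parent)
        if child in disallowed:
            continue  # enforce diversity across islands
        # Steady-state replacement: overwrite a uniformly chosen individual.
        pop[random.randrange(len(pop))] = child
    return pop

if __name__ == "__main__":
    random.seed(0)
    disallowed = set()
    for island in range(2):
        pop = [random_genotype() for _ in range(30)]
        pop = evolve_island(pop, disallowed)
        best = max(pop, key=fitness)
        disallowed.add(best)  # characterise this peak and block it elsewhere
        print(f"island {island}: best {best} fitness {fitness(best)}")
```

With this toy fitness, the first island typically converges near one peak; because that string is then disallowed, the second island tends to be driven towards the opposite peak, mirroring the diversity-enforcement idea in the abstract.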
Pages: 70-81 (12 pages)