Dynamic Changes of Population Size in Training of Artificial Neural Networks

Cited: 0
Authors: Slowik, A. [1]; Bialko, M. [1]
Affiliation: [1] Koszalin Univ Technol, Dept Elect & Comp Sci, Koszalin, Poland
Source: HUMAN-COMPUTER SYSTEMS INTERACTION: BACKGROUNDS AND APPLICATIONS | 2009, Vol. 60
Keywords: DIFFERENTIAL EVOLUTION ALGORITHM; GLOBAL OPTIMIZATION
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
In this paper, an adaptive differential evolution algorithm with dynamic changes of population size is presented. In the proposed algorithm, an adaptive selection of the algorithm's control parameters is introduced; due to this parameter selection, the algorithm gives better results than the differential evolution algorithm without this modification. Dynamic changes of population size are also introduced in the presented algorithm. This modification attempts to overcome limitations connected with premature convergence: due to the dynamic changes of population size, the algorithm can more easily escape from local minima. The proposed algorithm is used to train artificial neural networks. The results obtained are compared with those obtained using: an adaptive differential evolution algorithm without dynamic changes of population size, a method based on an evolutionary algorithm, the error back-propagation algorithm, and the Levenberg-Marquardt algorithm.
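The abstract describes a differential evolution (DE) variant whose population grows or shrinks during the run to counter premature convergence. The paper's exact resizing and parameter-adaptation rules are not given here, so the sketch below pairs a standard DE/rand/1/bin loop with one plausible heuristic (an assumption, not the authors' method): shrink the population while the best fitness keeps improving, and inject fresh random individuals when the search stagnates. `F` and `CR` are held fixed, whereas the paper adapts them.

```python
import random

def de_dynamic_pop(fitness, dim, bounds, pop_size=20, min_pop=8, max_pop=40,
                   generations=200, seed=0):
    """DE/rand/1/bin with a simple dynamic-population heuristic (illustrative
    only; the resizing rule is assumed, not taken from the paper)."""
    rng = random.Random(seed)
    lo, hi = bounds

    def new_ind():
        return [rng.uniform(lo, hi) for _ in range(dim)]

    pop = [new_ind() for _ in range(pop_size)]
    best_f = min(fitness(p) for p in pop)
    F, CR = 0.5, 0.9  # fixed here; the paper adapts these control parameters
    stall = 0
    for _ in range(generations):
        for i in range(len(pop)):
            # Mutation: three distinct individuals other than the target.
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = [a[j] + F * (b[j] - c[j])
                     if (rng.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            trial = [min(max(x, lo), hi) for x in trial]  # clamp to bounds
            if fitness(trial) <= fitness(pop[i]):  # greedy selection
                pop[i] = trial
        f = min(fitness(p) for p in pop)
        if f < best_f:
            best_f, stall = f, 0
            if len(pop) > min_pop:               # converging: drop the worst
                pop.remove(max(pop, key=fitness))
        else:
            stall += 1
            if stall >= 5 and len(pop) < max_pop:  # stagnating: add diversity
                pop.append(new_ind())
                stall = 0
    best = min(pop, key=fitness)
    return best, fitness(best)

# Example: minimise the 5-dimensional sphere function.
sol, val = de_dynamic_pop(lambda x: sum(v * v for v in x),
                          dim=5, bounds=(-5.0, 5.0))
```

When the `fitness` function is the training error of a neural network over the weight vector (as in the paper), the same loop trains the network; the resizing step is what lets the search recover diversity after getting trapped near a local minimum.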
Pages: 517 - 527 (11 pages)
Related Papers (50 total)
  • [1] The influence of relative sample size in training artificial neural networks
    Blamire, PA
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 1996, 17 (01) : 223 - 230
  • [2] Dynamic Sampling in Training Artificial Neural Networks with Overlapping Swarm Intelligence
    Qureshi, Shehzad
    Sheppard, John W.
    2016 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2016, : 440 - 446
  • [3] Training Optimization for Artificial Neural Networks
    Toribio Luna, Primitivo
    Alejo Eleuterio, Roberto
    Valdovinos Rosas, Rosa Maria
    Rodriguez Mendez, Benjamin Gonzalo
    CIENCIA ERGO-SUM, 2010, 17 (03) : 313 - 317
  • [4] Learning with Dynamic Architectures for Artificial Neural Networks - Adaptive Batch Size Approach
    Saeed, Reham
    Ghnemat, Rawan
    Benbrahim, Ghassen
    Elhassan, Ammar
    2019 2ND INTERNATIONAL CONFERENCE ON NEW TRENDS IN COMPUTING SCIENCES (ICTCS), 2019, : 302 - 305
  • [5] Adapting the Size of Artificial Neural Networks Using Dynamic Auto-Sizing
    Cahlik, Vojtech
    Kordik, Pavel
    Cepek, Miroslav
    2022 IEEE 17TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCES AND INFORMATION TECHNOLOGIES (CSIT), 2022, : 592 - 596
  • [6] A dynamic architecture for artificial neural networks
    Ghiassi, M
    Saidane, H
    NEUROCOMPUTING, 2005, 63 : 397 - 413
  • [7] On Speaker Adaptive Training of Artificial Neural Networks
    Trmal, Jan
    Zelinka, Jan
    Mueller, Ludek
    11TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2010 (INTERSPEECH 2010), VOLS 1-2, 2010, : 554 - 557
  • [8] Artificial Bee Colony Training of Neural Networks
    Bullinaria, John A.
    AlYahya, Khulood
    NATURE INSPIRED COOPERATIVE STRATEGIES FOR OPTIMIZATION (NICSO 2013), 2014, 512 : 191 - 201
  • [9] A STOCHASTIC TRAINING ALGORITHM FOR ARTIFICIAL NEURAL NETWORKS
    BARTLETT, EB
    NEUROCOMPUTING, 1994, 6 (01) : 31 - 43
  • [10] Sparse solution in training artificial neural networks
    Giustolisi, O
    NEUROCOMPUTING, 2004, 56 : 285 - 304