Optimization of Neural Networks Weights and Architecture: A multimodal methodology

Cited by: 4
Authors: Zarth, Antonio Miguel F. [1]; Ludermir, Teresa B. [1]
Affiliation: [1] Univ Fed Pernambuco, Ctr Informat, BR-50740540 Recife, PE, Brazil
DOI: 10.1109/ISDA.2009.90
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
This paper describes a multimodal methodology for the evolutionary optimization of neural networks. In this approach, we use Differential Evolution with parallel subpopulations to simultaneously train a neural network and find an efficient architecture. The results on three classification problems show that the neural network produced by this method has low complexity and high generalization capability compared with other methods in the literature. Furthermore, two regularization techniques, weight decay and weight elimination, are investigated and results are presented.
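The DE-based weight search described in the abstract can be sketched roughly as follows. This is a minimal single-population DE/rand/1/bin sketch, not the paper's method: the parallel subpopulations and architecture evolution are omitted, the network size is fixed, and the toy dataset, DE constants, and helper names are all illustrative assumptions. A weight-decay penalty is included as an optional term, since the paper investigates that form of regularization.

```python
import math
import random

random.seed(0)

# Toy linearly separable data (illustrative): label 1 iff x0 + x1 > 1.
POINTS = [(random.random(), random.random()) for _ in range(60)]
DATA = [((x0, x1), 1.0 if x0 + x1 > 1 else 0.0) for x0, x1 in POINTS]

H = 3                      # hidden units (fixed here; the paper evolves architecture too)
DIM = 2 * H + H + H + 1    # input weights + hidden biases + output weights + output bias

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))

def forward(w, x):
    """One-hidden-layer MLP with sigmoid activations, weights flattened into w."""
    out = w[-1]  # output bias
    for h in range(H):
        a = sigmoid(w[2 * h] * x[0] + w[2 * h + 1] * x[1] + w[2 * H + h])
        out += w[3 * H + h] * a
    return sigmoid(out)

def loss(w, lam=0.0):
    """MSE plus an optional weight-decay penalty lam * sum(w_i^2).
    (The paper also studies weight elimination, sum w^2 / (w0^2 + w^2), not shown.)"""
    mse = sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)
    return mse + lam * sum(wi * wi for wi in w)

def de_train(pop_size=20, F=0.5, CR=0.9, generations=100):
    """DE/rand/1/bin over flat weight vectors with greedy selection."""
    pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(DIM)  # guarantee at least one mutated gene
            trial = [a[k] + F * (b[k] - c[k])
                     if (random.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(DIM)]
            if loss(trial) <= loss(pop[i]):  # keep the trial only if it is no worse
                pop[i] = trial
    return min(pop, key=loss)

best = de_train()
acc = sum((forward(best, x) > 0.5) == (y > 0.5) for x, y in DATA) / len(DATA)
print(f"final loss={loss(best):.4f}  accuracy={acc:.2f}")
```

Because DE needs only fitness values, the same loop applies unchanged when the penalty term or even the architecture encoding changes, which is what makes it convenient for this kind of combined weight-and-structure search.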
Pages: 209-214 (6 pages)