An optimization methodology for neural network weights and architectures

Cited by: 101
Authors
Ludermir, Teresa B. [1 ]
Yamazaki, Akio [1 ]
Zanchettin, Cleber [1 ]
Affiliations
[1] Univ Fed Pernambuco, Ctr Informat, BR-50740540 Recife, Brazil
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2006, Vol. 17, No. 6
Keywords
multilayer perceptron (MLP); optimization of weights and architectures; simulated annealing; tabu search
DOI
10.1109/TNN.2006.881047
CLC Number (Chinese Library Classification)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper introduces a methodology for neural network global optimization. The aim is the simultaneous optimization of multilayer perceptron (MLP) network weights and architectures, in order to generate topologies with few connections and high classification performance for any data set. The approach combines the advantages of simulated annealing, tabu search, and the backpropagation training algorithm to yield an automatic process for producing networks with high classification performance and low complexity. Experimental results obtained on four classification problems and one prediction problem are better than those obtained by the most commonly used optimization techniques.
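The paper's exact algorithm (cooling schedule, tabu acceptance criteria, final backpropagation fine-tuning) is given in the full text; as a rough illustration only, the sketch below shows how simulated annealing and a tabu list can jointly search over MLP architectures (connection masks) and weights. All function names, parameters, and the cost formula here are hypothetical, not taken from the paper, and the backpropagation fine-tuning stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(X, mask, W1, b1, W2, b2):
    # One-hidden-layer MLP; the binary mask zeroes out pruned input-to-hidden connections.
    H = np.tanh(X @ (W1 * mask) + b1)
    return H @ W2 + b2

def cost(X, y, mask, W1, b1, W2, b2, penalty=0.01):
    # Classification error plus a penalty on the fraction of active connections,
    # so the search favors both accuracy and low complexity.
    pred = forward(X, mask, W1, b1, W2, b2).argmax(axis=1)
    return np.mean(pred != y) + penalty * mask.mean()

def perturb(state, scale=0.1):
    mask, W1, b1, W2, b2 = state
    mask = mask.copy()
    # Architecture move: flip one random connection on/off ...
    i, j = rng.integers(mask.shape[0]), rng.integers(mask.shape[1])
    mask[i, j] = 1 - mask[i, j]
    # ... plus a weight move: jitter the hidden-layer weights.
    return (mask, W1 + scale * rng.standard_normal(W1.shape), b1, W2, b2)

def anneal(X, y, n_hidden=4, iters=300, T0=1.0):
    n_in, n_out = X.shape[1], int(y.max()) + 1
    state = (np.ones((n_in, n_hidden), dtype=int),          # start fully connected
             rng.standard_normal((n_in, n_hidden)),
             np.zeros(n_hidden),
             rng.standard_normal((n_hidden, n_out)),
             np.zeros(n_out))
    best, best_c = state, cost(X, y, *state)
    c = best_c
    tabu = set()  # signatures of recently visited topologies
    for k in range(iters):
        T = T0 * 0.99 ** k          # geometric cooling schedule
        cand = perturb(state)
        sig = cand[0].tobytes()
        if sig in tabu:
            continue                 # tabu search: skip recently visited topologies
        cand_c = cost(X, y, *cand)
        # Simulated annealing acceptance rule: always take improvements,
        # sometimes take worse moves depending on the temperature.
        if cand_c < c or rng.random() < np.exp((c - cand_c) / max(T, 1e-9)):
            state, c = cand, cand_c
            tabu.add(sig)
            if len(tabu) > 50:       # bounded tabu memory
                tabu.pop()
            if c < best_c:
                best, best_c = state, c
    return best, best_c
```

In this sketch the tabu list stores architecture signatures only, so weight moves are never tabu; the paper's actual scheme should be consulted for how the two mechanisms are combined.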
Pages: 1452 - 1459
Page count: 8
Related Papers
50 records in total
  • [1] Hybrid optimization algorithm for the definition of MLP neural network architectures and weights
    Lins, APS
    Ludermir, TB
    [J]. HIS 2005: 5th International Conference on Hybrid Intelligent Systems, Proceedings, 2005, : 149 - 154
  • [2] Optimization of neural network weights and architectures for odor recognition using simulated annealing
    Yamazaki, A
    de Souto, MCP
    Ludermir, TB
    [J]. PROCEEDINGS OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 547 - 552
  • [3] Frankenstein PSO Applied to Neural Network Weights and Architectures
    de Lima, Natalia Flora
    Ludermir, Teresa Bernarda
    [J]. 2011 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2011, : 2452 - 2456
  • [4] Global optimization of neural network weights
    Hamm, L
    Brorsen, BW
    Hagan, MT
    [J]. PROCEEDINGS OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 1228 - 1233
  • [5] Optimization of Neural Networks Weights and Architecture: A multimodal methodology
    Zarth, Antonio Miguel F.
    Ludermir, Teresa B.
    [J]. 2009 9TH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, 2009, : 209 - 214
  • [6] Genetic optimization of ART neural network Architectures
    Kaylani, A.
    Al-Daraiseh, A.
    Georgiopoulos, M.
    Mollaghasemi, M.
    Anagnostopoulos, G. C.
    Wu, A. S.
    [J]. 2007 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2007, : 379 - +
  • [7] A scalable algorithm for the optimization of neural network architectures
    Pasini, Massimiliano Lupo
    Yin, Junqi
    Li, Ying Wai
    Eisenbach, Markus
    [J]. PARALLEL COMPUTING, 2021, 104
  • [8] Genetic optimization of art neural network architectures
    Kaylani, Assem
    Georgiopoulos, Michael
    Mollaghasemi, Mansooreh
    Anagnostopoulos, Georgios
    [J]. PROCEEDINGS OF THE 11TH IASTED INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, 2007, : 225 - 230
  • [9] Simultaneous optimization of weights and structure of an RBF neural network
    Lefort, Virginie
    Knibbe, Carole
    Beslon, Guillaume
    Favrel, Joel
    [J]. ARTIFICIAL EVOLUTION, 2006, 3871 : 49 - 60
  • [10] Particle Swarm Optimization to Obtain Weights in Neural Network
    Warsito, Budi
    Yasin, Hasbi
    Prahutama, Alan
    [J]. MATEMATIKA, 2019, 35 (03) : 345 - 355