Optimization of back-propagation network using simulated annealing approach

Cited by: 6
Authors
Chen, S. -C. [1 ]
Lin, S. -W. [2 ]
Tseng, T. -Y. [2 ]
Lin, H. -C. [2 ]
Affiliations
[1] Natl Taiwan Univ Sci & Technol, Taipei, Taiwan
[2] Huafan Univ, Dept Informat Management, New Taipei, Taiwan
Keywords
DOI
10.1109/ICSMC.2006.385301
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Subject classification code
0812;
Abstract
The back-propagation network (BPN) is a popular data mining technique. Nevertheless, different problems may require different network architectures and parameters, which are usually determined by rules of thumb or trial and error; such methods can lead to poor network architectures and parameters. Moreover, a dataset may contain many features, and not all of them are beneficial for classification with a BPN. Therefore, a simulated annealing (SA) approach is proposed to select a beneficial subset of features and to obtain better network architectures and parameters, which results in better classification. To verify the developed approach, three datasets, namely PIMA, IONOS, and CANCER from the UCI (University of California, Irvine) machine learning repository, are employed for evaluation, and 10-fold cross-validation is applied to compute the classification results. Compared with the MONNA (multiple ordinate neural network architecture) structure developed by Lezoray and Cardot, the classification accuracy rates of the developed approach are superior. When feature selection is taken into consideration, the classification accuracy rates on the three datasets increase further. Therefore, the developed approach can be used to determine the network architecture and parameters of a BPN and to discover useful attributes effectively.
Pages: 2819+
Number of pages: 2
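
As a concrete illustration of the approach summarized in the abstract, the sketch below shows one way a simulated annealing loop can jointly select a feature subset and tune the architecture and learning rate of a back-propagation network, scoring each candidate by 10-fold cross-validation accuracy. It is a minimal sketch, not the authors' implementation: scikit-learn's MLPClassifier stands in for the BPN, the Wisconsin breast cancer data stands in for the UCI CANCER dataset, and the neighbourhood moves, cooling schedule, and parameter ranges are illustrative assumptions.

# Minimal SA sketch for joint feature selection and BPN parameter tuning.
# Assumptions (not from the paper): MLPClassifier as the BPN, breast cancer
# data as a stand-in dataset, illustrative moves and cooling schedule.
import math
import random

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]
rng = random.Random(0)


def evaluate(mask, hidden, lr):
    """10-fold CV accuracy of a BPN trained on the selected feature subset."""
    if not any(mask):
        return 0.0
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                      max_iter=500, random_state=0),
    )
    cols = [i for i, keep in enumerate(mask) if keep]
    return cross_val_score(model, X[:, cols], y, cv=10).mean()


def neighbour(mask, hidden, lr):
    """Perturb one of: the feature mask, the hidden-layer size, the learning rate."""
    mask = list(mask)
    move = rng.choice(["flip", "hidden", "lr"])
    if move == "flip":
        i = rng.randrange(n_features)
        mask[i] = not mask[i]
    elif move == "hidden":
        hidden = max(2, hidden + rng.choice([-2, -1, 1, 2]))
    else:
        lr = min(0.5, max(1e-4, lr * rng.choice([0.5, 2.0])))
    return mask, hidden, lr


# Initial solution: all features, a small hidden layer, a moderate learning rate.
state = ([True] * n_features, 10, 0.01)
score = evaluate(*state)
best_state, best_score = state, score

T, cooling = 1.0, 0.90          # initial temperature and geometric cooling rate
for _ in range(50):             # iteration budget kept tiny for the sketch
    candidate = neighbour(*state)
    cand_score = evaluate(*candidate)
    # Always accept improvements; accept worse moves with Boltzmann probability.
    if cand_score >= score or rng.random() < math.exp((cand_score - score) / T):
        state, score = candidate, cand_score
        if score > best_score:
            best_state, best_score = state, score
    T *= cooling

mask, hidden, lr = best_state
print(f"best 10-fold CV accuracy: {best_score:.3f}")
print(f"selected {sum(mask)}/{n_features} features, hidden={hidden}, lr={lr}")

The acceptance rule keeps every improving move and occasionally accepts a worse one with probability exp(delta/T), which is what lets SA escape local optima that a greedy search over feature subsets and architectures would get stuck in; a practical run would use a much larger iteration budget and a slower cooling schedule than this toy example.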