An Extreme Learning Machine Based Pretraining Method for Multi-Layer Neural Networks

Cited by: 0
Authors
Noinongyao, Pavit [1 ]
Watchareeruetai, Ukrit [1 ]
Affiliations
[1] King Mongkut's Institute of Technology Ladkrabang, International College, Chalongkrung Rd, Bangkok 10520, Thailand
Keywords
extreme learning machine; pretraining; autoencoder; backward extreme learning machine; representations; algorithm
DOI
10.1109/SCIS-ISIS.2018.00105
Chinese Library Classification (CLC) number
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
One approach to training a deep neural network effectively is to pretrain each layer in an unsupervised manner and then fine-tune the whole network. A common way to do this is to train unsupervised neural network models, such as restricted Boltzmann machines or autoencoders, and stack them on top of one another. Although these unsupervised pretraining approaches yield good performance, they rely on back-propagation and its iterative learning process, so they still suffer from long pretraining times. The extreme learning machine (ELM) is an analytical training approach that is extremely fast and produces solutions with good generalization performance. In this paper, we apply a new ELM-based unsupervised learning method, named the backward ELM based autoencoder (BELM-AE), to pretrain each layer of a neural network before a back-propagation-based learning algorithm is used to fine-tune the whole network. Experimental results show that the new pretraining method requires significantly less training time and also yields better testing performance on various datasets.
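The abstract describes greedy layer-wise pretraining with an ELM-based autoencoder, followed by back-propagation fine-tuning of the whole network. The sketch below illustrates the general mechanism with a plain ELM autoencoder in NumPy; it is not the paper's BELM-AE procedure, and the function names, the tanh activation, and the ridge regularization parameter reg are assumptions made for illustration only.

```python
import numpy as np

def elm_autoencoder_layer(X, n_hidden, reg=1e-3, seed=None):
    """Pretrain one layer with a generic ELM autoencoder (illustrative sketch).

    Random, fixed input weights map X to a hidden representation H; the
    output weights beta are solved in closed form so that H @ beta
    reconstructs X. beta.T is then reused as this layer's encoding weights.
    """
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.standard_normal((n_in, n_hidden))      # random, fixed input weights
    b = rng.standard_normal(n_hidden)              # random, fixed biases
    H = np.tanh(X @ W + b)                         # random feature mapping
    # Ridge-regularized least squares: beta = (H^T H + reg*I)^(-1) H^T X
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    layer_weights = beta.T                         # shape (n_in, n_hidden)
    encoded = np.tanh(X @ layer_weights)           # input to the next layer
    return layer_weights, encoded

def pretrain_stack(X, hidden_sizes, reg=1e-3, seed=0):
    """Greedy layer-wise pretraining of an encoder stack.

    Returns weight matrices that can initialize the hidden layers of a
    network before back-propagation fine-tuning.
    """
    weights, activations = [], X
    for i, n_hidden in enumerate(hidden_sizes):
        W_i, activations = elm_autoencoder_layer(activations, n_hidden, reg, seed + i)
        weights.append(W_i)
    return weights

# Example: pretrain a 784-256-64 encoder stack on stand-in random data.
X = np.random.default_rng(0).standard_normal((1000, 784))
init_weights = pretrain_stack(X, [256, 64])
print([w.shape for w in init_weights])             # [(784, 256), (256, 64)]
```

Because each layer's output weights come from a single regularized least-squares solve rather than iterative gradient descent, every layer is pretrained in one pass, which is the source of the speed advantage the abstract attributes to ELM-based pretraining.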
Pages: 608-613 (6 pages)
Related Papers
50 records in total (items [31]-[40] shown below)
  • [31] A new fast learning algorithm for multi-layer feedforward neural networks
    Zhang, De-Xian
    Liu, Can
    Wang, Zi-Qiang
    Liu, Nan-Bo
    [J]. PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2006: 2928+
  • [32] Ensemble learning for multi-layer networks
    Barber, D
    Bishop, CM
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 10, 1998, 10 : 395 - 401
  • [33] MULTI-LAYER NEURAL NETWORK LEARNING ALGORITHM BASED ON RANDOM PATTERN SEARCH METHOD
    Gao, Shangce
    Zhang, Jianchen
    Wang, Xugang
    Tang, Zheng
    [J]. INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2009, 5 (02): 489 - 502
  • [34] The layer effect on multi-layer cellular neural networks
    Ban, Jung-Chao
    Chang, Chih-Hung
    [J]. APPLIED MATHEMATICS LETTERS, 2013, 26 (07) : 706 - 709
  • [35] A Multi-layer Machine Learning-based Intrusion Detection System for Wireless Sensor Networks
    Alruhaily, Nada M.
    Ibrahim, Dina M.
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12 (04) : 281 - 288
  • [36] Incremental Extreme Learning Machine based on Cascade Neural Networks
    Wan, Yihe
    Song, Shiji
    Huang, Gao
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC 2015): BIG DATA ANALYTICS FOR HUMAN-CENTRIC SYSTEMS, 2015: 1889 - 1894
  • [37] Robust multi-layer extreme learning machine using bias-variance tradeoff
    Yu, Tian-jun
    Yan, Xue-feng
    [J]. JOURNAL OF CENTRAL SOUTH UNIVERSITY, 2020, 27 (12) : 3744 - 3753
  • [38] Fractional Stochastic Gradient Descent Based Learning Algorithm For Multi-layer Perceptron Neural Networks
    Sadiq, Alishba
    Yahya, Norashikin
    [J]. 2020 8TH INTERNATIONAL CONFERENCE ON INTELLIGENT AND ADVANCED SYSTEMS (ICIAS), 2021
  • [39] On the structure of multi-layer cellular neural networks
    Ban, Jung-Chao
    Chang, Chih-Hung
    Lin, Song-Sun
    [J]. JOURNAL OF DIFFERENTIAL EQUATIONS, 2012, 252 (08) : 4563 - 4597
  • [40] Diamond in multi-layer cellular neural networks
    Ban, Jung-Chao
    Chang, Chih-Hung
    [J]. APPLIED MATHEMATICS AND COMPUTATION, 2013, 222 : 1 - 12