The layer-wise method and the backpropagation hybrid approach to learning a feedforward neural network

Cited by: 25
Authors
Rubanov, NS [1 ]
Affiliation
[1] Belarusian State Univ, Radiophys Dept, Minsk 220050, BELARUS
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2000, Vol. 11, No. 2
Keywords
feedforward neural network; generalization error; layer-wise learning method; learning time; second-order methods;
DOI
10.1109/72.839001
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feedforward neural networks (FNNs) have been proposed to solve complex problems in pattern recognition, classification, and function approximation. Despite the general success of learning methods for FNNs, such as the backpropagation (BP) algorithm, second-order optimization algorithms, and layer-wise learning algorithms, several drawbacks remain to be overcome. In particular, two major drawbacks are convergence to local minima and long learning time. In this paper we propose an efficient learning method for an FNN that combines the BP strategy with layer-by-layer optimization. More precisely, we construct the layer-wise optimization method using the Taylor series expansion of the nonlinear operators describing the FNN and propose to update the weights of each layer by a BP-based Kaczmarz iterative procedure. The experimental results show that the new learning algorithm is stable, reduces the learning time, and improves generalization in comparison with other well-known methods.
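The abstract describes updating each layer's weights with a Kaczmarz-style iterative projection applied to the layer's linearized (Taylor-expanded) equations. The NumPy sketch below illustrates only that core idea for a single linear layer; it is not a reproduction of the paper's exact procedure. The function name `kaczmarz_layer_update`, the batch of layer targets `T`, and the relaxation parameter are illustrative assumptions; in the paper the layer targets would instead be derived from the BP/Taylor-expansion step propagated back from the output error.

```python
# Minimal sketch (NumPy only) of a Kaczmarz-style layer-wise weight update.
# Assumption: the desired pre-activations T for this layer are already known;
# in Rubanov's method they would come from the BP-based linearization step.
import numpy as np

def kaczmarz_layer_update(W, X, T, sweeps=5, relax=1.0):
    """Update weight matrix W of one linear layer so that W @ x_i ~= t_i.

    W : (n_out, n_in) current weights of the layer
    X : (n_samples, n_in) layer inputs for a batch
    T : (n_samples, n_out) desired pre-activations (layer targets)
    Each sample gives one linear constraint per output unit; the classical
    Kaczmarz row projection is applied sample by sample.
    """
    W = W.copy()
    for _ in range(sweeps):
        for x, t in zip(X, T):
            nrm = x @ x
            if nrm == 0.0:
                continue
            # Residual of the linearized layer equation for this sample.
            r = t - W @ x                      # shape (n_out,)
            # Kaczmarz projection: move each output row of W toward the
            # hyperplane {w : w @ x = t_j}, scaled by the relaxation factor.
            W += relax * np.outer(r / nrm, x)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_in, n_out, n_samples = 4, 3, 32
    X = rng.normal(size=(n_samples, n_in))
    W_true = rng.normal(size=(n_out, n_in))
    T = X @ W_true.T                           # consistent layer targets
    W = kaczmarz_layer_update(np.zeros((n_out, n_in)), X, T, sweeps=20)
    print("residual:", np.linalg.norm(X @ W.T - T))
```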
Pages: 295-305
Number of pages: 11
Related papers
50 in total
  • [1] Layer-wise learning based stochastic gradient descent method for the optimization of deep convolutional neural network. Zheng, Qinghe; Tian, Xinyu; Jiang, Nan; Yang, Mingqiang. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2019, 37 (04): 5641-5654.
  • [2] A Layer-Wise Ensemble Technique for Binary Neural Network. Xi, Jiazhen; Yamauchi, Hiroyuki. INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2021, 35 (08).
  • [3] Network with Sub-networks: Layer-wise Detachable Neural Network. Fuengfusin, Ninnart; Tamukoh, Hakaru. JOURNAL OF ROBOTICS NETWORKING AND ARTIFICIAL LIFE, 2021, 7 (04): 240-244.
  • [4] eXtreme Federated Learning (XFL): a layer-wise approach. El Mokadem, Rachid; Ben Maissa, Yann; El Akkaoui, Zineb. CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2024, 27 (05): 5741-5754.
  • [5] Craft Distillation: Layer-wise Convolutional Neural Network Distillation. Blakeney, Cody; Li, Xiaomin; Yan, Yan; Zong, Ziliang. 2020 7TH IEEE INTERNATIONAL CONFERENCE ON CYBER SECURITY AND CLOUD COMPUTING (CSCLOUD 2020) / 2020 6TH IEEE INTERNATIONAL CONFERENCE ON EDGE COMPUTING AND SCALABLE CLOUD (EDGECOM 2020), 2020: 252-257.
  • [6] Filtering-based Layer-wise Parameter Update Method for Training a Neural Network. Ji, Siyu; Zhai, Kaikai; Wen, Chenglin. 2018 INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND INFORMATION SCIENCES (ICCAIS), 2018: 389-394.
  • [7] Collaborative Layer-Wise Discriminative Learning in Deep Neural Networks. Jin, Xiaojie; Chen, Yunpeng; Dong, Jian; Feng, Jiashi; Yan, Shuicheng. COMPUTER VISION - ECCV 2016, PT VII, 2016, 9911: 733-749.
  • [8] Temperature Balancing, Layer-wise Weight Analysis, and Neural Network Training. Zhou, Yefan; Pang, Tianyu; Liu, Keqin; Martin, Charles H.; Mahoney, Michael W.; Yang, Yaoqing. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023.
  • [9] A Layer-Wise Theoretical Framework for Deep Learning of Convolutional Neural Networks. Huu-Thiet Nguyen; Li, Sitan; Cheah, Chien Chern. IEEE ACCESS, 2022, 10: 14270-14287.
  • [10] Learning Feature Hierarchies: A Layer-Wise Tag-Embedded Approach. Yuan, Zhaoquan; Xu, Changsheng; Sang, Jitao; Yan, Shuicheng; Hossain, M. Shamim. IEEE TRANSACTIONS ON MULTIMEDIA, 2015, 17 (06): 816-827.