Potential Layer-Wise Supervised Learning for Training Multi-Layered Neural Networks

Cited by: 0
Authors
Kamimura, Ryotaro [1 ,2 ]
Affiliations
[1] Tokai Univ, IT Educ Ctr, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
[2] Tokai Univ, Grad Sch Sci & Technol, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The present paper shows that greedy layer-wise supervised learning, assisted by potential learning, can improve both generalization and interpretability. Unsupervised pre-training suffers from vanishing information, just as the direct training of multi-layered networks does: as layers are stacked higher, the valuable information passed upward shrinks, and from an information-theoretic point of view it naturally diminishes as it traverses many layers. To counter this, we use layer-wise supervised pre-training to keep information from diminishing. Supervised learning has been considered a poor choice for pre-training multi-layered neural networks; however, we found that potential learning can be used to extract valuable information during supervised pre-training. With the important components extracted by potential learning, supervised pre-training becomes effective for training multi-layered neural networks. We applied the method to two data sets, an artificial data set and a banknote data set. In both cases, potential learning proved effective in increasing generalization performance, and the final representations produced by the method could be clearly interpreted.
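As a rough illustration of the training scheme the abstract describes, the following minimal sketch implements plain greedy layer-wise supervised pre-training in NumPy: each new hidden layer is trained against the labels through a temporary softmax head, then frozen before the next layer is stacked on top. The layer widths, tanh activations, learning rate, and toy two-class data are all illustrative assumptions, and the sketch omits the paper's potential-learning component (the selection of important components before pre-training), which the abstract does not specify in detail.

```python
# Minimal sketch of greedy layer-wise supervised pre-training
# (not the paper's exact algorithm; hyperparameters are assumptions).
import numpy as np

rng = np.random.default_rng(0)

def one_hot(y, k):
    out = np.zeros((y.size, k))
    out[np.arange(y.size), y] = 1.0
    return out

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_layer(h_in, y, n_hidden, n_classes, lr=0.1, epochs=200):
    """Train one hidden layer plus a throwaway softmax head on (h_in, y)."""
    n_in = h_in.shape[1]
    W = rng.normal(0, 0.1, (n_in, n_hidden))      # hidden weights (kept)
    V = rng.normal(0, 0.1, (n_hidden, n_classes)) # head weights (discarded)
    t = one_hot(y, n_classes)
    for _ in range(epochs):
        h = np.tanh(h_in @ W)                     # hidden activations
        p = softmax(h @ V)                        # temporary classifier output
        g_out = (p - t) / len(y)                  # softmax cross-entropy gradient
        g_h = (g_out @ V.T) * (1.0 - h**2)        # back-prop through tanh
        V -= lr * h.T @ g_out
        W -= lr * h_in.T @ g_h
    return W

# Toy data: two Gaussian blobs (a stand-in for the paper's artificial data set).
X = np.vstack([rng.normal(-1, 1, (100, 4)), rng.normal(1, 1, (100, 4))])
y = np.repeat([0, 1], 100)

# Greedy stacking: each layer is pre-trained on the previous layer's output,
# then frozen; the final features would normally feed a fine-tuned classifier.
h, weights = X, []
for width in (8, 6, 4):                           # layer sizes are assumptions
    W = train_layer(h, y, width, n_classes=2)
    weights.append(W)
    h = np.tanh(h @ W)                            # freeze and feed forward

print("top-layer feature shape:", h.shape)
```

Because every layer is trained directly against the labels rather than by reconstruction, label-relevant information is re-injected at each level, which is the mechanism the abstract credits for preventing information from diminishing across layers.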
Pages: 2568-2575
Page count: 8
Related Papers
50 records in total
  • [41] Adesuyi, Tosin A.; Kim, Byeong Man. A Layer-wise Perturbation Based Privacy Preserving Deep Neural Networks. 2019 1st International Conference on Artificial Intelligence in Information and Communication (ICAIIC 2019), 2019: 389-394.
  • [42] Kubota, Shohei; Hayashi, Hideaki; Hayase, Tomohiro; Uchida, Seiichi. Layer-wise Interpretation of Deep Neural Networks Using Identity Initialization. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 3945-3949.
  • [43] Fuengfusin, Ninnart; Tamukoh, Hakaru. Network with Sub-networks: Layer-wise Detachable Neural Network. Journal of Robotics, Networking and Artificial Life, 2021, 7(4): 240-244.
  • [44] Binder, Alexander; Montavon, Gregoire; Lapuschkin, Sebastian; Mueller, Klaus-Robert; Samek, Wojciech. Layer-Wise Relevance Propagation for Neural Networks with Local Renormalization Layers. Artificial Neural Networks and Machine Learning - ICANN 2016, Pt II, 2016, 9887: 63-71.
  • [45] Yu, Dong; Xiong, Wayne; Droppo, Jasha; Stolcke, Andreas; Ye, Guoli; Li, Jinyu; Zweig, Geoffrey. Deep Convolutional Neural Networks with Layer-wise Context Expansion and Attention. 17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), 2016: 17-21.
  • [46] Kamimura, R.; Yoshida, F. Teacher-directed Learning: Information-theoretic Competitive Learning in Supervised Multi-layered Networks. Connection Science, 2003, 15(2-3): 117-140.
  • [47] Tang, Yin; Teng, Qi; Zhang, Lei; Min, Fuhong; He, Jun. Layer-Wise Training Convolutional Neural Networks with Smaller Filters for Human Activity Recognition Using Wearable Sensors. IEEE Sensors Journal, 2021, 21(1): 581-592.
  • [48] Ranguelova, Elena; Pauwels, Eric J.; Berkhout, Joost. Evaluating Layer-wise Relevance Propagation Explainability Maps for Artificial Neural Networks. 2018 IEEE 14th International Conference on e-Science (e-Science 2018), 2018: 377-378.
  • [49] Li, Hengyi; Yue, Xuebin; Wang, Zhichen; Chai, Zhilei; Wang, Wenwen; Tomiyama, Hiroyuki; Meng, Lin. Optimizing the Deep Neural Networks by Layer-Wise Refined Pruning and the Acceleration on FPGA. Computational Intelligence and Neuroscience, 2022, 2022.
  • [50] Mern, John; Gupta, Jayesh K.; Kochenderfer, Mykel J. Layer-wise Synapse Optimization for Implementing Neural Networks on General Neuromorphic Architectures. 2017 IEEE Symposium Series on Computational Intelligence (SSCI), 2017: 3314-3321.