Potential Layer-Wise Supervised Learning for Training Multi-Layered Neural Networks

Citations: 0
Authors
Kamimura, Ryotaro [1 ,2 ]
Affiliations
[1] Tokai Univ, IT Educ Ctr, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
[2] Tokai Univ, Grad Sch Sci & Technol, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
Keywords: (none listed)
DOI: not available
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
This paper shows that greedy layer-wise supervised learning, aided by potential learning, can improve both generalization and interpretability. Unsupervised pre-training suffers from vanishing information, just as simple multi-layered network training does: as the layers grow deeper, valuable information diminishes considerably, which is natural from an information-theoretic point of view. To counter this, we use layer-wise supervised training to prevent information from diminishing. Supervised learning has been considered unsuitable for pre-training multi-layered neural networks; however, we found that the new potential learning can effectively extract valuable information through supervised pre-training. With the help of the important components extracted by potential learning, supervised pre-training becomes effective for training multi-layered neural networks. We applied the method to two data sets, an artificial data set and a banknote data set. In both cases, potential learning proved effective in increasing generalization performance. In addition, we showed that the final representations produced by the method can be clearly interpreted.
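The greedy layer-wise supervised pipeline the abstract describes (train one hidden layer against the labels through a temporary head, freeze it, then stack and train the next layer on top) can be sketched as follows. This is only a minimal illustration of that training schedule, not the authors' algorithm: the paper's "potential learning" step for selecting important components is not specified here, so the sketch uses a plain logistic head per layer, and all names (`train_layer`, the toy XOR-style data) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_layer(H, y, n_hidden, epochs=500, lr=0.5):
    """Train one hidden layer plus a temporary logistic head on (H, y);
    the head is discarded and only the layer weights are returned."""
    n_in = H.shape[1]
    W = rng.normal(0.0, 0.5, (n_in, n_hidden))   # layer weights (kept)
    v = rng.normal(0.0, 0.5, (n_hidden, 1))      # temporary head (discarded)
    for _ in range(epochs):
        Z = sigmoid(H @ W)            # hidden activations
        p = sigmoid(Z @ v)            # supervised head prediction
        err = p - y                   # cross-entropy gradient w.r.t. logits
        grad_v = Z.T @ err / len(y)
        grad_W = H.T @ ((err @ v.T) * Z * (1.0 - Z)) / len(y)
        v -= lr * grad_v
        W -= lr * grad_W
    return W

# toy two-class problem (XOR-like), standing in for the paper's artificial data
X = rng.uniform(-1.0, 1.0, (200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float).reshape(-1, 1)

# greedy layer-wise supervised pre-training: each new layer is trained
# against the labels, then frozen before the next layer is added on top
layers, H = [], X
for width in (8, 8):
    W = train_layer(H, y, width)
    layers.append(W)
    H = sigmoid(H @ W)   # frozen representation fed to the next layer

# final logistic readout trained on the top-layer representation
v = rng.normal(0.0, 0.5, (H.shape[1], 1))
for _ in range(500):
    p = sigmoid(H @ v)
    v -= 0.5 * (H.T @ (p - y)) / len(y)

acc = np.mean((sigmoid(H @ v) > 0.5) == y)
```

Each call to `train_layer` optimizes only the newest layer; earlier layers stay fixed, which is what makes the procedure greedy. In the paper, potential learning would additionally weight or select the extracted components before the next layer is stacked.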
Pages: 2568-2575
Page count: 8
Related papers (50 records in total)
  • [21] Learning characteristics of multi-layered higher order neural networks
    Ma, L.
    Miyajima, H.
    Shigei, N.
    Advances in Computational Methods in Sciences and Engineering 2005, Vols 4 A & 4 B, 2005, 4A-4B : 361 - 364
  • [22] Local Selective Learning for Interpreting Multi-Layered Neural Networks
    Kamimura, Ryotaro
    Kitajima, Ryozo
    Sakai, Hiroyuki
    2018 JOINT 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 19TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2018, : 115 - 122
  • [23] Forward layer-wise learning of convolutional neural networks through separation index maximizing
    Karimi, Ali
    Kalhor, Ahmad
    Tabrizi, Melika Sadeghi
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [24] Learning to Prune Deep Neural Networks via Layer-wise Optimal Brain Surgeon
    Dong, Xin
    Chen, Shangyu
    Pan, Sinno Jialin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [25] REINFORCEMENT LEARNING-BASED LAYER-WISE QUANTIZATION FOR LIGHTWEIGHT DEEP NEURAL NETWORKS
    Jung, Juri
    Kim, Jonghee
    Kim, Youngeun
    Kim, Changick
    2020 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2020, : 3070 - 3074
  • [26] A Comparison of Multi-Layered Ultrasonic Joints Fabrication using Layer-Wise vs Stacked Approaches for Battery Interconnects
    Sharma, Vijay
    Das, Abhishek
    MANUFACTURING LETTERS, 2025, 43 : 75 - 78
  • [27] Unsupervised Layer-Wise Model Selection in Deep Neural Networks
    Ludovic, Arnold
    Helene, Paugam-Moisy
    Michele, Sebag
    ECAI 2010 - 19TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2010, 215 : 915 - 920
  • [28] Stochastic Neural Networks with Layer-Wise Adjustable Sequence Length
    Wang, Ziheng
    Reviriego, Pedro
    Niknia, Farzad
    Liu, Shanshan
    Gao, Zhen
    Lombardi, Fabrizio
    2024 IEEE 24TH INTERNATIONAL CONFERENCE ON NANOTECHNOLOGY, NANO 2024, 2024, : 436 - 441
  • [29] Multithreaded Layer-wise Training of Sparse Deep Neural Networks using Compressed Sparse Column
    Mofrad, Mohammad Hasanzadeh
    Melhem, Rami
    Ahmad, Yousuf
    Hammoud, Mohammad
    2019 IEEE HIGH PERFORMANCE EXTREME COMPUTING CONFERENCE (HPEC), 2019,
  • [30] Temperature Balancing, Layer-wise Weight Analysis, and Neural Network Training
    Zhou, Yefan
    Pang, Tianyu
    Liu, Keqin
    Martin, Charles H.
    Mahoney, Michael W.
    Yang, Yaoqing
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,