Potential Layer-Wise Supervised Learning for Training Multi-Layered Neural Networks

Cited: 0
Authors: Kamimura, Ryotaro [1,2]
Affiliations:
[1] Tokai Univ, IT Educ Ctr, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
[2] Tokai Univ, Grad Sch Sci & Technol, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
The present paper shows that greedy layer-wise supervised learning, aided by potential learning, can improve both generalization and interpretability. Unsupervised pre-training, like plain training of multi-layered networks, suffers from vanishing information: the higher the layer, the less valuable information remains, because information passing through many layers naturally diminishes from an information-theoretic point of view. To prevent this, we use layer-wise supervised pre-training. Supervised learning has generally been considered poorly suited to pre-training multi-layered neural networks; however, we found that the new potential learning can be used to extract valuable information during supervised pre-training. With the important components extracted by potential learning, supervised pre-training becomes effective for training multi-layered neural networks. We applied the method to two data sets, an artificial data set and a banknote data set. In both cases, potential learning proved effective in increasing generalization performance. In addition, the results suggest that the final representations obtained by the method can be clearly interpreted.
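The abstract describes greedy layer-wise supervised pre-training as the backbone of the method. The minimal Python (PyTorch) sketch below illustrates only that generic scheme, not the paper's potential learning itself: the potentiality-based selection of important components is omitted, and the layer sizes, sigmoid activation, optimizer, and epoch counts are illustrative assumptions rather than the authors' settings. Each hidden layer is trained with its own temporary softmax head on the supervised target, then frozen and used to produce the input of the next layer.

    # Sketch of greedy layer-wise supervised pre-training (the paper's
    # potential-learning component is NOT implemented here).
    import torch
    import torch.nn as nn

    def pretrain_layers_supervised(x, y, layer_sizes, n_classes, epochs=50, lr=1e-2):
        """Greedily pre-train each hidden layer with a temporary output head."""
        layers, inputs, in_dim = [], x, x.shape[1]
        for out_dim in layer_sizes:
            layer = nn.Sequential(nn.Linear(in_dim, out_dim), nn.Sigmoid())
            head = nn.Linear(out_dim, n_classes)  # temporary supervised head
            opt = torch.optim.SGD(list(layer.parameters()) + list(head.parameters()), lr=lr)
            loss_fn = nn.CrossEntropyLoss()
            for _ in range(epochs):  # supervised pre-training of this layer only
                opt.zero_grad()
                loss = loss_fn(head(layer(inputs)), y)
                loss.backward()
                opt.step()
            layers.append(layer)
            with torch.no_grad():  # freeze the layer and feed its output forward
                inputs = layer(inputs)
            in_dim = out_dim
        # Stack the pre-trained layers with a final head for end-to-end fine-tuning.
        return nn.Sequential(*layers, nn.Linear(in_dim, n_classes))

    # Example (assumed shapes): x is a float tensor (n_samples, n_features),
    # y is a long tensor of class labels.
    # model = pretrain_layers_supervised(x, y, layer_sizes=[10, 10], n_classes=2)

The returned network would then be fine-tuned end to end with an ordinary supervised loop; whether and how the authors fine-tune after pre-training is not specified in the abstract.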
Pages: 2568-2575
Page count: 8
Related Papers (50 records)
  • [31] Kim, Seokhwan. Deep Recurrent Neural Networks with Layer-Wise Multi-Head Attentions for Punctuation Restoration. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 7280-7284.
  • [32] Kamimura, Ryotaro. Connective Potential Information for Collectively Interpreting Multi-Layered Neural Networks. 2020 IEEE Symposium Series on Computational Intelligence (SSCI), 2020: 3033-3042.
  • [33] Xu, Kaisheng; Shen, Xu; Yao, Ting; Tian, Xinmei; Mei, Tao. Greedy Layer-Wise Training of Long Short Term Memory Networks. 2018 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), 2018.
  • [34] Fukushima, Kunihiko. Training Multi-Layered Neural Network Neocognitron. Neural Networks, 2013, 40: 18-31.
  • [35] Shin, Yeonjong. Effects of Depth, Width, and Initialization: A Convergence Analysis of Layer-Wise Training for Deep Linear Neural Networks. Analysis and Applications, 2022, 20(1): 73-119.
  • [36] Malkin, Jonathan; Subramanya, Amarnag; Bilmes, Jeff. On the Semi-Supervised Learning of Multi-Layered Perceptrons. INTERSPEECH 2009: 10th Annual Conference of the International Speech Communication Association, 2009: 680-683.
  • [37] Jia, Wohuan; Zhang, Shaoshuai; Jiang, Yue; Xu, Li. Interpreting Convolutional Neural Networks via Layer-Wise Relevance Propagation. Artificial Intelligence and Security (ICAIS 2022), Part I, 2022, 13338: 457-467.
  • [38] Jodlowiec, Marcin; Albu, Adriana; Wolk, Krzysztof; Nguyen Thai-Nghe; Karasinski, Adrian. Layer-Wise Optimization of Contextual Neural Networks with Dynamic Field of Aggregation. Intelligent Information and Database Systems (ACIIDS 2022), Part II, 2022, 13758: 302-312.
  • [39] Yang, Yinchong; Tresp, Volker; Wunderle, Marius; Fasching, Peter A. Explaining Therapy Predictions with Layer-wise Relevance Propagation in Neural Networks. 2018 IEEE International Conference on Healthcare Informatics (ICHI), 2018: 152-162.
  • [40] Liebenwein, Lucas; Maalouf, Alaa; Gal, Oren; Feldman, Dan; Rus, Daniela. Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021.