Repeated Potentiality Augmentation for Multi-layered Neural Networks

Cited by: 0
Authors
Kamimura, Ryotaro [1 ,2 ]
Affiliations
[1] Tokai Univ, 2880 Kamimatsuo Nishi Ku, Kumamoto 8615289, Japan
[2] Kumamoto Drone Technol & Dev Fdn, 2880 Kamimatsuo Nishi Ku, Kumamoto 8615289, Japan
Source
ADVANCES IN INFORMATION AND COMMUNICATION, FICC, VOL 2 | 2023 / Vol. 652
Keywords
Equi-potentiality; Total potentiality; Relative potentiality; Collective interpretation; Partial interpretation; MUTUAL INFORMATION; LEARNING-MODELS; CLASSIFICATION; MAXIMIZE; INPUT; MAPS;
DOI
10.1007/978-3-031-28073-3_9
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
The present paper proposes a new method to augment the potentiality of components in neural networks. The basic hypothesis is that all components should have equal potentiality (equi-potentiality) to be used for learning. This equi-potentiality of components has implicitly played a critical role in improving multi-layered neural networks. We introduce the total potentiality and the relative potentiality for each hidden layer, and we force networks to increase their potentiality as much as possible so that equi-potentiality can be realized. In addition, the potentiality augmentation is repeated whenever the potentiality tends to decrease, which increases the chance that all components are used as equally as possible. We applied the method to a bankruptcy data set. By maintaining the equi-potentiality of components through the repeated process of potentiality augmentation and reduction, we observed improved generalization. Then, by considering all representations produced by the repeated potentiality augmentation, we can interpret which inputs contribute to the final performance of the networks.
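The abstract describes the procedure only at a high level. As a rough illustration, the following Python sketch assumes that a hidden component's potentiality is measured by the L2 norm of its incoming weights, that the relative potentiality is this norm divided by the layer maximum, and that augmentation multiplicatively boosts the weaker components whenever the total potentiality drops during training. The function names, the norm-based measure, the boosting rule, and the `grad_step` hook are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def relative_potentiality(W):
    """Relative potentiality of each hidden unit of a weight matrix W
    (shape: inputs x hidden units), normalized by the layer maximum so
    every value lies in (0, 1]. The L2 column norm as the potentiality
    measure is an illustrative assumption."""
    strength = np.sqrt((W ** 2).sum(axis=0))      # one value per hidden unit
    return strength / (strength.max() + 1e-12)

def total_potentiality(W):
    """Sum of relative potentialities; maximal when all units are used equally."""
    return relative_potentiality(W).sum()

def augment_potentiality(W, rate=0.1):
    """Boost weaker units toward the strongest one (equi-potentiality pressure)."""
    r = relative_potentiality(W)
    return W * (1.0 + rate * (1.0 - r))           # weaker columns are amplified more

def train(W, grad_step, epochs=100, tol=1e-3):
    """Hypothetical training loop: re-apply augmentation whenever the
    total potentiality of the layer tends to decrease."""
    prev = total_potentiality(W)
    for _ in range(epochs):
        W = grad_step(W)                          # ordinary error-driven update
        cur = total_potentiality(W)
        if cur < prev - tol:                      # potentiality tends to decrease
            W = augment_potentiality(W)           # repeat the augmentation step
            cur = total_potentiality(W)
        prev = cur
    return W
```

In this reading, repeated augmentation counteracts the usual tendency of error-driven learning to concentrate strength in a few hidden units, keeping all components available for learning as the abstract suggests.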
Pages: 117-134
Number of pages: 18