Neural self-compressor: Collective interpretation by compressing multi-layered neural networks into non-layered networks

Cited: 14
Authors
Kamimura, Ryotaro [1]
Affiliation
[1] Tokai Univ, IT Educ Ctr, 4-1-1 Kitakaname, Hiratsuka, Kanagawa 2591292, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
Model compression; self-compression; collective interpretation; mutual information; multi-layered neural networks; INFORMATION MAXIMIZATION; MUTUAL INFORMATION; RULE EXTRACTION;
DOI
10.1016/j.neucom.2018.09.036
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The present paper proposes a new method, called "neural self-compressors," to compress multi-layered neural networks into the simplest possible ones (i.e., without hidden layers) to aid in the interpretation of relations between inputs and outputs. Though neural networks have shown great success in improving generalization, interpreting their internal representations becomes a serious problem as the number of hidden layers and their corresponding connection weights grows. To overcome this interpretation problem, we introduce a method that compresses multi-layered neural networks into ones without hidden layers. In addition, this method simplifies entangled weights as much as possible by maximizing mutual information between inputs and outputs. In this way, the final connection weights can be interpreted as easily as in logistic regression analysis. The method was applied to four data sets: a symmetric data set, an ovarian cancer data set, a restaurant data set, and a credit card holders' default data set. With the first, symmetric data set, we show intuitively how the present method produces interpretable outputs. In all the other cases, we succeeded in compressing multi-layered neural networks into their simplest forms with the help of mutual information maximization. In addition, by de-correlating outputs, we were able to transform the connection weights from values close to regression coefficients into ones with more explicit features. (C) 2018 Elsevier B.V. All rights reserved.
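The core idea of the abstract, collapsing a multi-layered network into a single input-to-output weight matrix, can be illustrated with a minimal sketch. This is not the paper's actual algorithm (which additionally maximizes mutual information between inputs and outputs); it only shows the simplest case where the collapse is exact, namely a stack of purely linear layers. All matrix shapes below are hypothetical.

```python
import numpy as np

# Hypothetical three-layer linear network: input (4) -> 5 -> 3 -> output (2).
# For linear layers, the composite mapping is itself linear, so the whole
# stack can be "compressed" into one weight matrix without hidden layers.
# (With nonlinear activations, the paper's method computes an approximate
# compressed network instead; that step is not reproduced here.)
rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 4))   # input -> hidden 1
W2 = rng.standard_normal((3, 5))   # hidden 1 -> hidden 2
W3 = rng.standard_normal((2, 3))   # hidden 2 -> output

# Single input->output matrix of the hidden-layer-free network.
W_compressed = W3 @ W2 @ W1

x = rng.standard_normal(4)
deep_out = W3 @ (W2 @ (W1 @ x))    # layer-by-layer forward pass
flat_out = W_compressed @ x        # compressed forward pass

# The two outputs agree, so W_compressed can be read off directly,
# much like coefficients of a (linear) regression model.
assert np.allclose(deep_out, flat_out)
```

The interpretive payoff is that `W_compressed` has one weight per input-output pair, which is what makes the compressed network readable in the same way as regression coefficients.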
Pages: 12-36
Page count: 25
Related Papers
50 records
  • [21] Mutual Information Maximization for Improving and Interpreting Multi-Layered Neural Networks
    Kamimura, Ryotaro
    2017 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2017,
  • [22] Supervised Semi-Autoencoder Learning for Multi-Layered Neural Networks
    Kamimura, Ryotaro
    Takeuchi, Haruhiko
    2017 JOINT 17TH WORLD CONGRESS OF INTERNATIONAL FUZZY SYSTEMS ASSOCIATION AND 9TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (IFSA-SCIS), 2017,
  • [23] Frequency Estimation from Waveforms using Multi-Layered Neural Networks
    Verma, Prateek
    Schafer, Ronald W.
    17TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2016), VOLS 1-5: UNDERSTANDING SPEECH PROCESSING IN HUMANS AND MACHINES, 2016, : 2165 - 2169
  • [24] Automatic classification of volcanic earthquakes by using multi-layered neural networks
    Falsaperla, S
    Graziani, S
    Nunnari, G
    Spampinato, S
    NATURAL HAZARDS, 1996, 13 (03) : 205 - 228
  • [25] Connective Potential Information for Collectively Interpreting Multi-Layered Neural Networks
    Kamimura, Ryotaro
    2020 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2020, : 3033 - 3042
  • [26] Artificial vision by multi-layered neural networks: Neocognitron and its advances
    Fukushima, Kunihiko
    Neural Networks, 2013, 37 : 103 - 119
  • [27] Active noise control using multi-layered perceptron neural networks
    Tokhi, MO
    Wood, R
    JOURNAL OF LOW FREQUENCY NOISE VIBRATION AND ACTIVE CONTROL, 1997, 16 (02): : 109 - 144
  • [28] Selective Information Control and Layer-Wise Partial Collective Compression for Multi-Layered Neural Networks
    Kamimura, Ryotaro
    INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, ISDA 2021, 2022, 418 : 121 - 131
  • [29] LAYERED NEURAL NETWORKS
    DOMANY, E
    KINZEL, W
    MEIR, R
    JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 1989, 22 (12): : 2081 - 2102
  • [30] New Algebraic Activation Function for Multi-Layered Feed Forward Neural Networks
    Babu, K. V. Naresh
    Edla, Damodar Reddy
    IETE JOURNAL OF RESEARCH, 2017, 63 (01) : 71 - 79