ON NETWORK SCIENCE AND MUTUAL INFORMATION FOR EXPLAINING DEEP NEURAL NETWORKS

Cited by: 0
Authors:
Davis, Brian [1]
Bhatt, Umang [1,2]
Bhardwaj, Kartikeya [1,3]
Marculescu, Radu [1,4]
Moura, Jose M. F. [1]
Affiliations:
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[2] Univ Cambridge, Cambridge, England
[3] Arm Inc, Cambridge, England
[4] Univ Texas Austin, Austin, TX 78712 USA
Keywords:
deep learning; information theory; network science; interpretability
DOI:
10.1109/icassp40776.2020.9053078
Chinese Library Classification (CLC):
O42 [Acoustics]
Subject Classification Codes:
070206; 082403
Abstract:
In this paper, we present a new approach to interpreting deep learning models. By coupling mutual information with network science, we explore how information flows through feedforward networks. We show that efficiently approximating mutual information allows us to create an information measure that quantifies how much information flows between any two neurons of a deep learning model. To that end, we propose NIF, Neural Information Flow, a technique for codifying information flow that exposes deep learning model internals and provides feature attributions.
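The abstract describes quantifying information flow between any two neurons via an efficient mutual-information approximation. As a rough illustration only (not the authors' NIF implementation), the sketch below estimates mutual information between two neurons' activations in a toy, randomly weighted feedforward network using a simple histogram (binning) estimator; the layer sizes, bin count, and all variable names are assumptions made for this example, and the paper's actual approximation may differ.

```python
# Illustrative sketch (assumed setup, not the paper's NIF method):
# estimate mutual information between the activations of two neurons
# in a small feedforward network with a histogram-based estimator.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy network: 8 inputs -> 16 hidden -> 4 outputs, random weights (assumption).
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 4))

# Pass a batch of random inputs through the network and record activations.
X = rng.normal(size=(5000, 8))
H = relu(X @ W1)   # hidden-layer activations, shape (5000, 16)
Y = relu(H @ W2)   # output-layer activations, shape (5000, 4)

def mutual_information(a, b, bins=16):
    """Histogram (binning) estimate of MI, in nats, between two activation vectors."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()                    # joint distribution p(a, b)
    px = pxy.sum(axis=1, keepdims=True)          # marginal p(a)
    py = pxy.sum(axis=0, keepdims=True)          # marginal p(b)
    nz = pxy > 0                                 # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# "Information flow" proxy between hidden neuron 3 and output neuron 0 (arbitrary choice).
print(mutual_information(H[:, 3], Y[:, 0]))
```

The histogram estimator is chosen here only because it is self-contained and easy to inspect; repeating the computation over all neuron pairs would give a pairwise information-flow map in the spirit of what the abstract describes.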
Pages: 8399-8403
Number of pages: 5
Related Papers
50 records in total
  • [31] Deep Neural Networks for Network Routing
    Reis, Joao
    Rocha, Miguel
    Truong Khoa Phan
    Griffin, David
Le, Franck
    Rio, Miguel
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [32] Pruning by explaining: A novel criterion for deep neural network pruning
    Yeom, Seul-Ki
    Seegerer, Philipp
    Lapuschkin, Sebastian
    Binder, Alexander
    Wiedemann, Simon
    Mueller, Klaus-Robert
    Samek, Wojciech
    [J]. PATTERN RECOGNITION, 2021, 115
  • [33] Explaining Deep Neural Network Models with Adversarial Gradient Integration
    Pan, Deng
    Li, Xin
    Zhu, Dongxiao
    [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2876 - 2883
  • [34] Mutual Information Based Learning Rate Decay for Stochastic Gradient Descent Training of Deep Neural Networks
    Vasudevan, Shrihari
    [J]. ENTROPY, 2020, 22 (05)
  • [35] Perception Science in the Age of Deep Neural Networks
    VanRullen, Rufin
    [J]. FRONTIERS IN PSYCHOLOGY, 2017, 8
  • [36] Evolving a cooperative population of neural networks by minimizing mutual information
    Liu, Y
    Yao, X
    Zhao, QF
    Higuchi, T
    [J]. PROCEEDINGS OF THE 2001 CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1 AND 2, 2001, : 384 - 389
  • [37] An Image Quality Index Based on Mutual Information and Neural Networks
    Deriche, Mohamed
    [J]. ARABIAN JOURNAL FOR SCIENCE AND ENGINEERING, 2014, 39 (03) : 1983 - 1993
  • [39] Textile defects Identification Based on Neural Networks and Mutual Information
    Abdel-Azim, Gamil
    Nasri, Salem
    [J]. 2013 INTERNATIONAL CONFERENCE ON COMPUTER APPLICATIONS TECHNOLOGY (ICCAT), 2013,
  • [40] Mutual Information Generation for Improving Generalization and Interpretation in Neural Networks
    Kamimura, Ryotaro
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,