ON NETWORK SCIENCE AND MUTUAL INFORMATION FOR EXPLAINING DEEP NEURAL NETWORKS

Citations: 0
Authors
Davis, Brian [1]
Bhatt, Umang [1,2]
Bhardwaj, Kartikeya [1,3]
Marculescu, Radu [1,4]
Moura, Jose M. F. [1]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[2] Univ Cambridge, Cambridge, England
[3] Arm Inc, Cambridge, England
[4] Univ Texas Austin, Austin, TX 78712 USA
Keywords
deep learning; information theory; network science; interpretability;
DOI
10.1109/icassp40776.2020.9053078
CLC Number (Chinese Library Classification)
O42 [Acoustics]
Discipline Classification Codes
070206; 082403
Abstract
In this paper, we present a new approach to interpret deep learning models. By coupling mutual information with network science, we explore how information flows through feedforward networks. We show that efficiently approximating mutual information allows us to create an information measure that quantifies how much information flows between any two neurons of a deep learning model. To that end, we propose NIF, Neural Information Flow, a technique for codifying information flow that exposes deep learning model internals and provides feature attributions.
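Illustrative sketch (not taken from the paper): the abstract's core quantity is an approximation of the mutual information between the activations of any two neurons. The short Python example below estimates such a quantity with a simple histogram (binned) approximation; the function name, bin count, and synthetic activations are assumptions made for illustration only and do not reproduce the NIF algorithm.

import numpy as np

def mutual_information(x, y, bins=16):
    # Binned estimate of I(X; Y) in nats for two 1-D samples of neuron activations.
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # empirical joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)      # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)      # marginal p(y)
    nz = pxy > 0                             # skip empty bins to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Hypothetical usage: activations of two neurons recorded over a batch of inputs.
rng = np.random.default_rng(0)
a = rng.normal(size=2000)                    # activations of neuron i
b = 0.8 * a + 0.2 * rng.normal(size=2000)    # neuron j, partly driven by neuron i
print(mutual_information(a, b))              # well above the near-zero value for independent neurons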
Pages: 8399-8403
Page count: 5