DNNViz: Training Evolution Visualization for Deep Neural Networks

Cited by: 1
Authors:
Clavien, Gil [1 ]
Alberti, Michele [1 ]
Pondenkandath, Vinaychandran [1 ]
Ingold, Rolf [1 ]
Liwicki, Marcus [1 ,2 ]
Affiliations:
[1] Univ Fribourg, Document Image & Voice Anal Grp DIVA, Fribourg, Switzerland
[2] Lulea Univ Technol, Machine Learning Grp, Lulea, Sweden
Funding:
Swiss National Science Foundation
DOI:
10.1109/SDS.2019.00-13
CLC number:
TP18 [Theory of Artificial Intelligence]
Discipline codes:
081104; 0812; 0835; 1405
Abstract:
In this paper, we present novel visualization strategies for inspecting, displaying, browsing, comparing, and visualizing deep neural networks (DNNs) and their internal state during training. Despite their broad use across many fields of application, deep learning techniques are still often referred to as "black boxes". Trying to get a better understanding of these models and how they work is a thriving field of research. To this end, we contribute a visualization mechanism designed explicitly to enable simple and efficient introspection of deep neural networks. The mechanism processes, computes, and displays neuron activations during the training of a deep neural network. We furthermore demonstrate the usefulness of this visualization technique through different use cases: class similarity detection, hints for network pruning, and adversarial attack detection. We implemented this mechanism in an open source tool called DNNViz, which is integrated into DeepDIVA, a highly functional PyTorch framework for reproducible experiments.
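The core idea the abstract describes — recording each layer's neuron activations at every training step so they can later be displayed and compared — can be illustrated with a small, framework-agnostic sketch. This is a hypothetical illustration, not the authors' implementation: DNNViz hooks into DeepDIVA/PyTorch, whereas the toy `Layer` and `ActivationRecorder` classes below only demonstrate the hook-and-record pattern.

```python
class Layer:
    """A toy fully-connected layer with a ReLU non-linearity."""

    def __init__(self, weights):
        self.weights = weights   # list of weight rows, one per output neuron
        self.hooks = []          # callbacks fired after every forward pass

    def forward(self, x):
        # Compute each neuron's activation: ReLU of the dot product.
        out = [max(0.0, sum(w * xi for w, xi in zip(row, x)))
               for row in self.weights]
        # Notify observers (e.g. a visualizer) without touching training code.
        for hook in self.hooks:
            hook(out)
        return out


class ActivationRecorder:
    """Stores the layer's activations at each step for later display."""

    def __init__(self):
        self.history = []

    def __call__(self, activations):
        self.history.append(list(activations))


# Attach a recorder to the layer and run two "training steps".
layer = Layer([[1.0, -1.0], [0.5, 0.5]])
recorder = ActivationRecorder()
layer.hooks.append(recorder)

for x in [[1.0, 0.0], [0.0, 1.0]]:
    layer.forward(x)

print(recorder.history)  # [[1.0, 0.5], [0.0, 0.5]]
```

In PyTorch, the same pattern would typically use `Module.register_forward_hook` to capture activations non-invasively, which is one reason this kind of introspection can be bolted onto an existing training loop.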
Pages: 19-24
Page count: 6