Investigating Learning in Deep Neural Networks Using Layer-Wise Weight Change

Cited by: 0

Authors
Agrawal, Ayush Manish [1 ,5 ,6 ]
Tendle, Atharva [1 ,5 ,6 ]
Sikka, Harshvardhan [2 ,5 ,6 ]
Singh, Sahib [4 ,5 ,6 ]
Kayid, Amr [3 ,5 ,6 ]
Affiliations
[1] Univ Nebraska, Lincoln, NE 68588 USA
[2] Georgia Inst Technol, Atlanta, GA 30332 USA
[3] German Univ Cairo, Cairo, Egypt
[4] Ford R&A, Dearborn, MI USA
[5] Manifold Comp, 15805 Oakridge Rd, Morgan Hill, CA 95037 USA
[6] OpenMined, Oxford, Oxon, England
Keywords
Deep neural networks; Relative weight change; Convolutional neural networks; Learning trends
DOI
10.1007/978-3-030-80126-7_48
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Understanding the learning dynamics of deep neural networks is of significant interest to the research community, as it can shed light on the black-box nature of neural nets. In this work, we analyze layer-wise learning trends by measuring the relative change in a deep neural network's weights during training. Through a controlled yet exhaustive set of experiments, we identify key trends that could lead to a better understanding of how neural networks learn and pave the way for better training regimes. We study these learning trends on widely used convolutional neural network architectures and datasets. Our work offers a simple yet novel approach to interpreting neural networks that differs from previous investigative studies.
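The abstract's core measurement, layer-wise relative weight change, lends itself to a compact implementation. Below is a minimal PyTorch sketch, assuming the relative change of a layer is the L1 norm of its weight update between two checkpoints divided by the L1 norm of the earlier weights; the norm choice, the function names, and the `train_one_epoch` helper are illustrative assumptions, not the paper's exact definition.

```python
import torch

def snapshot_weights(model: torch.nn.Module) -> dict:
    """Return a detached copy of every parameter tensor, keyed by parameter name."""
    return {name: param.detach().clone() for name, param in model.named_parameters()}

def relative_weight_change(prev: dict, model: torch.nn.Module) -> dict:
    """Per-layer relative change between a saved snapshot and the current weights."""
    rwc = {}
    for name, param in model.named_parameters():
        delta = (param.detach() - prev[name]).abs().sum()   # L1 norm of the update
        scale = prev[name].abs().sum().clamp_min(1e-12)     # L1 norm of the old weights
        rwc[name] = (delta / scale).item()
    return rwc

# Usage around one epoch of training (train_one_epoch is a hypothetical stand-in):
#   prev = snapshot_weights(model)
#   train_one_epoch(model, loader, optimizer)
#   for name, value in relative_weight_change(prev, model).items():
#       print(f"{name}: {value:.4e}")
```

Recording this quantity per layer after every epoch yields the kind of layer-wise learning curves the paper analyzes.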
Pages: 678-693 (16 pages)