Deep Networks as Paths on the Manifold of Neural Representations

Cited by: 0
Authors
Lange, Richard D. [1]
Kwok, Devin [2]
Matelsky, Jordan [1]
Wang, Xinyue [1]
Rolnick, David [2]
Kording, Konrad P. [1]
Affiliations
[1] University of Pennsylvania, Department of Neurobiology, Philadelphia, PA 19104 USA
[2] McGill University, Mila - Quebec AI Institute, Montreal, QC, Canada
Keywords
FRAMEWORK; KERNELS;
DOI
None available
CLC Number
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Deep neural networks implement a sequence of layer-by-layer operations, each of which is relatively easy to understand, yet the resulting overall computation is generally difficult to interpret. An intuitive hypothesis is that the role of each layer is to reformat information so as to reduce the "distance" to the desired outputs. Under this spatial analogy, the layer-wise computation implemented by a deep neural network can be viewed as a path along a high-dimensional manifold of neural representations: each hidden layer transforms its inputs by taking a step of a particular size and direction along the manifold, ideally moving towards the desired network outputs. We formalize this intuition by leveraging recent advances in metric representational similarity. We extend existing representational distance methods by defining and characterizing the manifold on which neural representations live, allowing us to compute quantities such as the shortest path or tangent direction separating representations between hidden layers of a network or across different networks. We then demonstrate these tools by visualizing and comparing the paths taken by a collection of trained neural networks with a variety of architectures, finding systematic relationships between a model's depth and width and the properties of its path.
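The path-based view in the abstract can be sketched in a few lines of NumPy: collect each layer's activations on a common set of inputs, then measure the "step size" between consecutive layers with a representational distance. This is a minimal, hypothetical sketch; the toy network, the choice of 1 − linear CKA as the distance, and all names here are illustrative assumptions, not the metric representational distances developed in the paper.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA similarity between two representation matrices
    (rows = stimuli, columns = units). Used here only as a simple
    stand-in for a representational distance; values lie in [0, 1]."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    hsic = np.linalg.norm(Y.T @ X, "fro") ** 2
    norm = np.linalg.norm(X.T @ X, "fro") * np.linalg.norm(Y.T @ Y, "fro")
    return hsic / norm

rng = np.random.default_rng(0)
inputs = rng.standard_normal((100, 16))  # 100 stimuli, 16 input features

# A toy 4-layer tanh network with random weights; record the
# representation (activation matrix) at every layer along the "path".
reps = [inputs]
x = inputs
for _ in range(4):
    W = rng.standard_normal((x.shape[1], 16)) / np.sqrt(x.shape[1])
    x = np.tanh(x @ W)
    reps.append(x)

# Step sizes along the path: distance between consecutive layers.
steps = [1.0 - linear_cka(reps[i], reps[i + 1]) for i in range(len(reps) - 1)]
print([round(s, 3) for s in steps])
```

Plotting these step sizes across layers, or embedding all pairwise layer-to-layer distances with multidimensional scaling, gives the kind of path visualization the abstract describes.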
Pages: 31