Universal structural patterns in sparse recurrent neural networks

Times Cited: 0
Authors
Zhang, Xin-Jie [1 ,2 ,3 ]
Moore, Jack Murdoch [1 ,2 ,3 ]
Yan, Gang [1 ,2 ,3 ,4 ]
Li, Xiang [3 ,5 ]
Affiliations
[1] Tongji Univ, MOE Key Lab Adv Microstruct Mat, Shanghai, Peoples R China
[2] Tongji Univ, Sch Phys Sci & Engn, Shanghai, Peoples R China
[3] Tongji Univ, MOE Frontiers Sci Ctr Intelligent Autonomous Syst, Natl Key Lab Autonomous Intelligent Unmanned Syst, Shanghai, Peoples R China
[4] Chinese Acad Sci, CAS Ctr Excellence Brain Sci & Intelligence Techno, Shanghai, Peoples R China
[5] Tongji Univ, Coll Elect & Informat Engn, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
MOTIFS;
DOI
10.1038/s42005-023-01364-0
Chinese Library Classification (CLC)
O4 [Physics];
Discipline Code
0702;
Abstract
Sparse neural networks can achieve performance comparable to fully connected networks while requiring less energy and memory, showing great promise for deploying artificial intelligence on resource-limited devices. While significant progress has been made in recent years in developing approaches to sparsify neural networks, artificial neural networks are notorious as black boxes, and it remains an open question whether well-performing neural networks share common structural features. Here, we analyze the evolution of recurrent neural networks (RNNs) trained by different sparsification strategies and for different tasks, and explore the topological regularities of these sparsified networks. We find that the optimized sparse topologies share a universal pattern of signed motifs, that RNNs evolve towards structurally balanced configurations during sparsification, and that structural balance can improve the performance of sparse RNNs in a variety of tasks. Such structural balance patterns also emerge in other state-of-the-art models, including neural ordinary differential equation networks and continuous-time RNNs. Taken together, our findings not only reveal universal structural features accompanying optimized network sparsification but also offer an avenue for optimal architecture search.

Deep neural networks have shown remarkable success across the physical sciences and engineering, so finding networks that work efficiently with fewer connections (weight parameters) without sacrificing performance is of great interest. In this work the authors show that a large number of such efficient recurrent neural networks display certain connectivity patterns in their structure.
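The two central notions in the abstract, sparsification and structural balance, can be made concrete with a short sketch. The Python snippet below is a minimal illustration, not the authors' published code: magnitude_prune and balanced_triangle_fraction are hypothetical helper names, and the undirected triangle criterion used here is a classical simplification of the paper's signed-motif analysis. It prunes a dense recurrent weight matrix by magnitude and then measures what fraction of the surviving signed triangles are balanced (edge-sign product positive).

```python
# Illustrative sketch only, not the paper's method or code.
import numpy as np
from itertools import combinations

def magnitude_prune(W, sparsity):
    """Zero out the smallest-magnitude entries of W,
    keeping the top (1 - sparsity) fraction of weights."""
    k = int(round((1.0 - sparsity) * W.size))
    if k == 0:
        return np.zeros_like(W)
    thresh = np.sort(np.abs(W), axis=None)[-k]  # k-th largest magnitude
    return np.where(np.abs(W) >= thresh, W, 0.0)

def balanced_triangle_fraction(W):
    """Fraction of signed triangles whose edge-sign product is positive,
    the classical structural-balance criterion (undirected simplification)."""
    S = np.sign(W + W.T)        # symmetrized sign pattern of the topology
    np.fill_diagonal(S, 0)      # ignore self-loops
    n = S.shape[0]
    balanced = total = 0
    for i, j, k in combinations(range(n), 3):
        p = S[i, j] * S[j, k] * S[i, k]
        if p != 0:              # triangle exists only if all three edges do
            total += 1
            balanced += p > 0
    return balanced / total if total else float("nan")

rng = np.random.default_rng(0)
W = rng.normal(size=(50, 50))                 # stand-in recurrent weights
W_sparse = magnitude_prune(W, sparsity=0.9)   # keep ~10% of weights
print("kept weights:", np.count_nonzero(W_sparse))
print("balanced-triangle fraction:", balanced_triangle_fraction(W_sparse))
```

Under the paper's hypothesis one would expect this balance fraction to rise, relative to random pruning, when sparsification is driven by training; the combinatorial triangle scan above is only practical for small networks and would be replaced by matrix-trace counting at scale.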
Pages: 10
Related Papers
50 records in total
  • [31] Sparse signal reconstruction via recurrent neural networks with hyperbolic tangent function
    Wen, Hongsong
    He, Xing
    Huang, Tingwen
    NEURAL NETWORKS, 2022, 153 : 1 - 12
  • [32] Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks
    Zang, Ke
    Wu, Wenqi
    Luo, Wei
    SENSORS, 2021, 21 (19)
  • [33] A family of universal recurrent networks
    Koiran, P
    THEORETICAL COMPUTER SCIENCE, 1996, 168 (02) : 473 - 480
  • [34] Universal Reliability Bounds for Sparse Networks
    Romero, Pablo
    IEEE TRANSACTIONS ON RELIABILITY, 2022, 71 (01) : 359 - 369
  • [35] GraLSP: Graph Neural Networks with Local Structural Patterns
    Jin, Yilun
    Song, Guojie
    Shi, Chuan
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4361 - 4368
  • [36] Diagonal Recurrent Neural Networks for MDOF Structural Vibration Control
    Gu, Z. Q.
    Oyadiji, S. O.
    JOURNAL OF VIBRATION AND ACOUSTICS-TRANSACTIONS OF THE ASME, 2008, 130 (06):
  • [37] Implementation of Universal Computation via Small Recurrent Finite Precision Neural Networks
    Hobbs, J. Nicholas
    Siegelmann, Hava
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,
  • [38] Universal Analysis Method for Stability of Recurrent Neural Networks with Different Multiple Delays
    Wang, Zhanshan
    Zhang, Enlin
    Yun, Kuo
    Zhang, Huaguang
    ADVANCES IN NEURAL NETWORKS - ISNN 2011, PT I, 2011, 6675 : 148 - 157
  • [39] Functional and Structural Features of Recurrent Neural Networks with Controlled Elements
    Osipov, Vasiliy
    Nikiforov, Viktor
    ADVANCES IN NEURAL NETWORKS - ISNN 2019, PT I, 2019, 11554 : 133 - 140
  • [40] RETURNN: THE RWTH EXTENSIBLE TRAINING FRAMEWORK FOR UNIVERSAL RECURRENT NEURAL NETWORKS
    Doetsch, Patrick
    Zeyer, Albert
    Voigtlaender, Paul
    Kulikov, Ilia
    Schlueter, Ralf
    Ney, Hermann
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 5345 - 5349