Structured Matrices and Their Application in Neural Networks: A Survey

Cited by: 0
Authors
Matthias Kissel
Klaus Diepold
Affiliation
Technical University of Munich, TUM School of Computation, Information and Technology
Source
New Generation Computing, 2023, Vol. 41
Keywords
Matrix structures; Neural network; Efficient propagation; Fast inference
DOI
Not available
Abstract
Modern neural network architectures are becoming larger and deeper, and the computational resources needed for training and inference are growing accordingly. One approach to handling this increased resource consumption is to use structured weight matrices. By exploiting structure in the weight matrices, the computational complexity of propagating information through the network can be reduced. However, choosing the right structure is not trivial, especially since there are many different matrix structures and structure classes. In this paper, we give an overview of the four main matrix structure classes, namely semiseparable matrices, matrices of low displacement rank, hierarchical matrices, and products of sparse matrices. We recapitulate the definitions of each structure class, present special structure subclasses, and provide references to research papers in which the structures are used in the domain of neural networks. We present two benchmarks comparing the classes. First, we benchmark the error of approximating different test matrices. Second, we compare the prediction performance of neural networks in which the weight matrix of the last layer is replaced by structured matrices. After presenting the benchmark results, we discuss open research questions related to the use of structured matrices in neural networks and highlight future research directions.
Pages: 697-722 (25 pages)
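
The abstract describes how exploiting structure in a weight matrix reduces the cost of propagating an input through a layer. As a minimal illustrative sketch of that idea (not code from the paper), the NumPy example below replaces a dense n x n weight matrix with a product of two thin factors obtained from a truncated SVD, cutting the matrix-vector cost of a forward pass from O(n^2) to O(nr). The synthetic near-low-rank W and the factor names A and B are assumptions for the demo; low-rank factorization is only the simplest relative of the four structure classes the survey treats.

import numpy as np

# Illustrative sketch only (not code from the paper): replace a dense
# weight matrix with a product of two thin factors, one simple instance
# of the structured-matrix idea discussed in the survey.
rng = np.random.default_rng(seed=0)
n, r = 1024, 32

# Assumption for this demo: a weight matrix that is approximately low rank.
W = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
W += 0.01 * rng.standard_normal((n, n))
x = rng.standard_normal(n)  # input activation vector

# Truncated SVD gives the best rank-r approximation in Frobenius norm.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]    # n x r factor (left singular vectors scaled by singular values)
B = Vt[:r, :]           # r x n factor

y_dense = W @ x         # dense forward pass: O(n^2) multiply-adds
y_struct = A @ (B @ x)  # structured forward pass: O(n*r) multiply-adds

rel_err = np.linalg.norm(y_dense - y_struct) / np.linalg.norm(y_dense)
print(f"relative error of structured forward pass: {rel_err:.4f}")

The same substitution pattern carries over to the richer classes surveyed in the paper: each admits a fast matrix-vector product, which is what makes structured layers cheaper at inference time.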