Compressing Fully Connected Layers using Kronecker Tensor Decomposition

Cited by: 0
Authors
Chen, Shaowu [1]
Sun, Weize [1]
Huang, Lei [1]
Yang, Xin [1]
Huang, Junhao [1]
Affiliations
[1] Shenzhen Univ, Guangdong Lab Artificial Intelligence & Cyber Eco, Shenzhen, Peoples R China
Keywords
deep neural network; fully connected layers; Kronecker tensor decomposition
DOI
10.1109/iccsnt47585.2019.8962432
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
In recent years, deep neural networks have achieved remarkable results in many fields such as data science and image processing. However, deep neural network models demand large amounts of memory and have high computational complexity; in particular, the huge number of parameters in fully connected layers hinders the deployment of deep neural networks on mobile devices. The weight matrices of fully connected layers have been shown to contain redundancy and low-rank structure, so tensor decomposition methods can be applied to reduce the number of parameters. In this paper, a novel decomposition method, Kronecker Tensor Decomposition with single and multiple patterns, is used to decompose the weight matrices of fully connected layers, representing the original tensor by a series of sub-tensors with far fewer parameters. Experimental results show that the proposed scheme achieves a high compression ratio without significant accuracy loss.
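The core idea can be illustrated with a short sketch. The NumPy snippet below is a generic reconstruction, not the authors' single-/multiple-pattern algorithm: it approximates a fully connected layer's weight matrix by a sum of Kronecker products using the classical Van Loan-Pitsianis rearrangement followed by a truncated SVD. All shapes, the rank, and the function names are illustrative assumptions.

import numpy as np

def kron_decompose(W, m1, n1, m2, n2, rank=1):
    """Approximate W (m1*m2 x n1*n2) by a sum of `rank` Kronecker
    products A_k (m1 x n1) kron B_k (m2 x n2)."""
    # Van Loan-Pitsianis rearrangement: each (m2 x n2) block of W
    # becomes one row, so a single Kronecker product maps to a
    # rank-1 matrix and a truncated SVD gives the optimal factors.
    R = (W.reshape(m1, m2, n1, n2)
          .transpose(0, 2, 1, 3)
          .reshape(m1 * n1, m2 * n2))
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    return [((np.sqrt(s[k]) * U[:, k]).reshape(m1, n1),
             (np.sqrt(s[k]) * Vt[k]).reshape(m2, n2))
            for k in range(rank)]

def kron_reconstruct(factors):
    # Rebuild the (approximate) weight matrix from its factors.
    return sum(np.kron(A, B) for A, B in factors)

# Example: a 1024 x 1024 fully connected layer, compressed with rank 4.
W = np.random.randn(1024, 1024)
factors = kron_decompose(W, 32, 32, 32, 32, rank=4)
W_hat = kron_reconstruct(factors)
print(np.linalg.norm(W - W_hat) / np.linalg.norm(W))  # relative error

With these assumed shapes, storage drops from 1024 x 1024 = 1,048,576 weights to 4 x (32 x 32 + 32 x 32) = 8,192, a 128x compression. In practice the factors would be fine-tuned after decomposition, and the paper's multiple-pattern variant presumably mixes several block shapes rather than the single fixed split used here.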
Pages: 308-312
Page count: 5
Related Papers
50 records in total
  • [1] Compressing fully connected layers of deep neural networks using permuted features
    Nagaraju, Dara
    Chandrachoodan, Nitin
    IET COMPUTERS AND DIGITAL TECHNIQUES, 2023, 17 (3-4): 149-161
  • [2] FULLY-CONNECTED TENSOR NETWORK DECOMPOSITION FOR ROBUST TENSOR COMPLETION PROBLEM
    Liu, Yun-Yang
    Zhao, Xi-Le
    Song, Guang-Jing
    Zheng, Yu-Bang
    Ng, Michael K.
    Huang, Ting-Zhu
    INVERSE PROBLEMS AND IMAGING, 2023: 208-238
  • [3] Kronecker CP Decomposition With Fast Multiplication for Compressing RNNs
    Wang, Dingheng
    Wu, Bijiao
    Zhao, Guangshe
    Yao, Man
    Chen, Hengnu
    Deng, Lei
    Yan, Tianyi
    Li, Guoqi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (05): 2205-2219
  • [4] Compressing Deep Models using Multi Tensor Train Decomposition
    Yang, Xin
    Sun, Weize
    Huang, Lei
    Chen, Shaowu
    ICCAIS 2019: THE 8TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND INFORMATION SCIENCES, 2019
  • [5] Tensor Completion via Fully-Connected Tensor Network Decomposition with Regularized Factors
    Zheng, Yu-Bang
    Huang, Ting-Zhu
    Zhao, Xi-Le
    Zhao, Qibin
    JOURNAL OF SCIENTIFIC COMPUTING, 2022, 92 (01)
  • [6] Fully-connected tensor network decomposition with gradient factors regularization for robust tensor completion
    Xiao, Bin
    Li, Heng-Chao
    Wang, Rui
    Zheng, Yu-Bang
    SIGNAL PROCESSING, 2025, 233
  • [7] Subquadratic Kronecker Regression with Applications to Tensor Decomposition
    Fahrbach, Matthew
    Fu, Gang
    Ghadiri, Mehrdad
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [8] Tensor Decomposition for Compressing Recurrent Neural Network
    Tjandra, Andros
    Sakti, Sakriani
    Nakamura, Satoshi
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018
  • [9] Hyperspectral Image Completion Using Fully-Connected Extended Tensor Network Decomposition and Total Variation
    Li, Yao
    Zhang, Yujie
    Li, Hongwei
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2025, 18: 7543-7558