Compressing neural networks with two-layer decoupling

Cited: 0
Authors
De Jonghe, Joppe [1 ]
Usevich, Konstantin [2 ]
Dreesen, Philippe [3 ]
Ishteva, Mariya [1 ]
Affiliations
[1] Katholieke Univ Leuven, Dept Comp Sci, Geel, Belgium
[2] Univ Lorraine, CNRS, Nancy, France
[3] Maastricht Univ, DACS, Maastricht, Netherlands
Keywords
tensor; tensor decomposition; decoupling; compression; neural network; model compression; acceleration
DOI
10.1109/CAMSAP58249.2023.10403509
CLC classification
TP39 [Computer applications];
Discipline codes
081203; 0835
Abstract
The single-layer decoupling problem has recently been used for the compression of neural networks. However, methods based on the single-layer decoupling problem can only compress a neural network into a single flexible layer, so compressing more complex networks leads to poor approximations of the original network. The ability to compress into more than one flexible layer therefore allows a better approximation of the underlying network than compression into a single flexible layer. Performing compression into more than one flexible layer corresponds to solving a multi-layer decoupling problem. As a first step towards general multi-layer decoupling, this work introduces a method for solving the two-layer decoupling problem in the approximate case, enabling the compression of neural networks into two flexible layers.
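To make the structure concrete, the following is a minimal sketch (not the authors' decomposition algorithm) of what a "flexible layer" and the resulting two-layer decoupled model look like: each flexible layer mixes the input with a matrix V, applies independent univariate nonlinearities along each branch, and recombines with a matrix W. All names, dimensions, and the choice of nonlinearities here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the decoupled-model *structure* only; the paper's
# contribution is the tensor-based method for finding such a decomposition,
# which is not reproduced here.

rng = np.random.default_rng(0)

def flexible_layer(x, V, W, funcs):
    """A flexible layer: y = W @ [g_1(v_1^T x), ..., g_r(v_r^T x)]^T,
    where each g_i is an arbitrary univariate function."""
    z = V @ x                                          # mix into r branches
    g = np.array([f(zi) for f, zi in zip(funcs, z)])   # univariate maps
    return W @ g                                       # recombine

# Two flexible layers in sequence = the two-layer decoupling structure.
# Dimensions chosen arbitrarily for the example:
n, r1, m1, r2, m2 = 4, 3, 5, 3, 2
V1, W1 = rng.standard_normal((r1, n)), rng.standard_normal((m1, r1))
V2, W2 = rng.standard_normal((r2, m1)), rng.standard_normal((m2, r2))
funcs1 = [np.tanh, np.sin, lambda t: t**3]   # example branch nonlinearities
funcs2 = [np.tanh] * r2

x = rng.standard_normal(n)
h = flexible_layer(x, V1, W1, funcs1)   # first flexible layer
y = flexible_layer(h, V2, W2, funcs2)   # second flexible layer
print(y.shape)
```

A single-layer method would have to fold the whole network into one such `flexible_layer`; allowing two gives the composition above, which can match a deeper original network more closely.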
Pages: 226-230 (5 pages)
Related papers (showing 10 of 50)
  • [1] Plasticity of two-layer fast neural networks
    Alexeev, AA
    Dorogov, AY
    JOURNAL OF COMPUTER AND SYSTEMS SCIENCES INTERNATIONAL, 1999, 38 (05) : 786 - 791
  • [2] On the Structure of Two-Layer Cellular Neural Networks
    Ban, Jung-Chao
    Chang, Chih-Hung
    Lin, Song-Sun
    DIFFERENTIAL AND DIFFERENCE EQUATIONS WITH APPLICATIONS, 2013, 47 : 265 - 273
  • [3] Templates and algorithms for two-layer cellular neural Networks
    Yang, ZH
    Nishio, Y
    Ushida, A
    PROCEEDINGS OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 1946 - 1951
  • [4] Two-Layer Feedback Neural Networks with Associative Memories
    Wu Gui-Kun
    Zhao Hong
    CHINESE PHYSICS LETTERS, 2008, 25 (11) : 3871 - 3874
  • [5] Two-layer stabilization of continuous neural networks with feedbacks
    Dudnikov, EE
    CYBERNETICS AND SYSTEMS, 2002, 33 (04) : 325 - 340
  • [6] Structural synthesis of fast two-layer neural networks
    A. Yu. Dorogov
    Cybernetics and Systems Analysis, 2000, 36 : 512 - 519
  • [7] Benign Overfitting in Two-layer Convolutional Neural Networks
    Cao, Yuan
    Chen, Zixiang
    Belkin, Mikhail
    Gu, Quanquan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [8] Sharp asymptotics on the compression of two-layer neural networks
    Amani, Mohammad Hossein
    Bombari, Simone
    Mondelli, Marco
    Pukdee, Rattana
    Rini, Stefano
    2022 IEEE INFORMATION THEORY WORKSHOP (ITW), 2022, : 588 - 593
  • [9] Structural synthesis of two-layer rapid neural networks
    Dorogov, A.Yu.
    Kibernetika i Sistemnyj Analiz, 2000, (04): : 47 - 57
  • [10] On the symmetries in the dynamics of wide two-layer neural networks
    Hajjar, Karl
    Chizat, Lenaic
    ELECTRONIC RESEARCH ARCHIVE, 2023, 31 (04): : 2175 - 2212