Universal and Succinct Source Coding of Deep Neural Networks

Cited by: 0
Authors
Basu S. [1 ,2 ]
Varshney L.R. [2 ]
Affiliations
[1] Indian Institute of Technology Kanpur, Department of Electrical Engineering, Kanpur
[2] University of Illinois at Urbana-Champaign, Coordinated Science Laboratory, Department of Electrical and Computer Engineering, Urbana, 61801, IL
Keywords
artificial neural networks; entropy coding; Neural network compression; source coding;
DOI
10.1109/JSAIT.2023.3261819
Abstract
Deep neural networks have shown incredible performance for inference tasks in a variety of domains, but require significant storage space, which limits scaling and use for on-device intelligence. This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets, and directly performing inference without full decompression. The basic insight that allows less rate than naïve approaches is recognizing that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner. We also provide experimental results of our approach on several standard datasets. © 2020 IEEE.
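The abstract's key insight is that relabeling the hidden nodes of a feedforward layer does not change the function the network computes, so an encoder can discard this labeling uncertainty before entropy coding. A minimal sketch of one such canonicalization, assuming NumPy and a simple lexicographic sorting rule (the function name and the sorting rule are illustrative assumptions, not the authors' actual algorithm):

```python
import numpy as np

def canonicalize_layer(W_in, W_out):
    """Put a hidden layer into a canonical form under node relabeling.

    W_in : (hidden, in_dim)  weights into the hidden layer
    W_out: (out_dim, hidden) weights out of the hidden layer

    Permuting hidden units permutes the rows of W_in and the columns of
    W_out together, which leaves the network's input-output map unchanged.
    Sorting the hidden units lexicographically by their incoming weight
    rows picks one fixed representative of that permutation class, so an
    entropy coder need not spend bits distinguishing equivalent labelings.
    """
    # np.lexsort uses its last key as the primary key, so reverse the
    # transposed rows to sort by column 0 of W_in first.
    order = np.lexsort(W_in.T[::-1])
    return W_in[order], W_out[:, order]
```

Under this rule, any two relabelings of the same layer map to the same canonical pair, which is the property the compressed representation exploits.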
Pages: 732-745
Page count: 13
Related Papers
50 records in total
  • [21] Deep neural networks watermark via universal deep hiding and metric learning
    Ye, Zhicheng
    Zhang, Xinpeng
    Feng, Guorui
    NEURAL COMPUTING & APPLICATIONS, 2024, 36 (13): : 7421 - 7438
  • [22] Universal backdoor attack on deep neural networks for malware detection
    Zhang, Yunchun
    Feng, Fan
    Liao, Zikun
    Li, Zixuan
    Yao, Shaowen
    APPLIED SOFT COMPUTING, 2023, 143
  • [23] Deep Neural Networks, Generic Universal Interpolation, and Controlled ODEs
    Cuchiero, Christa
    Larsson, Martin
    Teichmann, Josef
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (03): : 901 - 919
  • [24] GRAPH EXPANSIONS OF DEEP NEURAL NETWORKS AND THEIR UNIVERSAL SCALING LIMITS
    Cirone, Nicola Muça
    Hamdan, Jad
    Salvi, Cristopher
arXiv
  • [25] A Universal VAD Based on Jointly Trained Deep Neural Networks
    Wang, Qing
    Du, Jun
    Bao, Xiao
    Wang, Zi-Rui
    Dai, Li-Rong
    Lee, Chin-Hui
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 2282 - 2286
  • [26] A universal strategy for smoothing deceleration in deep graph neural networks
    Cheng, Qi
    Long, Lang
    Xu, Jiayu
    Zhang, Min
    Han, Shuangze
    Zhao, Chengkui
    Feng, Weixing
    NEURAL NETWORKS, 2025, 185
  • [27] Inconsistent illusory motion in predictive coding deep neural networks
    Kirubeswaran, O. R.
    Storrs, Katherine R.
    VISION RESEARCH, 2023, 206
  • [28] CCS Coding of Discharge Diagnoses via Deep Neural Networks
    Helwe, Chadi
    Elbassuoni, Shady
    Geha, Mirabelle
    Hitti, Eveline
    Obermeyer, Carla Makhlouf
    PROCEEDINGS OF THE 2017 INTERNATIONAL CONFERENCE ON DIGITAL HEALTH (DH'17), 2017, : 175 - 179
  • [29] Compression of Deep Neural Networks with Structured Sparse Ternary Coding
    Yoonho Boo
    Wonyong Sung
    Journal of Signal Processing Systems, 2019, 91 : 1009 - 1019
  • [30] Compression of Deep Neural Networks with Structured Sparse Ternary Coding
    Boo, Yoonho
    Sung, Wonyong
    JOURNAL OF SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2019, 91 (09): : 1009 - 1019