Hardware Acceleration of CNN with One-Hot Quantization of Weights and Activations

Cited: 0
Authors
Li, Gang [1 ,2 ]
Wang, Peisong [1 ]
Liu, Zejian [1 ,2 ]
Leng, Cong [1 ]
Cheng, Jian [1 ,2 ,3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Chinese Acad Sci, Ctr Excellence Brain Sci & Intelligence Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
In this paper, we propose a novel one-hot representation for weights and activations in CNN models and demonstrate its benefits for hardware accelerator design. Specifically, rather than merely reducing the bitwidth, we quantize both weights and activations into n-bit integers that contain only one non-zero bit per value. In this way, the massive multiply-accumulate operations (MACs) become equivalent to additions of powers of two, which can be calculated efficiently with histogram-based computations. Experiments on the ImageNet classification task show that our proposed One-Hot Networks (OHN) achieve accuracy comparable to conventional fixed-point networks. As case studies, we evaluate the efficacy of the one-hot data representation on two state-of-the-art CNN accelerators on FPGA. Our preliminary results show that resource savings of 50% and 68.5% can be achieved on DaDianNao and Laconic, respectively. Moreover, the one-hot-optimized Laconic achieves an average speedup of 4.94x on AlexNet.
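To make the quantization and the histogram-based accumulation described in the abstract concrete, the following Python sketch (not taken from the paper) illustrates the idea: every weight and activation is rounded to sign * 2^e with the exponent e drawn from a small allowed set, so each quantized value carries a single non-zero bit, and a dot product then reduces to signed counts of the combined exponents e_w + e_a followed by one weighted sum over powers of two. The exponent set, the function names (one_hot_quantize, histogram_dot), and the dense-histogram loop are illustrative assumptions, not the authors' implementation or the accelerator datapath.

    import numpy as np

    def one_hot_quantize(x, exponents=(-3, -2, -1, 0)):
        # Round each element of x to sign(x) * 2^e, picking the allowed exponent e
        # whose power of two is closest to |x|. Returns (sign, exponent) arrays.
        exps = np.array(exponents)
        sign = np.sign(x)
        idx = np.argmin(np.abs(np.abs(x)[..., None] - 2.0 ** exps), axis=-1)
        return sign, exps[idx]

    def histogram_dot(w, a, exponents=(-3, -2, -1, 0)):
        # Dot product of one-hot-quantized w and a: each product is
        # (sw * 2^ew) * (sa * 2^ea) = sw*sa * 2^(ew+ea), so only a signed
        # count (histogram) of each combined exponent is needed, followed
        # by a single weighted sum over powers of two.
        sw, ew = one_hot_quantize(np.asarray(w, float), exponents)
        sa, ea = one_hot_quantize(np.asarray(a, float), exponents)
        signs = (sw * sa).astype(int)
        combined = ew + ea
        lo, hi = int(combined.min()), int(combined.max())
        hist = np.zeros(hi - lo + 1)
        for s, e in zip(signs.ravel(), combined.ravel()):
            hist[e - lo] += s          # zero-valued elements contribute nothing
        return float(np.sum(hist * 2.0 ** np.arange(lo, hi + 1)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        w, a = rng.uniform(-1, 1, 16), rng.uniform(-1, 1, 16)
        sw, ew = one_hot_quantize(w)
        sa, ea = one_hot_quantize(a)
        reference = float(np.dot(sw * 2.0 ** ew, sa * 2.0 ** ea))
        print(histogram_dot(w, a), reference)   # the two numbers agree

Presumably, this restructuring is what underlies the reported resource savings: per-element multipliers can be replaced by bucket counters plus one final shift-and-add reduction.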
Pages: 971-974
Number of pages: 4
Related Papers
50 records
  • [1] A Novel Hardware Trojan Design Based on One-hot Code
    Wang, Di
    Wu, Liji
    Zhang, Xiangmin
    Wu, XingJun
    2018 6TH INTERNATIONAL SYMPOSIUM ON DIGITAL FORENSIC AND SECURITY (ISDFS), 2018, : 1 - 5
  • [2] Redistribution of Weights and Activations for AdderNet Quantization
    Nie, Ying
    Han, Kai
    Diao, Haikang
    Liu, Chuanjian
    Wu, Enhua
    Wang, Yunhe
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [3] An Adaptive Quantization Method for CNN Activations
    Wang, Yun
    Liu, Qiang
    2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023,
  • [4] One-Hot Based CNN Malicious Code Detection Technique
    Fu, Yixian
    Lu, Tianliang
    Ma, Zeliang
    Computer Applications and Software, 2020, 37 (01) : 304 - 308+333
  • [5] One-Hot Graph Encoder Embedding
    Shen, Cencheng
    Wang, Qizhe
    Priebe, Carey E.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (06) : 7933 - 7938
  • [6] Scan synthesis for one-hot signals
    Mitra, S
    Avra, LJ
    McCluskey, EJ
    ITC - INTERNATIONAL TEST CONFERENCE 1997, PROCEEDINGS: INTEGRATING MILITARY AND COMMERCIAL COMMUNICATIONS FOR THE NEXT CENTURY, 1997, : 714 - 722
  • [7] Device Interoperability for Learned Image Compression with Weights and Activations Quantization
    Koyuncu, Esin
    Solovyev, Timofey
    Alshina, Elena
    Kaup, Andre
    2022 PICTURE CODING SYMPOSIUM (PCS), 2022, : 151 - 155
  • [8] ACCELERATE FPGA MACROS WITH ONE-HOT APPROACH
    KNAPP, SK
    ELECTRONIC DESIGN, 1990, 38 (17) : 71 - &
  • [9] One-Hot Residue Logarithmic Number Systems
    Arnold, Mark G.
    Kouretas, Ioannis
    Paliouras, Vassilis
    Morgan, Austin
    2019 IEEE 29TH INTERNATIONAL SYMPOSIUM ON POWER AND TIMING MODELING, OPTIMIZATION AND SIMULATION (PATMOS 2019), 2019, : 97 - 102
  • [10] Efficient Quantization for Neural Networks with Binary Weights and Low Bitwidth Activations
    Huang, Kun
    Ni, Bingbing
    Yang, Xiaokang
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 3854 - 3861