Sparse Ternary Connect: Convolutional Neural Networks Using Ternarized Weights with Enhanced Sparsity

Cited by: 0
Authors:
Jin, Canran [1 ]
Sun, Heming [1 ]
Kimura, Shinji [1 ]
Affiliations:
[1] Waseda Univ, Grad Sch Informat Prod & Syst, Wakamatsu Ku, 2-7 Hibikino, Kitakyushu, Fukuoka 8080135, Japan
Keywords: (none listed)
DOI: not available
CLC Classification: TP3 [Computing technology; computer technology]
Discipline Code: 0812
Abstract
Convolutional Neural Networks (CNNs) are indispensable for achieving state-of-the-art results across a wide range of tasks. In this work, we exploit ternary weights in both the inference and training of CNNs and propose Sparse Ternary Connect (STC), in which floating-point kernel weights are converted to 1, -1, and 0 according to a new conversion rule with a controlled ratio of zeros. STC saves substantial hardware resources with only a small degradation in precision. Experimental evaluation on two popular datasets (CIFAR-10 and SVHN) shows that the proposed method reduces resource utilization (by 28.9% of LUTs, 25.3% of FFs, 97.5% of DSPs, and 88.7% of BRAM on a Xilinx Kintex-7 FPGA) with less than 0.5% accuracy loss.
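The abstract does not give STC's exact conversion rule, but the core idea it describes (ternarize float weights to {-1, 0, 1} while controlling the fraction of zeros) can be sketched as follows. This is an illustrative, hypothetical implementation: the `zero_ratio` parameter and the magnitude-threshold rule are assumptions, not the paper's actual method.

```python
import numpy as np

def ternarize(weights, zero_ratio=0.5):
    """Map float weights to {-1, 0, 1}, zeroing roughly `zero_ratio`
    of the entries (the smallest-magnitude ones). A sketch of the
    controlled-sparsity idea, not the paper's exact conversion rule."""
    w = np.asarray(weights, dtype=float)
    mags = np.abs(w).ravel()
    k = int(round(zero_ratio * mags.size))  # number of weights to zero out
    # magnitude threshold: the k-th smallest absolute value
    thresh = np.partition(mags, k - 1)[k - 1] if k > 0 else -np.inf
    t = np.sign(w)            # +1 / -1 by sign
    t[np.abs(w) <= thresh] = 0.0  # force the smallest weights to 0
    return t

w = np.array([0.8, -0.05, 0.3, -0.6, 0.01, -0.2])
print(ternarize(w, zero_ratio=0.5))  # → [ 1.  0.  1. -1.  0. -1.]
```

Note that ties at the threshold magnitude can push the realized zero ratio slightly above the requested one; a production version would break ties explicitly.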
Pages: 190 - 195 (6 pages)
Related Papers (50 in total)
  • [1] Optimize Deep Convolutional Neural Network with Ternarized Weights and High Accuracy
    He, Zhezhi
    Gong, Boqing
    Fan, Deliang
    2019 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2019, : 913 - 921
  • [2] Structured Sparsity of Convolutional Neural Networks via Nonconvex Sparse Group Regularization
    Bui, Kevin
    Park, Fredrick
    Zhang, Shuai
    Qi, Yingyong
    Xin, Jack
    FRONTIERS IN APPLIED MATHEMATICS AND STATISTICS, 2021, 6
  • [3] Sparse Convolutional Neural Networks
    Lu, Haoyuan
    Wang, Min
    Foroosh, Hassan
    Tappen, Marshall
    Penksy, Marianna
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015, : 806 - 814
  • [4] Compressing Sparse Ternary Weight Convolutional Neural Networks for Efficient Hardware Acceleration
    Wi, Hyeonwook
    Kim, Hyeonuk
    Choi, Seungkyu
    Kim, Lee-Sup
    2019 IEEE/ACM INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN (ISLPED), 2019,
  • [5] Convolutional Neural Networks with Fixed Weights
    Folsom, Tyler C.
    VISAPP: PROCEEDINGS OF THE 16TH INTERNATIONAL JOINT CONFERENCE ON COMPUTER VISION, IMAGING AND COMPUTER GRAPHICS THEORY AND APPLICATIONS - VOL. 5: VISAPP, 2021, : 516 - 523
  • [6] Exploring the Granularity of Sparsity in Convolutional Neural Networks
    Mao, Huizi
    Han, Song
    Pool, Jeff
    Li, Wenshuo
    Liu, Xingyu
    Wang, Yu
    Dally, William J.
    2017 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW), 2017, : 1927 - 1934
  • [7] Supporting Compressed-Sparse Activations and Weights on SIMD-like Accelerator for Sparse Convolutional Neural Networks
    Lin, Chien-Yu
    Lai, Bo-Cheng
    2018 23RD ASIA AND SOUTH PACIFIC DESIGN AUTOMATION CONFERENCE (ASP-DAC), 2018, : 105 - 110
  • [8] Super Sparse Convolutional Neural Networks
    Lu, Yao
    Lu, Guangming
    Zhang, Bob
    Xu, Yuanrong
    Li, Jinxing
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 4440 - 4447
  • [9] Deep Embedded Vision Using Sparse Convolutional Neural Networks
    Pikoulis, Vassilis
    Mavrokefalidis, Christos
    Keramidas, Georgios
    Birbas, Michael
    Tsafas, Nikos
    Lalos, Aris S.
    ERCIM NEWS, 2020, (122): : 39 - 40
  • [10] Overfitting measurement of convolutional neural networks using trained network weights
    Watanabe, Satoru
    Yamana, Hayato
    INTERNATIONAL JOURNAL OF DATA SCIENCE AND ANALYTICS, 2022, 14 : 261 - 278