SFAO: Sign-Flipping-Aware Optimization for Early-Stopping of Binarized Neural Networks

Cited by: 0
Authors
Kang, Ju Yeon [1]
Ryu, Chang Ho [2]
Kang, Suk Bong [1]
Han, Tae Hee [2,3]
Affiliations
[1] Sungkyunkwan Univ, Dept Elect & Comp Engn, Suwon 16419, South Korea
[2] Sungkyunkwan Univ, Dept Artificial Intelligence, Suwon 16419, South Korea
[3] Sungkyunkwan Univ, Dept Semicond Syst Engn, Suwon 16419, South Korea
Keywords
Training; Computational efficiency; Neural networks; Computational modeling; Quantization (signal); Optimization; Backpropagation; Artificial intelligence; Machine learning; model compression; optimizer; efficient machine learning; binarized neural networks; layer freezing
DOI
10.1109/ACCESS.2023.3332472
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
One of the vital challenges for binarized neural networks (BNNs) is improving their inference performance by expanding their data representation capabilities so that subtle patterns and nuances in the data can be captured. At the same time, mitigating the increase in computational cost during the training phase is critical for ensuring sustainable development and scalable deployment. In this study, a sign-flipping-aware optimizer (SFAO) tailored to BNNs is introduced to diminish this computational burden. SFAO balances model performance and computational cost through sign-flipping-aware updating rules applied throughout BNN training. With updating rules specific to binary weights, SFAO considerably reduces the computing resources required for training on the CIFAR-10 dataset, surpassing the conventional full-precision updating rule with a 21.89% reduction in the total instruction count while incurring only a marginal 0.44% decline in image classification accuracy. Furthermore, early stopping based on the sign flip rate reduces the average computation time per network by 9.37% on the ImageNet dataset.
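The sign-flip tracking and flip-rate-based early stopping described in the abstract can be pictured with a short sketch. The snippet below is purely illustrative and is not the authors' SFAO algorithm: it assumes a latent-weight BNN whose binary weights are the signs of real-valued latent weights updated by plain SGD, and the names sign_flip_rate, flip_rate_threshold, and patience are hypothetical placeholders.

```python
# Illustrative sketch only (not the published SFAO rule): assumes a latent-weight
# BNN where the binary weights are sign(latent_w) and the latent weights are
# updated by plain SGD. All thresholds and function names are hypothetical.
import torch


def sign_flip_rate(old_signs: torch.Tensor, new_signs: torch.Tensor) -> float:
    """Fraction of binary weights whose sign changed during one update."""
    return (old_signs != new_signs).float().mean().item()


def sgd_step_with_flip_tracking(latent_w: torch.Tensor,
                                grad: torch.Tensor,
                                lr: float = 0.01) -> float:
    """Apply one SGD step to the latent weights and report the sign flip rate."""
    old_signs = torch.sign(latent_w)
    latent_w.sub_(lr * grad)          # in-place latent-weight update
    new_signs = torch.sign(latent_w)
    return sign_flip_rate(old_signs, new_signs)


def should_stop_early(flip_rate_history: list,
                      flip_rate_threshold: float = 1e-4,
                      patience: int = 3) -> bool:
    """Stop once the flip rate stays below the threshold for `patience` epochs."""
    recent = flip_rate_history[-patience:]
    return len(recent) == patience and all(r < flip_rate_threshold for r in recent)


if __name__ == "__main__":
    torch.manual_seed(0)
    latent_w = torch.randn(1000)                  # latent (real-valued) weights
    history = []
    for epoch in range(20):
        grad = torch.randn_like(latent_w) * 0.1   # stand-in for a real gradient
        history.append(sgd_step_with_flip_tracking(latent_w, grad))
        if should_stop_early(history):
            print(f"early stop at epoch {epoch}: sign flip rate has settled")
            break
```

Monitoring flips on the binarized weights rather than on the latent values keeps the criterion cheap in this sketch: it only requires one sign comparison per parameter per update.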
Pages: 128306-128315
Number of pages: 10