An adiabatic method to train binarized artificial neural networks

Cited by: 2
Authors
Zhao, Yuansheng [1 ,2 ,3 ]
Xiao, Jiang [1 ,2 ,4 ,5 ]
Affiliations
[1] Fudan Univ, Dept Phys, Shanghai 200433, Peoples R China
[2] Fudan Univ, State Key Lab Surface Phys, Shanghai 200433, Peoples R China
[3] Univ Tokyo, Dept Phys, Tokyo, Japan
[4] Fudan Univ, Inst Nanoelect Devices & Quantum Comp, Shanghai 200433, Peoples R China
[5] Shanghai Qi Zhi Inst, Shanghai 200232, Peoples R China
Keywords
DOI
10.1038/s41598-021-99191-2
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
An artificial neural network consists of neurons and synapses. A neuron produces an output from its inputs according to a non-linear activation function such as the Sigmoid, Hyperbolic Tangent (Tanh), or Rectified Linear Unit (ReLU). Synapses connect neuron outputs to the inputs of other neurons with tunable real-valued weights. The most resource-demanding operations in realizing such neural networks are the multiply-and-accumulate (MAC) operations that compute the dot product between real-valued neuron outputs and synapse weights. The efficiency of neural networks can be drastically enhanced if the neuron outputs and/or the weights can be trained to take binary values +/- 1 only, in which case the MAC can be replaced by simple XNOR operations. In this paper, we demonstrate an adiabatic training method that can binarize fully-connected and convolutional neural networks without modifying the network structure or size. This adiabatic training method requires only minimal changes to the training algorithm, and is tested on the following four tasks: handwritten-digit recognition with a standard fully-connected network, cat-dog recognition and audio recognition with convolutional neural networks, and 10-class image recognition (CIFAR-10) with ResNet-20 and VGG-Small networks. In all tasks, the performance of the binary neural networks trained by the adiabatic method is almost identical to that of networks trained with conventional ReLU or Sigmoid activations and real-valued activations and weights. The adiabatic method can easily be applied to binarize different types of networks, considerably increasing computational efficiency and greatly simplifying the deployment of neural networks.
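The abstract does not give implementation details, so the following Python sketch is only an illustration of the two ideas it mentions, not the authors' code: (1) a dot product between {+1, -1} vectors can be computed with XNOR and a popcount instead of a real-valued MAC, and (2) a smooth activation such as tanh(beta * z) can be sharpened toward sign(z) as training proceeds, which is one plausible reading of an "adiabatic" binarization schedule. The function names mac_dot, xnor_dot, and adiabatic_activation, as well as the values of beta, are hypothetical.

import numpy as np

def mac_dot(x, w):
    # Conventional multiply-and-accumulate (MAC): dot product of real-valued vectors.
    return float(np.dot(x, w))

def xnor_dot(x_bin, w_bin):
    # Dot product of two {+1, -1} vectors via XNOR and a popcount.
    # Encoding +1 -> 1 and -1 -> 0, matching positions contribute +1 and
    # mismatching ones -1, so dot = 2 * popcount(XNOR(x, w)) - n.
    xb = (np.asarray(x_bin) > 0).astype(np.uint8)
    wb = (np.asarray(w_bin) > 0).astype(np.uint8)
    agree = 1 - (xb ^ wb)                   # XNOR: 1 where the bits match
    return int(2 * agree.sum() - agree.size)

def adiabatic_activation(z, beta):
    # Hypothetical smooth-to-binary activation: tanh(beta * z) approaches
    # sign(z) as the sharpness beta is raised slowly over training.
    return np.tanh(beta * z)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, 16) * 2 - 1      # binary +/-1 activations
    w = rng.integers(0, 2, 16) * 2 - 1      # binary +/-1 weights
    assert mac_dot(x, w) == xnor_dot(x, w)  # XNOR form reproduces the MAC result

    z = np.linspace(-2.0, 2.0, 5)
    for beta in (1.0, 5.0, 50.0):           # an assumed sharpening schedule
        print(f"beta={beta:>5}:", np.round(adiabatic_activation(z, beta), 3))

Running the sketch, the XNOR-based dot product matches the MAC result exactly on +/- 1 vectors, and the tanh output visibly saturates toward +/- 1 as beta grows, which is the behavior the abstract's binarization relies on.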
Pages: 8
Related Papers
50 records in total
  • [1] An adiabatic method to train binarized artificial neural networks
    Yuansheng Zhao
    Jiang Xiao
    [J]. Scientific Reports, 11
  • [2] A Quotient Gradient Method to Train Artificial Neural Networks
    Khodabandehlou, Hamid
    Fadali, Mohammad Sami
    [J]. 2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 2576 - 2581
  • [3] Binarized Neural Networks
    Hubara, Itay
    Courbariaux, Matthieu
    Soudry, Daniel
    El-Yaniv, Ran
    Bengio, Yoshua
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [4] Memristor Binarized Neural Networks
    Khoa Van Pham
    Tien Van Nguyen
    Son Bao Tran
    Nam, Hyunkyung
    Lee, Mi Jung
    Choi, Byung Joon
    Son Ngoc Truong
    Min, Kyeong-Sik
    [J]. JOURNAL OF SEMICONDUCTOR TECHNOLOGY AND SCIENCE, 2018, 18 (05) : 568 - 577
  • [5] Fast Simulation Method for Analog Deep Binarized Neural Networks
    Lee, Chaeun
    Kim, Jaehyun
    Kim, Jihun
    Hwang, Cheol Seong
    Choi, Kiyoung
    [J]. 2019 INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC), 2019, : 293 - 294
  • [6] A Review of Binarized Neural Networks
    Simons, Taylor
    Lee, Dah-Jye
    [J]. ELECTRONICS, 2019, 8 (06)
  • [7] The use of artificial neural networks in adiabatic curves modeling
    Trtnik, Gregor
    Kavcic, Franci
    Turk, Goran
    [J]. AUTOMATION IN CONSTRUCTION, 2008, 18 (01) : 10 - 15
  • [8] Synaptic metaplasticity in binarized neural networks
    Axel Laborieux
    Maxence Ernoult
    Tifenn Hirtzlin
    Damien Querlioz
    [J]. Nature Communications, 12
  • [9] Synaptic metaplasticity in binarized neural networks
    Laborieux, Axel
    Ernoult, Maxence
    Hirtzlin, Tifenn
    Querlioz, Damien
    [J]. NATURE COMMUNICATIONS, 2021, 12 (01)
  • [10] Neural Spike Sorting Using Binarized Neural Networks
    Valencia, Daniel
    Alimohammad, Amir
    [J]. IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2021, 29 : 206 - 214