A Stochastic Computational Multi-Layer Perceptron with Backward Propagation

Cited by: 66
Authors
Liu, Yidong [1 ]
Liu, Siting [1 ]
Wang, Yanzhi [2 ]
Lombardi, Fabrizio [3 ]
Han, Jie [1 ]
Affiliations
[1] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 1H9, Canada
[2] Syracuse Univ, Elect Engn & Comp Sci Dept, Syracuse, NY 13244 USA
[3] Northeastern Univ, Dept Elect & Comp Engn, Boston, MA 02115 USA
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Stochastic computation; binary search; neural network; probability estimator; multi-layer perceptron; HARDWARE IMPLEMENTATION; NEURAL-NETWORK;
DOI
10.1109/TC.2018.2817237
CLC Number
TP3 [Computing Technology; Computer Technology];
Discipline Code
0812;
Abstract
Stochastic computation has recently been proposed for implementing artificial neural networks with reduced hardware and power consumption, but at a decreased accuracy and processing speed. Most existing implementations rely on pre-training, so the weights of the neurons at each layer are predetermined and the network parameters cannot be updated. In this paper, a stochastic computational multi-layer perceptron (SC-MLP) is proposed that implements the backward propagation algorithm for updating the layer weights. Using extended stochastic logic (ESL), a reconfigurable stochastic computational activation unit (SCAU) is designed to implement different types of activation functions, such as tanh and the rectifier. A triple modular redundancy (TMR) technique is employed to reduce the random fluctuations in stochastic computation. A probability estimator (PE) and a divider based on TMR and a binary search algorithm are further proposed with progressive precision to reduce the required stochastic sequence length, significantly lowering the latency and energy consumption of the SC-MLP. The simulation results show that the proposed design is capable of implementing both the training and inference processes. For the classification of nonlinearly separable patterns, at a slight accuracy loss of 1.32-1.34 percent, the proposed design requires only 28.5-30.1 percent of the area and 18.9-23.9 percent of the energy consumption incurred by a design using floating-point arithmetic. Compared to a fixed-point implementation, the SC-MLP occupies a smaller area (40.7-45.5 percent) and consumes less energy (38.0-51.0 percent) at a similar processing speed, with a slight accuracy drop of 0.15-0.33 percent. The area and energy consumption of the proposed design are 80.7-87.1 percent and 71.9-93.1 percent, respectively, of those of a binarized neural network (BNN) with similar accuracy.
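As background for the stochastic-computing terminology in the abstract, the sketch below illustrates the standard bipolar stochastic number format and XNOR-based multiplication that designs of this kind build on. This is a generic software illustration of the encoding, not the paper's SCAU, TMR, or binary-search divider circuits; all function names are invented for this example.

```python
import random

def encode_bipolar(x, length, rng):
    # Bipolar stochastic format: a value x in [-1, 1] is represented by a
    # random bitstream in which each bit is 1 with probability (x + 1) / 2.
    p = (x + 1) / 2
    return [1 if rng.random() < p else 0 for _ in range(length)]

def decode_bipolar(stream):
    # Recover the encoded value from the fraction of 1s in the stream.
    return 2 * sum(stream) / len(stream) - 1

def sc_multiply(a, b):
    # Bitwise XNOR of two *independent* bipolar streams yields a stream
    # encoding the product of their values.
    return [1 - (x ^ y) for x, y in zip(a, b)]

rng = random.Random(42)
N = 4096  # longer streams trade latency for lower estimation variance
a = encode_bipolar(0.5, N, rng)
b = encode_bipolar(-0.6, N, rng)
prod = decode_bipolar(sc_multiply(a, b))  # ≈ 0.5 * (-0.6) = -0.3, within noise
```

The random fluctuation visible in `prod` is exactly what the paper's TMR technique and progressive-precision PE/divider aim to suppress while keeping the stochastic sequences short.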
Pages: 1273-1286
Page count: 14
Related Papers
50 records
  • [1] Investigation of Multi-Layer Perceptron with Propagation of Glial Pulse to Two Directions
    Ikuta, Chihiro
    Uwate, Yoko
    Nishio, Yoshifumi
    2012 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 2012), 2012, : 2099 - 2102
  • [2] Graph Attention Multi-Layer Perceptron
    Zhang, Wentao
    Yin, Ziqi
    Sheng, Zeang
    Li, Yang
    Ouyang, Wen
    Li, Xiaosen
    Tao, Yangyu
    Yang, Zhi
    Cui, Bin
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4560 - 4570
  • [3] Symbolic representation of a multi-layer perceptron
    Mouria-Beji, F
    ARTIFICIAL NEURAL NETS AND GENETIC ALGORITHMS, 2001, : 205 - 208
  • [4] Local design for multi-layer perceptron
    Xu, Li
    Zidonghua Xuebao/Acta Automatica Sinica, 1997, 23 (03): 325 - 331
  • [5] Performance Comparison of Multi-layer Perceptron (Back Propagation, Delta Rule and Perceptron) algorithms in Neural Networks
    Alsmadi, Mutasem Khalil
    Bin Omar, Khairuddin
    Noah, Shahrul Azman
    Almarashdah, Ibrahim
    2009 IEEE INTERNATIONAL ADVANCE COMPUTING CONFERENCE, VOLS 1-3, 2009, : 296 - +
  • [6] Back Propagation Algoritham: The Best Algorithm Among the Multi-layer Perceptron Algorithm
    Alsmadi, Mutasem Khalil Sari
    Bin Omar, Khairuddin
    Noah, Shahrul Azman
    INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2009, 9 (04): : 378 - 383
  • [7] COMPUTATIONAL AND STATISTICAL THRESHOLDS IN MULTI-LAYER STOCHASTIC BLOCK MODELS
    Lei, Jing
    Zhang, Anru R.
    Zhu, Zihan
    ANNALS OF STATISTICS, 2024, 52 (05): : 2431 - 2455
  • [8] Tighter Guarantees for the Compressive Multi-layer Perceptron
    Kaban, Ata
    Thummanusarn, Yamonporn
    THEORY AND PRACTICE OF NATURAL COMPUTING (TPNC 2018), 2018, 11324 : 388 - 400
  • [9] Multi-Layer Perceptron with Pulse Glial Chain
    Ikuta, Chihiro
    Uwate, Yoko
    Nishio, Yoshifumi
    Yang, Guoan
    IEICE TRANSACTIONS ON FUNDAMENTALS OF ELECTRONICS COMMUNICATIONS AND COMPUTER SCIENCES, 2016, E99A (03): : 742 - 755
  • [10] Multi-Layer Perceptron for Sleep Stage Classification
    Yulita, Intan Nurma
    Rosadi, Rudi
    Purwani, Sri
    Suryani, Mira
    2ND INTERNATIONAL CONFERENCE ON STATISTICS, MATHEMATICS, TEACHING, AND RESEARCH 2017, 2018, 1028