Stochastic Implementation of the Activation Function for Artificial Neural Networks

Cited: 0
Authors
Yeo, Injune [1 ]
Gi, Sang-gyun [1 ]
Lee, Byung-geun [1 ]
Chu, Myonglae [2 ]
Affiliations
[1] Gwangju Inst Sci & Technol, Sch Elect Engn & Comp Sci, Gwangju, South Korea
[2] IMEC, Interuniv Microelect Ctr, Imager SoC Team, Leuven, Belgium
Funding
National Research Foundation, Singapore;
Keywords
Artificial neural network; nonlinear activation function; neuromorphic; stochastic neuron; analog computing element;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Discipline classification code
0812;
Abstract
One of the key elements in artificial neural networks (ANNs) is the activation function (AF), which converts the weighted sum of a neuron's inputs into a firing probability. Hardware implementations of the AF require complicated circuits and dissipate a considerable amount of power, which makes it difficult to integrate a large number of neurons onto a single chip. This paper presents circuit techniques, based on stochastic computing, for realizing four types of AFs: the step, identity, rectified linear unit (ReLU), and sigmoid functions. The proposed AF circuits are simpler and consume considerably less power than existing ones. A handwritten digit recognition system employing the AF circuits has been simulated to verify the effectiveness of the techniques.
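The abstract describes circuit-level techniques; as a purely illustrative software sketch (an assumption, not the authors' circuit design), the basic idea of a stochastic activation can be modelled by comparing the weighted sum with a random reference on each clock cycle, so that the mean of the resulting output bitstream traces a nonlinear activation curve.

```python
import numpy as np

def stochastic_af(weighted_sum, n_cycles=1024, noise="uniform", rng=None):
    """Hypothetical model of a stochastic activation function (illustrative only).

    Each cycle the weighted sum is compared with a random reference; the
    firing rate (mean of the output bitstream) approximates a nonlinear AF:
    a clipped-linear (ReLU-like) curve for a uniform reference, or a smooth
    sigmoid-like curve for a Gaussian reference.
    """
    rng = np.random.default_rng() if rng is None else rng
    if noise == "uniform":
        ref = rng.uniform(0.0, 1.0, size=n_cycles)   # uniform reference -> clipped-linear shape
    else:
        ref = rng.normal(0.5, 0.2, size=n_cycles)    # Gaussian reference -> sigmoid-like shape
    spikes = (weighted_sum > ref).astype(float)      # one output bit per clock cycle
    return spikes.mean()                             # firing rate approximates the AF value

# Sweep the input to observe the resulting activation curve.
xs = np.linspace(-0.5, 1.5, 9)
print([round(stochastic_af(x), 2) for x in xs])
```

The parameter choices (1024 cycles, Gaussian mean 0.5 and standard deviation 0.2) are arbitrary assumptions for the sketch; longer bitstreams reduce the stochastic estimation error at the cost of latency.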
Pages: 440-443
Number of pages: 4