Stochastic Implementation of the Activation Function for Artificial Neural Networks

Citations: 0
Authors
Yeo, Injune [1 ]
Gi, Sang-gyun [1 ]
Lee, Byung-geun [1 ]
Chu, Myonglae [2 ]
Affiliations
[1] Gwangju Inst Sci & Technol, Sch Elect Engn & Comp Sci, Gwangju, South Korea
[2] IMEC, Interuniv Microelect Ctr, Imager SoC Team, Leuven, Belgium
Funding
National Research Foundation, Singapore;
Keywords
Artificial neural network; nonlinear activation function; neuromorphic; stochastic neuron; analog computing element;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
One of the key elements in artificial neural networks (ANNs) is the activation function (AF), which converts the weighted sum of a neuron's inputs into a firing probability. Hardware implementation of the AF requires complicated circuits and involves considerable power dissipation, which makes integrating a large number of neurons onto a single chip difficult. This paper presents circuit techniques for realizing four types of AFs, namely the step, identity, rectified linear unit (ReLU), and sigmoid functions, based on stochastic computing. The proposed AF circuits are simpler and consume considerably less power than existing ones. A handwritten digit recognition system employing the AF circuits has been simulated to verify the effectiveness of the techniques.
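The abstract does not detail the circuits themselves, but the stochastic-computing idea it relies on can be illustrated in software: a value in [0, 1] is encoded as a random bitstream whose fraction of 1s equals the value, a single AND gate then multiplies two independent streams, and a majority vote over the stream realizes a step activation. The sketch below is purely illustrative and is not taken from the paper; all function names are invented for this example.

```python
import random

def to_stream(p, n, rng):
    # Encode a value p in [0, 1] as a stochastic bitstream of length n:
    # each bit is 1 with probability p.
    return [rng.random() < p for _ in range(n)]

def decode(stream):
    # Decoded value is simply the fraction of 1s in the stream.
    return sum(stream) / len(stream)

def sc_multiply(a, b):
    # A single AND gate multiplies two independent stochastic streams,
    # since P(a_i AND b_i) = P(a_i) * P(b_i).
    return [x and y for x, y in zip(a, b)]

def sc_step(stream):
    # Step activation: fire (1.0) iff more than half the bits are 1,
    # i.e., iff the encoded value exceeds 0.5.
    return 1.0 if decode(stream) > 0.5 else 0.0

rng = random.Random(42)
N = 20000
a = to_stream(0.8, N, rng)
b = to_stream(0.5, N, rng)
prod = decode(sc_multiply(a, b))  # approximately 0.8 * 0.5 = 0.4
fired = sc_step(a)                # 1.0, since 0.8 > 0.5
```

The appeal for hardware, as the abstract argues, is that operations which normally need multipliers or lookup tables reduce to single gates and counters on bitstreams, at the cost of longer streams for higher precision.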
Pages: 440-443
Number of pages: 4
Related Papers
50 records
  • [1] Parabola As an Activation Function of Artificial Neural Networks
    Khachumov, M. V.
    Emelyanova, Yu. G.
    SCIENTIFIC AND TECHNICAL INFORMATION PROCESSING, 2024, 51 (05) : 471 - 477
  • [2] A novel type of activation function in artificial neural networks: Trained activation function
    Ertugrul, Omer Faruk
    NEURAL NETWORKS, 2018, 99 : 148 - 157
  • [3] Artificial Neural Networks Activation Function HDL Coder
    Namin, Ashkan Hosseinzadeh
    Leboeuf, Karl
    Wu, Huapeng
    Ahmadi, Majid
    2009 IEEE INTERNATIONAL CONFERENCE ON ELECTRO/INFORMATION TECHNOLOGY, 2009, : 387 - 390
  • [4] FPGA Realization of Activation Function for Artificial Neural Networks
    Saichand, Venakata
    Nirmala, Devi M.
    Arumugam, S.
    Mohankumar, N.
    ISDA 2008: EIGHTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 3, PROCEEDINGS, 2008, : 159 - 164
  • [5] A Hybrid Chaotic Activation Function for Artificial Neural Networks
    Reid, Siobhan
    Ferens, Ken
    ADVANCES IN ARTIFICIAL INTELLIGENCE AND APPLIED COGNITIVE COMPUTING, 2021, : 1097 - 1105
  • [6] A Quantum Activation Function for Neural Networks: Proposal and Implementation
    Kumar, Saurabh
    Dangwal, Siddharth
    Adhikary, Soumik
    Bhowmik, Debanjan
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [7] FPGA Implementation of Function Approximation Module for Artificial Neural Networks
    Bohrn, Marek
    Fujcik, Lukas
    Vrba, Radimir
    TSP 2010: 33RD INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS AND SIGNAL PROCESSING, 2010, : 142 - 145
  • [8] Activation Function Perturbations in Artificial Neural Networks Effects on Robustness
    Sostre, Justin
    Cahill, Nathan
    Merkel, Cory
    2024 IEEE WESTERN NEW YORK IMAGE AND SIGNAL PROCESSING WORKSHOP, WNYISPW 2024, 2024,
  • [9] Efficient Implementation of Activation Function on FPGA for Accelerating Neural Networks
    Qian, Kai
    Liu, Yinqiu
    Zhang, Zexu
    Wang, Kun
    2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023,
  • [10] High accuracy FPGA activation function implementation for neural networks
    Hajduk, Zbigniew
    NEUROCOMPUTING, 2017, 247 : 59 - 61