Realization of the sigmoid activation function for neural networks on current FPGAs by the table-driven method

Cited: 0
|
Authors
Ushenina, Inna V. [1]
Affiliations
[1] Penza State Technol Univ, Penza, Russia
Funding
Russian Science Foundation;
Keywords
neural network; sigmoid function; FPGA; table-driven method; IMPLEMENTATION; APPROXIMATION;
DOI
10.17223/19988605/69/13
CLC Classification Number
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
In this work, the sigmoid function is implemented using the bit-level mapping method. Within this method, the inputs and outputs of the sigmoid function are represented in binary code in fixed-point format. Each output bit is treated separately and represented by a Boolean function of the input bits or by its truth table. The possibilities of implementing sigmoid-function output-bit calculators on FPGA programmable logic blocks are assessed. Two implementation approaches are analyzed: one based on truth tables and one based on minimized Boolean functions. In all implemented circuits, the inputs and outputs have equal bit widths. The circuits based on truth tables have bit widths in the range of 6 to 11 bits. It is shown that the sigmoid output-bit calculators with 7- and 8-bit inputs occupy just a single programmable logic block and perform the calculations in the shortest time. The proposed variant of the sigmoid function calculator can be used as a part of trained neural networks implemented in hardware.
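The table-driven scheme described in the abstract can be sketched in software: enumerate every fixed-point input code, quantize the sigmoid output to the same bit width, and split the result into one truth table per output bit. This is a minimal illustration, not the paper's implementation; the 7-bit width and the Q-format split (4 fractional input bits, all-fractional output) are assumptions chosen for the example.

```python
import math

def sigmoid_bit_tables(n_bits=7, frac_in=4):
    """Build one truth table per output bit of a fixed-point sigmoid.

    Assumed (not from the paper): inputs are n_bits-wide two's-complement
    codes with frac_in fractional bits; outputs use the same width with
    all bits fractional, since sigmoid(x) lies in (0, 1).
    """
    frac_out = n_bits  # output in [0, 1): every output bit is fractional
    tables = [[] for _ in range(n_bits)]  # tables[k][code] = k-th output bit
    for code in range(2 ** n_bits):
        # Interpret the code as a signed fixed-point value.
        signed = code - 2 ** n_bits if code >= 2 ** (n_bits - 1) else code
        x = signed / 2 ** frac_in
        y = 1.0 / (1.0 + math.exp(-x))
        # Quantize and saturate so the result fits the output width.
        q = min(round(y * 2 ** frac_out), 2 ** n_bits - 1)
        for k in range(n_bits):
            tables[k].append((q >> k) & 1)
    return tables

tables = sigmoid_bit_tables()
# Each tables[k] holds 2**7 entries: the truth table that one LUT-based
# "output bit calculator" would realize in FPGA fabric. Minimizing each
# Boolean function (e.g. with Espresso) gives the second approach the
# abstract compares against.
```

With a 7-bit input, each of the seven truth tables has 128 entries, which matches the abstract's observation that such a calculator can fit into a single programmable logic block built from wide LUTs.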
Pages: 144
Related Papers
50 records in total
  • [31] All-optical recurrent neural network with sigmoid activation function
    Mourgias-Alexandris, George
    Dabos, George
    Passalis, Nikolaos
    Tefas, Anastasios
    Totovic, Angelina
    Pleros, Nikos
    2020 OPTICAL FIBER COMMUNICATIONS CONFERENCE AND EXPOSITION (OFC), 2020,
  • [32] Gaussian Activation Function Realization with Application to the Neural Network Implementations
    Yildiz, Hacer Atar
    2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020,
  • [33] Neural Networks Probability-Based PWL Sigmoid Function Approximation
    Nguyen, Vantruong
    Cai, Jueping
    Wei, Linyu
    Chu, Jie
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2020, E103D (09) : 2023 - 2026
  • [34] A digital circuit design of hyperbolic tangent sigmoid function for neural networks
    Lin, Che-Wei
    Wang, Jeen-Shing
    PROCEEDINGS OF 2008 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOLS 1-10, 2008, : 856 - 859
  • [35] Neural networks with asymmetric activation function for function approximation
    Gomes, Gecynalda S. da S.
    Ludermir, Teresa B.
    Almeida, Leandro M.
    IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1- 6, 2009, : 2310 - 2317
  • [36] Low-Voltage Realization of Neural Networks using Non-Monotonic Activation Function for Digital Applications
    Khanday, Farooq Ahmad
    Kant, Nasir Ali
    Dar, Mohammad Rafiq
    RECENT ADVANCES IN ELECTRICAL & ELECTRONIC ENGINEERING, 2018, 11 (03) : 367 - 375
  • [37] Dynamic analysis of Hopfield neural network with Sigmoid-type activation function
    Zhang, Yunzhen
    Yin, Xiaowei
    Chen, Chengjie
    Zhao, Guangzhe
    You, Yunkai
    Tao, Shaohua
    Xiong, Weihua
    PHYSICA SCRIPTA, 2025, 100 (03)
  • [38] Adaptive Morphing Activation Function for Neural Networks
    Herrera-Alcantara, Oscar
    Arellano-Balderas, Salvador
    FRACTAL AND FRACTIONAL, 2024, 8 (08)
  • [39] Multistability of neural networks with discontinuous activation function
    Huang, Gan
    Cao, Jinde
    COMMUNICATIONS IN NONLINEAR SCIENCE AND NUMERICAL SIMULATION, 2008, 13 (10) : 2279 - 2289
  • [40] Parabola As an Activation Function of Artificial Neural Networks
    Khachumov, M. V.
    Emelyanova, Yu. G.
    SCIENTIFIC AND TECHNICAL INFORMATION PROCESSING, 2024, 51 (05) : 471 - 477