Realization of the sigmoid activation function for neural networks on current FPGAs by the table-driven method

Cited by: 0
Authors
Ushenina, Inna V. [1]
Affiliations
[1] Penza State Technol Univ, Penza, Russia
Funding
Russian Science Foundation;
Keywords
neural network; sigmoid function; FPGA; table-driven method; IMPLEMENTATION; APPROXIMATION;
DOI
10.17223/19988605/69/13
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812 ;
Abstract
In this work, the sigmoid function is implemented using the bit-level mapping method. In this method, the inputs and outputs of the sigmoid function are represented in binary code in fixed-point format. Each output bit is treated separately and represented by a Boolean function of the input bits, or by its truth table. The feasibility of implementing sigmoid output-bit calculators on FPGA programmable logic blocks is assessed. Two implementation approaches are analyzed: one based on truth tables and one based on minimized Boolean functions. All implemented circuits have equal input and output bit widths. The circuits based on truth tables have bit widths in the range of 6 to 11 bits. It is shown that the sigmoid output-bit calculators with 7- and 8-bit inputs occupy just a single programmable logic block and perform calculations in the shortest time. The proposed variant of the sigmoid function calculator can be used as a part of trained neural networks implemented in hardware.
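The table-driven scheme in the abstract can be sketched in software: quantize the sigmoid over a fixed-point input code and split the result into one truth table per output bit, each a Boolean function of the input bits. This is a minimal illustrative sketch; the 4-bit width, the input range [-4, 4), and the unsigned-fraction quantization are assumptions for demonstration, not the paper's exact parameters.

```python
import math

def sigmoid_bit_truth_tables(n_bits=4, x_min=-4.0, x_max=4.0):
    """Tabulate each output bit of sigmoid(x) as a truth table over an
    n_bits fixed-point input code spanning [x_min, x_max)."""
    size = 1 << n_bits
    tables = [[0] * size for _ in range(n_bits)]  # one table per output bit
    for code in range(size):
        # Decode the input code to a real value x in [x_min, x_max)
        x = x_min + code * (x_max - x_min) / size
        # Quantize sigmoid(x) to an n_bits unsigned fraction in [0, size-1]
        y = round((1.0 / (1.0 + math.exp(-x))) * (size - 1))
        for bit in range(n_bits):
            tables[bit][code] = (y >> bit) & 1
    return tables

tables = sigmoid_bit_truth_tables()
# Each tables[bit] is the truth table of one output bit: a Boolean
# function of the n_bits input bits, which fits in a single FPGA LUT
# whenever n_bits does not exceed the LUT input count.
```

In hardware terms, each such per-bit truth table corresponds to one LUT configuration; minimizing the Boolean function behind the table (the paper's second approach) can reduce logic further before mapping.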
Pages: 144
Related Papers
50 records
  • [1] FPGA Realization of Activation Function for Artificial Neural Networks
    Saichand, Venakata
    Nirmala, Devi M.
    Arumugam, S.
    Mohankumar, N.
    ISDA 2008: EIGHTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 3, PROCEEDINGS, 2008, : 159 - 164
  • [2] Protecting health data in the cloud through steganography: A table-driven, blind method using neural networks and bit-shuffling algorithm
    Nahar, Mahbubun
    Kamal, A. H. M.
    Hossain, Gahangir
    JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2023, 217
  • [3] An optimized Lookup-Table for the Evaluation of Sigmoid Function for Artificial Neural Networks
    Meher, Pramod Kumar
    PROCEEDINGS OF THE 2010 18TH IEEE/IFIP INTERNATIONAL CONFERENCE ON VLSI AND SYSTEM-ON-CHIP, 2010, : 91 - 95
  • [4] Efficient Neural Networks on the Edge with FPGAs by Optimizing an Adaptive Activation Function
    Jiang, Yiyue
    Vaicaitis, Andrius
    Dooley, John
    Leeser, Miriam
    SENSORS, 2024, 24 (06)
  • [5] Approximating smooth functions by deep neural networks with sigmoid activation function
    Langer, Sophie
    JOURNAL OF MULTIVARIATE ANALYSIS, 2021, 182
  • [6] TABLE-DRIVEN IMPLEMENTATION OF THE LOGARITHM FUNCTION IN IEEE FLOATING-POINT ARITHMETIC
    TANG, PTP
    ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 1990, 16 (04): : 378 - 400
  • [7] TABLE-DRIVEN IMPLEMENTATION OF THE EXPONENTIAL FUNCTION IN IEEE FLOATING-POINT ARITHMETIC
    TANG, PTP
    ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 1989, 15 (02): : 144 - 157
  • [8] A study on neural networks using Taylor series expansion of sigmoid activation function
    Temurtas, F
    Gulbag, A
    Yumusak, N
    COMPUTATIONAL SCIENCE AND ITS APPLICATIONS - ICCSA 2004, PT 4, 2004, 3046 : 389 - 397
  • [9] Realization of limit cycles by neural networks with piecewise linear activation function
    Takahashi, N
    Yamakawa, T
    Nishi, T
    PROCEEDINGS OF THE 2005 EUROPEAN CONFERENCE ON CIRCUIT THEORY AND DESIGN, VOL 3, 2005, : 7 - 10
  • [10] Voltage-to-Voltage Sigmoid Neuron Activation Function Design for Artificial Neural Networks
    Moposita, Tatiana
    Trojman, Lionel
    Crupi, Felice
    Lanuzza, Marco
    Vladimirescu, Andrei
    2022 IEEE 13TH LATIN AMERICAN SYMPOSIUM ON CIRCUITS AND SYSTEMS (LASCAS), 2022, : 164 - 167