Efficient VLSI Implementation of Neural Networks With Hyperbolic Tangent Activation Function

Cited by: 99
Authors
Zamanlooy, Babak [1]
Mirhassani, Mitra [1]
Affiliations
[1] Univ Windsor, Dept Elect & Comp Engn, Windsor, ON N9B 3P4, Canada
Keywords
Hyperbolic tangent; neural networks; nonlinear activation function; VLSI implementation; SIGMOID FUNCTION; HARDWARE IMPLEMENTATION; GENERATORS; DESIGN;
DOI
10.1109/TVLSI.2012.2232321
Chinese Library Classification (CLC)
TP3 [Computing Technology; Computer Technology];
Subject Classification Code
0812 ;
Abstract
The nonlinear activation function is one of the main building blocks of artificial neural networks, and the hyperbolic tangent and sigmoid are the most widely used nonlinear activation functions. Accurate implementation of these transfer functions in digital networks faces certain challenges. In this paper, an efficient approximation scheme for the hyperbolic tangent function is proposed. The approximation is based on a mathematical analysis that treats the maximum allowable error as a design parameter. A hardware implementation of the proposed approximation scheme is presented, which shows that the proposed structure compares favorably with previous architectures in terms of area and delay. The proposed structure requires fewer output bits for the same maximum allowable error when compared to the state of the art. Because the number of output bits of the activation function determines the bit width of the multipliers and adders in the network, the proposed activation function reduces area, delay, and power in VLSI implementations of artificial neural networks with a hyperbolic tangent activation function.
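The abstract's central idea, approximating tanh subject to a maximum allowable error budget, can be illustrated with a simple piecewise-linear sketch. This is a generic illustration, not the paper's actual scheme: the function name `pwl_tanh`, the breakpoint spacing `step`, and the saturation bound `x_max` are assumptions chosen for demonstration only.

```python
import math

def pwl_tanh(x, step=0.25, x_max=4.0):
    """Illustrative piecewise-linear tanh approximation.

    Linearly interpolates between exact tanh values at breakpoints
    spaced `step` apart on [0, x_max]; saturates beyond x_max and
    exploits the odd symmetry tanh(-x) = -tanh(x).
    """
    sign = -1.0 if x < 0 else 1.0
    ax = min(abs(x), x_max)
    lo = math.floor(ax / step) * step      # lower breakpoint
    hi = lo + step                          # upper breakpoint
    y0, y1 = math.tanh(lo), math.tanh(hi)   # table values (exact here)
    return sign * (y0 + (y1 - y0) * (ax - lo) / step)

# Empirically check the worst-case error over [-6, 6] against a budget,
# mirroring the "maximum allowable error as design parameter" idea.
eps = max(abs(pwl_tanh(i / 1000.0) - math.tanh(i / 1000.0))
          for i in range(-6000, 6001))
```

With these parameters the measured worst-case error stays below 0.01; in a hardware design one would instead choose the segment spacing and output bit width from the target error bound, which is the direction of analysis the paper takes.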
Pages: 39-48
Page count: 10
Related Papers
50 records in total
  • [41] Anti-synchronization of a M-Hopfield neural network with generalized hyperbolic tangent activation function
    Viera-Martin, E.
    Gomez-Aguilar, J. F.
    Solis-Perez, J. E.
    Hernandez-Perez, J. A.
    Olivares-Peregrino, V. H.
    EUROPEAN PHYSICAL JOURNAL-SPECIAL TOPICS, 2022, 231 (10): 1801 - 1814
  • [42] LINEAR-PHASE NETWORKS WITH HYPERBOLIC TANGENT FUNCTION AND THEIR APPLICATIONS
    FURUHATA, T
    OWASHI, H
    TAKAHASHI, H
    YUMDE, Y
    IEEE TRANSACTIONS ON BROADCASTING, 1986, 32 (03) : 62 - 69
  • [43] An extended class of synaptic operators with application for efficient VLSI implementation of cellular neural networks
    Dogaru, R
    Crounse, KR
    Chua, LO
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 1998, 45 (07) : 745 - 753
  • [44] Programmable analogue VLSI implementation for asymmetric sigmoid neural activation function and its derivative
    Tabarce, S
    Tavares, VG
    de Oliveira, PG
    ELECTRONICS LETTERS, 2005, 41 (15) : 863 - 864
  • [45] High accuracy FPGA activation function implementation for neural networks
    Hajduk, Zbigniew
    NEUROCOMPUTING, 2017, 247 : 59 - 61
  • [46] Integrals of hyperbolic tangent function
    Li, Jing
    Chu, Wenchang
    DISCRETE MATHEMATICS LETTERS, 2024, 13 : 89 - 94
  • [47] Abstract Univariate Neural Network Approximation Using a q-Deformed and λ-Parametrized Hyperbolic Tangent Activation Function
    Anastassiou, George A.
    FRACTAL AND FRACTIONAL, 2023, 7 (03)
  • [48] An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks
    Chai, Enhui
    Yu, Wei
    Cui, Tianxiang
    Ren, Jianfeng
    Ding, Shusheng
    SYMMETRY-BASEL, 2022, 14 (05)
  • [49] Analog circuit for synapse neural networks VLSI implementation
    Chible, H
    ICECS 2000: 7TH IEEE INTERNATIONAL CONFERENCE ON ELECTRONICS, CIRCUITS & SYSTEMS, VOLS I AND II, 2000: 1004 - 1007
  • [50] VLSI approach to the implementation of additive and shunting neural networks
    Pelayo, FJ
    MartinSmith, P
    Fernandez, FJ
    Prieto, A
    FROM NATURAL TO ARTIFICIAL NEURAL COMPUTATION, 1995, 930 : 728 - 735