ReAFM: A Reconfigurable Nonlinear Activation Function Module for Neural Networks

Cited by: 2
Authors:
Wu, Xiao [1]
Liang, Shuang [1]
Wang, Meiqi [1]
Wang, Zhongfeng [1]
Institutions:
[1] Nanjing Univ, Sch Elect Sci & Engn, Nanjing 210093, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Deep neural network; activation functions; FPGA; VLSI; hardware architecture; hardware implementation
DOI:
10.1109/TCSII.2023.3241487
CLC classification:
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline codes:
0808; 0809
Abstract:
Deep neural networks (DNNs) with various nonlinear activation functions (NAFs) have achieved unprecedented successes, sparking interest in efficient DNN hardware implementation. However, most existing NAF implementations focus on one type of function with a dedicated architecture, which makes them ill-suited to versatile DNN accelerators. In this brief, based on a proposed reciprocal approximation optimization (RAO) method, an efficient reconfigurable nonlinear activation function module (ReAFM) is devised to implement various NAFs. The computational logic and dataflow of certain NAFs are merged and reused to minimize hardware consumption by leveraging the correlations among different NAFs. In addition, a precision-adjustable exponential unit (PAEU) is developed to obtain a good tradeoff between approximation accuracy and hardware cost. Compared to the prior art, the experimental results demonstrate that the proposed ReAFM can support many more NAF types with comparable or even better performance. Furthermore, evaluation results on several prevalent neural networks show that the proposed approximation method causes negligible accuracy loss (< 0.1%).
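The abstract states that computational logic is merged across NAFs by exploiting their correlations, and that a precision-adjustable exponential unit trades accuracy against cost. As a hedged illustration only (the paper's actual RAO method and PAEU hardware are not described in this record), the standard identity tanh(x) = 2*sigmoid(2x) - 1 shows how a tanh unit can reuse a sigmoid datapath, and a truncated series for e^x shows how a "precision knob" can trade accuracy for work:

```python
import math

def exp_approx(x, terms=8):
    # Truncated Maclaurin series for e^x. `terms` acts as a precision
    # knob, analogous in spirit to a precision-adjustable exponential
    # unit; the real PAEU design is not given in the abstract.
    acc, term = 1.0, 1.0
    for k in range(1, terms):
        term *= x / k
        acc += term
    return acc

def sigmoid(x, terms=8):
    # sigmoid(x) = 1 / (1 + e^-x); the reciprocal is the costly step
    # that reciprocal-approximation methods target in hardware.
    return 1.0 / (1.0 + exp_approx(-x, terms))

def tanh_via_sigmoid(x, terms=8):
    # Standard identity tanh(x) = 2*sigmoid(2x) - 1: a tanh unit can
    # reuse the sigmoid datapath, the kind of cross-NAF correlation a
    # reconfigurable activation module can exploit.
    return 2.0 * sigmoid(2.0 * x, terms) - 1.0
```

Increasing `terms` tightens the approximation at higher computational cost, mirroring the accuracy/hardware tradeoff the abstract attributes to the PAEU.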
Pages: 2660-2664 (5 pages)