Efficient Neural Networks on the Edge with FPGAs by Optimizing an Adaptive Activation Function

Citations: 2
Authors
Jiang, Yiyue [1]
Vaicaitis, Andrius [2]
Dooley, John [2]
Leeser, Miriam [1]
Affiliations
[1] Northeastern Univ, Dept Elect & Comp Engn, Boston, MA 02115 USA
[2] Maynooth Univ, Dept Elect Engn, Maynooth W23 F2H6, Ireland
Keywords
adaptive activation function (AAF); neural network; FPGA; deep learning; digital predistortion
DOI
10.3390/s24061829
Abstract
The implementation of neural networks (NNs) on edge devices enables local processing of wireless data, but deep neural networks (DNNs) bring high computational complexity and memory requirements. Shallow neural networks customized for specific problems are more efficient, requiring fewer resources and yielding a lower-latency solution; their smaller size also makes them suitable for real-time processing on edge devices. The main concern with shallow neural networks is their accuracy relative to DNNs. In this paper, we demonstrate that a customized adaptive activation function (AAF) can match the accuracy of a DNN. We designed an efficient FPGA implementation of a customized segmented spline curve neural network (SSCNN) structure that replaces the traditional fixed activation function with an AAF. We compared our SSCNN with other neural network structures, including a real-valued time-delay neural network (RVTDNN), an augmented real-valued time-delay neural network (ARVTDNN), and deep neural networks with different parameters. Our proposed SSCNN implementation uses 40% fewer hardware resources and no block RAMs compared to a DNN of similar accuracy. We experimentally validated this computationally efficient, memory-saving FPGA implementation of the SSCNN for digital predistortion of radio-frequency (RF) power amplifiers on the AMD/Xilinx RFSoC ZCU111. The implemented solution uses less than 3% of the available resources and supports a clock frequency of 221.12 MHz, enabling the transmission of wide-bandwidth signals.
Pages: 17
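To make the abstract's central idea concrete: a segmented spline curve activation divides the input range into segments and learns the spline values at the segment boundaries, so the activation's shape adapts during training instead of being fixed (e.g., tanh). The sketch below is a minimal, hypothetical piecewise-linear variant; the class name, knot count, input range, and identity initialization are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class SegmentedSplineActivation:
    """Piecewise-linear activation with trainable knot ordinates.

    The input range [x_min, x_max] is split into equal segments; the
    value at each knot is a learnable parameter, so the activation's
    shape adapts during training rather than being fixed.
    """

    def __init__(self, n_segments=8, x_min=-1.0, x_max=1.0):
        self.x_min, self.x_max = x_min, x_max
        self.n_segments = n_segments
        # Knot ordinates: initialized to the identity line; a trainer
        # would update these like any other network parameter.
        self.knots = np.linspace(x_min, x_max, n_segments + 1)

    def __call__(self, x):
        x = np.clip(x, self.x_min, self.x_max)
        # Map x to a fractional segment index in [0, n_segments].
        t = (x - self.x_min) / (self.x_max - self.x_min) * self.n_segments
        i = np.minimum(t.astype(int), self.n_segments - 1)
        frac = t - i
        # Linear interpolation between the two neighboring knots.
        return (1.0 - frac) * self.knots[i] + frac * self.knots[i + 1]

# Usage: with identity-initialized knots, the (untrained) spline
# reproduces its clipped input.
act = SegmentedSplineActivation(n_segments=8)
print(act(np.array([-0.9, 0.0, 0.5])))  # -> [-0.9  0.   0.5]
```

A lookup-plus-interpolation structure of this kind maps to a small amount of LUT/DSP logic in hardware, which is plausibly consistent with the abstract's observation that the SSCNN implementation avoids block RAMs.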