ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units via Chebyshev Approximation

Times Cited: 1
Authors
Tang, Shanshan [1 ]
Li, Bo [2 ]
Yu, Haijun [3 ,4 ]
Affiliations
[1] Ind & Commercial Bank China, Software Dev Ctr, 16 Bldg ZhongGuanCun Software Pk, Beijing 100193, Peoples R China
[2] Huawei Technol Co Ltd, Hisilicon Semicond & Component Business Dept, Labs 2012, Bantian St, Shenzhen 518129, Peoples R China
[3] Acad Math & Syst Sci, Inst Computat Math & Sci Engn Comp, NCMIS & LSEC, Beijing 100190, Peoples R China
[4] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
Keywords
Deep neural networks; Rectified power units; Chebyshev polynomial; High-dimensional approximation; Stability; Constructive spectral approximation; SPARSE GRID METHODS; DIMENSIONAL PROBLEMS; SMOOTH FUNCTIONS; ELEMENT METHOD; ERROR-BOUNDS; INTERPOLATION; ALGORITHM; EQUATIONS; ACCURACY;
DOI
10.1007/s40304-023-00392-0
Chinese Library Classification
O1 [Mathematics];
Discipline Classification Code
0701; 070101;
Abstract
In a previous study by Li et al. (Commun Comput Phys 27(2):379-411, 2020), it is shown that deep neural networks built with rectified power units (RePU) as activation functions can give better approximations for sufficiently smooth functions than those built with rectified linear units, by converting polynomial approximations based on power series into deep neural networks with optimal complexity and no approximation error. In practice, however, power series approximations are not easy to obtain due to the associated stability issues. In this paper, we propose a new and more stable way to construct RePU deep neural networks based on Chebyshev polynomial approximations. By using a hierarchical structure of Chebyshev polynomial approximation in the frequency domain, we obtain an efficient and stable deep neural network construction, which we call ChebNet. The approximation of smooth functions by ChebNets is no worse than the approximation by deep RePU nets using power series; at the same time, ChebNets are much more stable. Numerical results show that the constructed ChebNets can be further fine-tuned to obtain much better results than those obtained by tuning deep RePU nets constructed by the power-series approach. As spectral accuracy is hard to obtain by direct training of deep neural networks, ChebNets provide a practical way to obtain spectral accuracy; they are expected to be useful in real applications that require efficient approximations of smooth functions.
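The abstract contrasts power-series and Chebyshev representations of polynomial approximants. The paper's ChebNet construction itself is not reproduced here, but a minimal sketch (using NumPy's Chebyshev class, an assumption of this illustration rather than the authors' code) shows the spectral accuracy that Chebyshev interpolation of a smooth function attains, which is the quantity the ChebNet construction preserves:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# A smooth (entire) test function on [-1, 1]; hypothetical example,
# not one of the paper's numerical experiments.
def f(x):
    return np.exp(np.sin(3.0 * x))

# Interpolate at Chebyshev points. Working directly in the Chebyshev
# basis avoids the ill-conditioning of converting to the monomial
# (power-series) basis, which is the stability issue the paper addresses.
p = C.Chebyshev.interpolate(f, deg=40)

# Measure the maximum error on a fine grid; for smooth f the error
# decays exponentially in the degree (spectral accuracy).
x = np.linspace(-1.0, 1.0, 1000)
err = np.max(np.abs(f(x) - p(x)))
print(err)
```

For smooth functions such as this one, a degree-40 Chebyshev interpolant already reaches errors near machine precision, whereas a degree-40 fit expressed in the monomial basis would be numerically fragile.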
Pages: 27
Related Papers
50 records total
  • [21] Energy-Efficient Power Control in Wireless Networks With Spatial Deep Neural Networks
    Zhang, Ticao
    Mao, Shiwen
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2020, 6 (01) : 111 - 124
  • [22] ONLINE ENERGY-EFFICIENT POWER CONTROL IN WIRELESS NETWORKS BY DEEP NEURAL NETWORKS
    Zappone, Alessio
    Debbah, Mérouane
    Altman, Zwi
    2018 IEEE 19TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (SPAWC), 2018, : 136 - 140
  • [23] POWER SYSTEM STATE FORECASTING VIA DEEP RECURRENT NEURAL NETWORKS
    Zhang, Liang
    Wang, Gang
    Giannakis, Georgios B.
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 8092 - 8096
  • [24] Towards Optimal Power Control via Ensembling Deep Neural Networks
    Liang, Fei
    Shen, Cong
    Yu, Wei
    Wu, Feng
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2020, 68 (03) : 1760 - 1776
  • [25] EFFICIENT POWER ALLOCATION USING GRAPH NEURAL NETWORKS AND DEEP ALGORITHM UNFOLDING
    Chowdhury, Arindam
    Verma, Gunjan
    Rao, Chirag
    Swami, Ananthram
    Segarra, Santiago
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4725 - 4729
  • [26] Efficient generation of valid test inputs for deep neural networks via gradient search
    Jiang, Zhouxian
    Li, Honghui
    Wang, Rui
    JOURNAL OF SOFTWARE-EVOLUTION AND PROCESS, 2024, 36 (04)
  • [27] Zero-Keep Filter Pruning for Energy/Power Efficient Deep Neural Networks
    Woo, Yunhee
    Kim, Dongyoung
    Jeong, Jaemin
    Ko, Young-Woong
    Lee, Jeong-Gun
    ELECTRONICS, 2021, 10 (11)
  • [28] A Power Efficient Multi-Bit Accelerator for Memory Prohibitive Deep Neural Networks
    Shivapakash, Suhas
    Jain, Hardik
    Hellwich, Olaf
    Gerfers, Friedel
    2020 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2020,
  • [29] Accurate photovoltaic power prediction models based on deep convolutional neural networks and gated recurrent units
    Sabri, N. Mohammed
    El Hassouni, Mohammed
    ENERGY SOURCES PART A-RECOVERY UTILIZATION AND ENVIRONMENTAL EFFECTS, 2022, 44 (03) : 6303 - 6320
  • [30] Power Control for Interference Management via Ensembling Deep Neural Networks (Invited Paper)
    Liang, Fei
    Shen, Cong
    Yu, Wei
    Wu, Feng
    2019 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC), 2019,