Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions

Cited by: 90
Authors
Jagtap, Ameya D. [1 ]
Shin, Yeonjong [1 ]
Kawaguchi, Kenji [2 ]
Karniadakis, George Em [1 ,3 ]
Affiliations
[1] Brown Univ, Div Appl Math, 182 George St, Providence, RI 02912 USA
[2] Harvard Univ, Ctr Math Sci & Applicat, Cambridge, MA 02138 USA
[3] Brown Univ, Sch Engn, Providence, RI 02912 USA
Keywords
Deep neural networks; Kronecker product; Rowdy activation functions; Gradient flow dynamics; Physics-informed neural networks; Deep learning benchmarks; Learning framework
DOI
10.1016/j.neucom.2021.10.036
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We propose a new type of neural network, the Kronecker neural network (KNN), which provides a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which offers an efficient way of constructing a very wide network while keeping the number of parameters low. Our theoretical analysis reveals that, under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks do. This is also verified empirically through a set of computational examples. Furthermore, under certain technical assumptions, we establish global convergence of gradient descent for KNNs. As a specific case, we propose the Rowdy activation function, which is designed to eliminate saturation regions by injecting sinusoidal fluctuations with trainable parameters. The proposed Rowdy activation function can be employed in any neural-network architecture, such as feed-forward, recurrent, and convolutional neural networks. The effectiveness of KNNs with Rowdy activations is demonstrated through various computational experiments, including function approximation with feed-forward neural networks, solution inference for partial differential equations with physics-informed neural networks, and standard deep-learning benchmark problems with convolutional and fully connected neural networks. (c) 2021 Elsevier B.V. All rights reserved.
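As a concrete illustration of the Rowdy idea described in the abstract, the sketch below implements an adaptive activation of the form phi(x) = sigma(x) + sum_k n * a_k * sin(k * n * x), where the amplitudes a_k are trainable. This is a minimal sketch only: the choice of base activation sigma, the number of terms K, and the scaling factor n are assumptions made here for illustration, and the exact parameterization used in the paper may differ.

```python
# Illustrative sketch of a Rowdy-style adaptive activation in PyTorch.
# Assumptions (not taken from the paper's exact formulation): base
# activation tanh, K sinusoidal terms, fixed scaling factor n.
import torch
import torch.nn as nn

class RowdyActivation(nn.Module):
    """Base activation plus K trainable sinusoidal perturbations:
    phi(x) = sigma(x) + sum_{k=1}^{K} n * a_k * sin(k * n * x).
    The amplitudes a_k are learned; they are initialized at zero so
    training starts from the plain sigma-network."""

    def __init__(self, base=torch.tanh, K=3, n=10.0):
        super().__init__()
        self.base = base
        self.n = n
        # One trainable amplitude per sinusoidal term.
        self.a = nn.Parameter(torch.zeros(K))

    def forward(self, x):
        out = self.base(x)
        for k in range(1, self.a.numel() + 1):
            out = out + self.n * self.a[k - 1] * torch.sin(k * self.n * x)
        return out
```

A layer would then read, for example, RowdyActivation()(linear(x)). Because the amplitudes start at zero, optimization begins from the standard network, and the high-frequency sinusoidal terms are switched on only insofar as they reduce the loss.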
Pages: 165 - 180
Number of pages: 16