Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions

Cited by: 90
Authors
Jagtap, Ameya D. [1]
Shin, Yeonjong [1]
Kawaguchi, Kenji [2]
Karniadakis, George Em [1,3]
Affiliations
[1] Brown Univ, Div Appl Math, 182 George St, Providence, RI 02912 USA
[2] Harvard Univ, Ctr Math Sci & Applicat, Cambridge, MA 02138 USA
[3] Brown Univ, Sch Engn, Providence, RI 02912 USA
Keywords
Deep neural networks; Kronecker product; Rowdy activation functions; Gradient flow dynamics; Physics-informed neural networks; Deep learning benchmarks; Learning framework
DOI
10.1016/j.neucom.2021.10.036
CLC classification number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We propose a new type of neural network, the Kronecker neural network (KNN), which forms a general framework for neural networks with adaptive activation functions. KNNs employ the Kronecker product, which provides an efficient way to construct a very wide network while keeping the number of parameters low. Our theoretical analysis reveals that, under suitable conditions, KNNs induce a faster decay of the loss than feed-forward networks, which we also verify empirically through a set of computational examples. Furthermore, under certain technical assumptions, we establish global convergence of gradient descent for KNNs. As a specific case, we propose the Rowdy activation function, designed to eliminate saturation regions by injecting sinusoidal fluctuations with trainable parameters. The proposed Rowdy activation function can be employed in any neural network architecture, such as feed-forward, recurrent, and convolutional neural networks. The effectiveness of KNNs with Rowdy activations is demonstrated through various computational experiments, including function approximation with feed-forward neural networks, solution inference for partial differential equations with physics-informed neural networks, and standard deep learning benchmark problems using convolutional and fully-connected neural networks. (c) 2021 Elsevier B.V. All rights reserved.
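The record contains no code, but the Rowdy idea described in the abstract can be illustrated with a short PyTorch sketch: a saturating base activation augmented with sinusoidal terms of increasing frequency, each scaled by a trainable amplitude. The class name, the scaling factor n, the number of terms K, and the zero initialization below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class RowdyActivation(nn.Module):
    """Sketch of a Rowdy-style adaptive activation: a fixed base activation
    plus K-1 sinusoidal perturbation terms with trainable amplitudes a_k.
    Names, defaults, and initialization are assumptions for illustration."""

    def __init__(self, K: int = 5, n: float = 10.0, base=torch.tanh):
        super().__init__()
        self.base = base          # saturating base activation, e.g. tanh
        self.n = n                # fixed frequency/amplitude scaling factor
        # Trainable amplitudes a_k, initialized to zero so training starts
        # from the plain base activation.
        self.a = nn.Parameter(torch.zeros(K - 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.base(x)
        for k in range(1, self.a.numel() + 1):
            # Inject a sinusoidal fluctuation of frequency k * n, so the
            # composite activation has no flat (saturated) region.
            out = out + self.n * self.a[k - 1] * torch.sin(k * self.n * x)
        return out

# Usage: drop the activation into a feed-forward network in place of tanh.
model = nn.Sequential(
    nn.Linear(1, 50), RowdyActivation(),
    nn.Linear(50, 50), RowdyActivation(),
    nn.Linear(50, 1),
)
```

Because the amplitudes start at zero, the network initially behaves as a standard tanh network, and gradient descent turns the fluctuations on only where they help, consistent with the abstract's description of trainable sinusoidal perturbations.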
Pages: 165-180 (16 pages)