Uniform Convergence of Deep Neural Networks With Lipschitz Continuous Activation Functions and Variable Widths

Cited: 0
Authors
Xu, Yuesheng [1 ]
Zhang, Haizhang [2 ]
Affiliations
[1] Old Dominion Univ, Dept Math & Stat, Norfolk, VA 23529 USA
[2] Sun Yat Sen Univ, Sch Math Zhuhai, Zhuhai 519082, Peoples R China
Funding
U.S. National Science Foundation; U.S. National Institutes of Health; National Natural Science Foundation of China;
Keywords
Convergence; Vectors; Artificial neural networks; Kernel; Training; Deep learning; Uniform convergence; deep neural networks; convolutional neural networks; Lipschitz continuous activation functions; variable widths; RELU NETWORKS; ERROR-BOUNDS;
DOI
10.1109/TIT.2024.3439136
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We consider deep neural networks (DNNs) with a Lipschitz continuous activation function and weight matrices of variable widths. We establish a uniform convergence analysis framework that provides sufficient conditions on the weight matrices and bias vectors, together with the Lipschitz constant, ensuring uniform convergence of DNNs to a meaningful function as the number of their layers tends to infinity. Within this framework, we present specific results on uniform convergence of DNNs with fixed width, bounded widths, and unbounded widths. In particular, since convolutional neural networks are special DNNs with weight matrices of increasing widths, we put forward conditions on the mask sequence that lead to uniform convergence of the resulting convolutional neural networks. The Lipschitz continuity assumption on the activation functions allows our theory to cover most activation functions commonly used in applications.
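To make the setting concrete, here is a minimal sketch in LaTeX of the layer recursion and the convergence notion the abstract refers to. The summability condition at the end is illustrative, of the flavor the authors established earlier for fixed-width ReLU networks; it is not a quotation of this paper's sufficient conditions, which handle general Lipschitz activations and variable widths.
\[
  f_0(x) = x, \qquad f_n(x) = \sigma\bigl(W_n f_{n-1}(x) + b_n\bigr), \qquad n \ge 1,
\]
where $W_n \in \mathbb{R}^{d_n \times d_{n-1}}$, $b_n \in \mathbb{R}^{d_n}$, and $\sigma$ is applied componentwise with $\lvert \sigma(u) - \sigma(v) \rvert \le L \lvert u - v \rvert$. Uniform convergence on a compact set $K$ means
\[
  \lim_{n \to \infty} \, \sup_{x \in K} \bigl\lVert f_n(x) - f(x) \bigr\rVert = 0.
\]
For fixed width ($d_n = d$ for all $n$) and ReLU activation, a sufficient condition of this type is summability of the layer perturbations,
\[
  \sum_{n \ge 1} \lVert W_n - I \rVert < \infty \qquad \text{and} \qquad \sum_{n \ge 1} \lVert b_n \rVert < \infty,
\]
under which the differences $\lVert f_n - f_{n-1} \rVert$ are summable on $K$, so $(f_n)$ is uniformly Cauchy and converges uniformly.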
Pages: 7125-7142
Number of pages: 18
Related Papers
(50 total)
  • [1] On exponential stability of delayed neural networks with globally Lipschitz continuous activation functions
    Sun, CY
    Feng, CB
    PROCEEDINGS OF THE 4TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-4, 2002, : 1953 - 1957
  • [2] On the absolute exponential stability of neural networks with globally Lipschitz continuous activation functions
    Liang, XB
    Yamaguchi, T
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 1997, E80D (06) : 687 - 690
  • [3] Approximating Lipschitz continuous functions with GroupSort neural networks
    Tanielian, U.
    Sangnier, M.
    Biau, G.
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130 : 442 - +
  • [4] Non-uniform Piecewise Linear Activation Functions in Deep Neural Networks
    Zhu, Zezhou
    Dong, Yuan
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 2107 - 2113
  • [5] Exponential stability of delayed and impulsive cellular neural networks with partially Lipschitz continuous activation functions
    Song, Xueli
    Xin, Xing
    Huang, Wenpo
    NEURAL NETWORKS, 2012, 29-30 : 80 - 90
  • [6] Approximation of Lipschitz Functions Using Deep Spline Neural Networks*
    Neumayer, Sebastian
    Goujon, Alexis
    Bohra, Pakshal
    Unser, Michael
    SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2023, 5 (02): : 306 - 322
  • [7] On global exponential stability of cellular neural networks with Lipschitz-continuous activation function and variable delays
    Zhou, DM
    Zhang, LM
    Cao, J
    APPLIED MATHEMATICS AND COMPUTATION, 2004, 151 (02) : 379 - 392
  • [8] Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
    Jagtap, Ameya D.
    Kawaguchi, Kenji
    Karniadakis, George Em
    JOURNAL OF COMPUTATIONAL PHYSICS, 2020, 404 (404)
  • [9] Absolute exponential stability of recurrent neural networks with Lipschitz-continuous activation functions and time delays
    Cao, JD
    Wang, J
    NEURAL NETWORKS, 2004, 17 (03) : 379 - 390
• [10] Hölder continuous activation functions in neural networks
    Tatar, Nasser-Eddine
    ADVANCES IN DIFFERENTIAL EQUATIONS AND CONTROL PROCESSES, 2015, 15 (02): : 93 - 106