Uniform Convergence of Deep Neural Networks With Lipschitz Continuous Activation Functions and Variable Widths

Times Cited: 0
Authors
Xu, Yuesheng [1 ]
Zhang, Haizhang [2 ]
Affiliations
[1] Old Dominion Univ, Dept Math & Stat, Norfolk, VA 23529 USA
[2] Sun Yat Sen Univ, Sch Math Zhuhai, Zhuhai 519082, Peoples R China
Funding
US National Science Foundation; US National Institutes of Health; National Natural Science Foundation of China
Keywords
Convergence; Vectors; Artificial neural networks; Kernel; Training; Deep learning; Uniform convergence; deep neural networks; convolutional neural networks; Lipschitz continuous activation functions; variable widths; ReLU networks; error bounds
DOI
10.1109/TIT.2024.3439136
Chinese Library Classification (CLC) Number
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
We consider deep neural networks (DNNs) with a Lipschitz continuous activation function and with weight matrices of variable widths. We establish a uniform convergence analysis framework in which sufficient conditions on the weight matrices and bias vectors, together with the Lipschitz constant of the activation function, are provided to ensure uniform convergence of the DNNs to a meaningful function as the number of their layers tends to infinity. Within this framework, special results on uniform convergence of DNNs with fixed width, with bounded widths, and with unbounded widths are presented. In particular, since convolutional neural networks are special DNNs whose weight matrices have increasing widths, we put forward conditions on the mask sequence that lead to uniform convergence of the resulting convolutional neural networks. The Lipschitz continuity assumption on the activation functions allows us to include in our theory most of the activation functions commonly used in applications.
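To make the depth-convergence question concrete, here is a minimal numerical sketch, not the paper's actual sufficient conditions: it iterates the layer recursion x_n = sigma(W_n x_{n-1} + b_n) with the 1-Lipschitz ReLU activation, a fixed width, near-identity weights W_n = I + A_n, and geometrically decaying ||A_n|| and ||b_n|| (all illustrative assumptions chosen so that consecutive layer outputs form a Cauchy sequence), and tracks the sup-norm gap between successive n-layer outputs over a sample of inputs from the unit ball.

```python
import numpy as np

rng = np.random.default_rng(0)
width, depth, n_samples = 8, 30, 200

def relu(z):
    # ReLU is Lipschitz continuous with constant 1.
    return np.maximum(z, 0.0)

# Probe inputs from a compact set (here, the closed unit ball).
X = rng.standard_normal((n_samples, width))
X /= np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1.0)

# Columns of `outputs` hold the n-layer outputs F_n(x), one column per sample.
outputs = X.T.copy()
for n in range(1, depth + 1):
    A = rng.standard_normal((width, width))
    A *= 0.5**n / np.linalg.norm(A, 2)   # spectral norm ||A_n|| = 2^{-n}: summable
    W = np.eye(width) + A                # weight matrix kept close to the identity
    b = rng.standard_normal(width)
    b *= 0.5**n / np.linalg.norm(b)      # ||b_n|| = 2^{-n}: summable
    new = relu(W @ outputs + b[:, None])
    # Sup over the sampled inputs of ||F_n(x) - F_{n-1}(x)||.
    gap = np.linalg.norm(new - outputs, axis=0).max()
    print(f"layer {n:2d}: sup-gap ~ {gap:.2e}")
    outputs = new
```

Because ||F_n(x) - F_{n-1}(x)|| <= ||A_n|| ||F_{n-1}(x)|| + ||b_n|| under the 1-Lipschitz activation, the summable perturbations make the printed gaps decay geometrically, so the layer outputs converge uniformly over the sampled inputs; with uncontrolled weights (e.g., arbitrary norm-one matrices) the sequence need not settle.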
Pages: 7125-7142
Number of Pages: 18