Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units

Cited by: 0
Authors
Xu, Yixi [1 ]
Wang, Xiao [1 ]
Institution
[1] Purdue Univ, Dept Stat, W Lafayette, IN 47907 USA
Keywords
CLASSIFICATION;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a general framework for norm-based capacity control of L_{p,q} weight normalized deep neural networks. We establish an upper bound on the Rademacher complexity of this family. For L_{p,q} normalization with q <= p*, where 1/p + 1/p* = 1, we discuss a width-independent capacity control that depends on the depth only through a square-root term. We further analyze the approximation properties of L_{p,q} weight normalized deep neural networks. In particular, for an L_{1,infinity} weight normalized network, the approximation error can be controlled by the L_1 norm of the output layer, and the corresponding generalization error depends on the architecture only through the square root of the depth.
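The L_{p,q} weight normalization discussed in the abstract constrains each layer's weight matrix by its L_{p,q} norm (the L_q norm of the vector of per-row L_p norms). A minimal NumPy sketch follows; the row-wise convention and the names `lpq_norm` / `weight_normalize` are illustrative assumptions, not the authors' code:

```python
import numpy as np

def lpq_norm(W, p=1.0, q=np.inf):
    """L_{p,q} norm of a matrix: the L_q norm of the vector of
    per-row L_p norms (row convention is an assumption here)."""
    row_norms = np.sum(np.abs(W) ** p, axis=1) ** (1.0 / p)
    if np.isinf(q):
        return row_norms.max()
    return np.sum(row_norms ** q) ** (1.0 / q)

def weight_normalize(W, p=1.0, q=np.inf, c=1.0):
    """Rescale W so that its L_{p,q} norm equals the budget c."""
    return c * W / lpq_norm(W, p=p, q=q)

# Example: L_{2,inf} norm is the largest row-wise Euclidean norm.
W = np.array([[3.0, 4.0],
              [0.0, 1.0]])
print(lpq_norm(W, p=2, q=np.inf))  # rows have norms 5 and 1 -> 5.0
```

Applying `weight_normalize` to every layer with q <= p* (1/p + 1/p* = 1) is the regime in which the abstract's width-independent capacity bound applies.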
Pages: 10
Related Papers
50 in total
  • [1] Deep neural networks with Elastic Rectified Linear Units for object recognition
    Jiang, Xiaoheng
    Pang, Yanwei
    Li, Xuelong
    Pan, Jing
    Xie, Yinghong
    [J]. NEUROCOMPUTING, 2018, 275 : 1132 - 1139
  • [2] Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
    Shang, Wenling
    Sohn, Kihyuk
    Almeida, Diogo
    Lee, Honglak
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [3] IMPROVING DEEP NEURAL NETWORKS FOR LVCSR USING RECTIFIED LINEAR UNITS AND DROPOUT
    Dahl, George E.
    Sainath, Tara N.
    Hinton, Geoffrey E.
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 8609 - 8613
  • [4] Weight normalized deep neural networks
    Xu, Yixi
    Wang, Xiao
    [J]. STAT, 2021, 10 (01):
  • [5] Spam Filtering Using Regularized Neural Networks with Rectified Linear Units
    Barushka, Aliaksandr
    Hajek, Petr
    [J]. AI*IA 2016: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2016, 10037 : 65 - 75
  • [6] Exploring Normalization in Deep Residual Networks with Concatenated Rectified Linear Units
    Shang, Wenling
    Chiu, Justin
    Sohn, Kihyuk
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017, : 1509 - 1516
  • [7] FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks
    Qiu, Suo
    Xu, Xiangmin
    Cai, Bolun
    [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 1223 - 1228
  • [8] Deep delay rectified neural networks
    Shan, Chuanhui
    Li, Ao
    Chen, Xiumei
    [J]. JOURNAL OF SUPERCOMPUTING, 2023, 79 (01): 880 - 896
  • [9] Rectified Exponential Units for Convolutional Neural Networks
    Ying, Yao
    Su, Jianlin
    Shan, Peng
    Miao, Ligang
    Wang, Xiaolian
    Peng, Silong
    [J]. IEEE ACCESS, 2019, 7 : 101633 - 101640