Improving Batch Normalization with Skewness Reduction for Deep Neural Networks

Cited by: 2
Authors
Ding, Pak Lun Kevin [1]
Martin, Sarah [1]
Li, Baoxin [1]
Affiliation
[1] Arizona State Univ, Sch Comp Informat & Decis Syst Engn, Tempe, AZ 85281 USA
Source
2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021
Keywords
DOI
10.1109/ICPR48806.2021.9412949
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Batch Normalization (BN) is a well-known technique used in training deep neural networks. The main idea behind batch normalization is to normalize the features of a layer, i.e., to transform them to have zero mean and unit variance. Such a procedure encourages the optimization landscape of the loss function to be smoother and improves learning in both speed and performance. In this paper, we demonstrate that network performance can be improved further if the output features within the same layer have similar distributions. Since normalizing the mean and variance does not necessarily make the features follow the same distribution, we propose a new normalization scheme: Batch Normalization with Skewness Reduction (BNSR). Compared with other normalization approaches, BNSR transforms not only the mean and variance but also the skewness of the data. By addressing this additional property of the distribution, we make the output distributions of a layer more similar to one another. The nonlinearity of BNSR may further improve the expressiveness of the underlying network. Comparisons with other normalization schemes are carried out on the CIFAR-100 and ImageNet datasets. Experimental results show that the proposed approach can outperform state-of-the-art networks that are not equipped with BNSR.
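To make the idea in the abstract concrete, below is a minimal NumPy sketch of a batch-normalization step followed by a skewness-reducing step. The signed power nonlinearity, the power parameter, and the function names are illustrative assumptions added here; the paper defines its own BNSR transform, which may differ.

import numpy as np


def bn_with_skewness_reduction(x, eps=1e-5, gamma=1.0, beta=0.0, power=0.8):
    # Step 1: the standard BN transform -- per-feature zero mean, unit variance.
    mu = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)

    # Step 2 (hypothetical): a signed power nonlinearity that compresses the
    # longer tail more than the shorter one, pulling per-feature skewness
    # toward zero. The actual BNSR mapping is defined in the paper and may differ.
    x_sr = np.sign(x_hat) * np.abs(x_hat) ** power

    # Step 3: re-standardize so the output keeps zero mean / unit variance,
    # then apply the usual learnable scale and shift, as in standard BN.
    x_sr = (x_sr - x_sr.mean(axis=0, keepdims=True)) / (x_sr.std(axis=0, keepdims=True) + eps)
    return gamma * x_sr + beta


def sample_skewness(x, axis=0):
    # Third standardized moment per feature, used only to verify the effect.
    xc = x - x.mean(axis=axis, keepdims=True)
    return (xc ** 3).mean(axis=axis) / (xc.var(axis=axis) ** 1.5 + 1e-12)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.exponential(scale=2.0, size=(4096, 8))  # strongly right-skewed inputs
    out = bn_with_skewness_reduction(feats)
    print("skewness before:", np.round(sample_skewness(feats), 2))
    print("skewness after :", np.round(sample_skewness(out), 2))

On a strongly right-skewed batch such as the exponential inputs above, the per-feature sample skewness shrinks noticeably after the transform, while the mean and variance remain normalized, which is the qualitative behavior the abstract describes.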
Pages: 7165-7172
Page count: 8
Related Papers
50 records in total (10 listed below)
  • [1] Generalized Batch Normalization: Towards Accelerating Deep Neural Networks
    Yuan, Xiaoyong
    Feng, Zheng
    Norton, Matthew
    Li, Xiaolin
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 1682 - 1689
  • [2] On Centralization and Unitization of Batch Normalization for Deep ReLU Neural Networks
    Fei, Wen
    Dai, Wenrui
    Li, Chenglin
    Zou, Junni
    Xiong, Hongkai
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 2827 - 2841
  • [3] Deep Neural Networks with Batch Speaker Normalization for Intoxicated Speech Detection
    Wang, Weiqing
    Wu, Haiwei
    Li, Ming
    2019 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2019, : 1323 - 1327
  • [4] Transferable Normalization: Towards Improving Transferability of Deep Neural Networks
    Wang, Ximei
    Jin, Ying
    Long, Mingsheng
    Wang, Jianmin
    Jordan, Michael I.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [5] Regularizing deep neural networks for medical image analysis with augmented batch normalization
    Zhu, Shengqian
    Yu, Chengrong
    Hu, Junjie
    APPLIED SOFT COMPUTING, 2024, 154
  • [6] Batch Normalization and Dropout Regularization in Training Deep Neural Networks with Label Noise
    Rusiecki, Andrzej
    INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, ISDA 2021, 2022, 418 : 57 - 66
  • [7] Riemannian batch normalization for SPD neural networks
    Brooks, Daniel
    Schwander, Olivier
    Barbaresco, Frederic
    Schneider, Jean-Yves
    Cord, Matthieu
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [8] L1-Norm Batch Normalization for Efficient Training of Deep Neural Networks
    Wu, Shuang
    Li, Guoqi
    Deng, Lei
    Liu, Liu
    Wu, Dong
    Xie, Yuan
    Shi, Luping
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (07) : 2043 - 2051
  • [9] Batch-Normalization-based Soft Filter Pruning for Deep Convolutional Neural Networks
    Xu, Xiaozhou
    Chen, Qiming
    Xie, Lei
    Su, Hongye
    16TH IEEE INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2020), 2020, : 951 - 956
  • [10] NORMALIZATION EFFECTS ON DEEP NEURAL NETWORKS
    Yu, Jiahui
    Spiliopoulos, Konstantinos
    FOUNDATIONS OF DATA SCIENCE, 2023, 5 (03): : 389 - 465