NOISY TRAINING FOR DEEP NEURAL NETWORKS

Cited: 0
Authors
Meng, Xiangtao [1 ,2 ]
Liu, Chao [1 ]
Zhang, Zhiyong [1 ]
Wang, Dong [1 ]
Affiliations
[1] Tsinghua Univ, RIIT, CSLT, Beijing, Peoples R China
[2] CUPT, Sch Comput Sci & Technol, Warsaw, Poland
Keywords
deep neural network; noise injection; robust speech recognition
DOI
Not available
CLC Number
TP [Automation & Computer Technology]
Discipline Code
0812
Abstract
Deep neural networks (DNNs) have achieved remarkable success in speech recognition, partly owing to their flexibility in learning complex patterns of speech signals. This flexibility, however, may lead to serious over-fitting and hence severe performance degradation in adverse environments such as those with high ambient noise. We propose a noisy training approach to tackle this problem: by injecting noise into the training speech intentionally and randomly, more generalizable DNN models can be learned. This 'noise injection' technique has long been known to the neural computation community; however, little is known about whether it works for DNN models, which involve a highly complex objective function. The experiments presented in this paper confirm that the original assumptions of the noise injection approach largely hold when learning deep structures, and that noisy training can provide substantial performance improvements for DNN-based speech recognition.
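The core operation the abstract describes, corrupting each training utterance with randomly selected noise at a random signal-to-noise ratio (SNR) before the DNN update, can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the function name `inject_noise`, the NumPy waveform representation, and the uniform SNR range are all assumptions.

```python
import numpy as np

def inject_noise(speech, noise, snr_db, rng=None):
    """Mix a random segment of `noise` into `speech` at the given SNR (dB).

    Assumes `noise` is at least as long as `speech`; both are 1-D float arrays.
    """
    rng = rng or np.random.default_rng()
    # Pick a random noise segment the same length as the utterance.
    start = rng.integers(0, len(noise) - len(speech) + 1)
    noise_seg = noise[start:start + len(speech)]
    # Scale the noise so the speech/noise power ratio equals snr_db:
    # SNR_dB = 10 * log10(P_speech / (scale^2 * P_noise)).
    p_speech = np.mean(speech ** 2)
    p_noise = np.mean(noise_seg ** 2)
    scale = np.sqrt(p_speech / (p_noise * 10 ** (snr_db / 10)))
    return speech + scale * noise_seg

# Noisy training, in outline: each epoch, corrupt every utterance with a
# randomly chosen noise type and a random SNR, then train the DNN on the
# corrupted features as if they were clean.
def noisy_training_batch(utterances, noise_bank, snr_range=(0.0, 20.0), rng=None):
    rng = rng or np.random.default_rng()
    batch = []
    for utt in utterances:
        noise = noise_bank[rng.integers(0, len(noise_bank))]
        snr = rng.uniform(*snr_range)  # random SNR draw per utterance
        batch.append(inject_noise(utt, noise, snr, rng=rng))
    return batch
```

The random SNR draw is what distinguishes noisy training from simple multi-condition training with a fixed noise level: each utterance sees a different corruption, which acts as a regularizer on the DNN's objective.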
Pages: 16 - 20
Page count: 5
Related Papers (50 in total)
  • [1] Noisy training for deep neural networks in speech recognition
    Yin, Shi
    Liu, Chao
    Zhang, Zhiyong
    Lin, Yiye
    Wang, Dong
    Tejedor, Javier
    Zheng, Thomas Fang
    Li, Yinguo
    [J]. EURASIP JOURNAL ON AUDIO SPEECH AND MUSIC PROCESSING, 2015, : 1 - 14
  • [2] Training Neural Networks on Noisy Data
    Rusiecki, Andrzej
    Kordos, Miroslaw
    Kaminski, Tomasz
    Gren, Krzysztof
    [J]. ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING ICAISC 2014, PT I, 2014, 8467 : 131 - 142
  • [3] Training Deep Neural Networks for Image Applications with Noisy Labels by Complementary Learning
    Zhou, Y.
    Liu, Y.
    Wang, R.
    [J]. 2017, Science Press (54): 2649 - 2659
  • [4] Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels
    Zhang, Zhilu
    Sabuncu, Mert R.
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Robust Training of Deep Neural Networks with Noisy Labels by Graph Label Propagation
    Nomura, Yuichiro
    Kurita, Takio
    [J]. FRONTIERS OF COMPUTER VISION, IW-FCV 2021, 2021, 1405 : 281 - 293
  • [6] Enabling Training of Neural Networks on Noisy Hardware
    Gokmen, Tayfun
    [J]. FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2021, 4
  • [7] Mutual calibration training: Training deep neural networks with noisy labels using dual-models
    Liu, Rui
    Liu, Yi
    Wang, Rui
    Zhou, Yucong
    [J]. COMPUTER VISION AND IMAGE UNDERSTANDING, 2021, 212
  • [8] DCBT-Net: Training Deep Convolutional Neural Networks With Extremely Noisy Labels
    Olimov, Bekhzod
    Kim, Jeonghong
    Paul, Anand
    [J]. IEEE ACCESS, 2020, 8 : 220482 - 220495
  • [9] Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels
    Han, Bo
    Yao, Quanming
    Yu, Xingrui
    Niu, Gang
    Xu, Miao
    Hu, Weihua
    Tsang, Ivor W.
    Sugiyama, Masashi
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31