Deep Neural Networks with Cascaded Output Layers

Cited by: 0
Authors
[1] Cui, Hua
[2] Bai, Jie
[3] Bi, Xin
[4] Huang, Libo
Source
Bai, Jie (baijie@tongji.edu.cn) | Science Press / Vol. 45
Keywords
AdaBoost algorithm; design method; generalization performance; network structures; novel design; output layer; sequential classifier
DOI
10.11908/j.issn.0253-374x.2017.s1.004
Abstract
This paper presents a novel design method for deep neural networks aimed at improving their generalization performance. First, common methods for improving generalization are reviewed. Then, a deep neural network with cascaded output layers is described: the network has several output layers that constitute sequential classifiers and, combined with the AdaBoost algorithm, these classifiers boost generalization performance. At the same time, the proposed design reduces computational cost by sharing part of the network structure among the classifiers. Finally, experiments confirm the effectiveness and reliability of the method. © 2017, Editorial Department of Journal of Tongji University. All rights reserved.
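The abstract describes the architecture only at a high level. As a rough illustration (not the authors' implementation), the NumPy sketch below shows a shared trunk with one output head tapping each hidden layer, combined by AdaBoost-style weighted voting; all layer sizes, weights, and the `alphas` are hypothetical placeholders, not trained values.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

class CascadedOutputNet:
    """Trunk layers are shared; one output head taps each hidden layer,
    so the heads act as the sequential classifiers that an AdaBoost-style
    combination can weight (weights here are random placeholders)."""

    def __init__(self, in_dim, hidden_dims, n_classes):
        dims = [in_dim] + list(hidden_dims)
        # shared trunk: one weight matrix per hidden layer
        self.trunk = [rng.standard_normal((dims[k + 1], dims[k])) * 0.1
                      for k in range(len(hidden_dims))]
        # one output layer per hidden layer -- the cascaded output layers
        self.heads = [rng.standard_normal((n_classes, h)) * 0.1
                      for h in hidden_dims]

    def head_logits(self, x):
        """Forward pass; head k reuses all trunk computation below depth k."""
        logits, a = [], x
        for W, H in zip(self.trunk, self.heads):
            a = relu(W @ a)       # shared feature, computed once
            logits.append(H @ a)  # head k classifies from depth-k features
        return logits

def adaboost_combine(logits, alphas):
    """AdaBoost-style ensemble of the heads: F(x) = sum_k alpha_k * f_k(x)."""
    return sum(a * l for a, l in zip(alphas, logits))
```

Because each later head reuses the activations already computed for the earlier ones, evaluating all heads costs only a single trunk pass; this sharing is the computational saving the abstract refers to.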
Related papers (50 total)
  • [1] SPEAKER ADAPTATION OF DEEP NEURAL NETWORKS USING A HIERARCHY OF OUTPUT LAYERS
    Price, Ryan
    Iso, Ken-ichi
    Shinoda, Koichi
    [J]. 2014 IEEE WORKSHOP ON SPOKEN LANGUAGE TECHNOLOGY SLT 2014, 2014, : 153 - 158
  • [2] Restructuring Output Layers of Deep Neural Networks using Minimum Risk Parameter Clustering
    Kubo, Yotaro
    Suzuki, Jun
    Hori, Takaaki
    Nakamura, Atsushi
    [J]. 15TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2014), VOLS 1-4, 2014, : 1068 - 1072
  • [3] A Study on Layers of Deep Neural Networks
    Lim, Hyun-il
    [J]. 2020 THE 3RD INTERNATIONAL CONFERENCE ON INTELLIGENT AUTONOMOUS SYSTEMS (ICOIAS'2020), 2020, : 31 - 34
  • [4] Deep Residual Output Layers for Neural Language Generation
    Pappas, Nikolaos
    Henderson, James
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [5] QDNN: deep neural networks with quantum layers
    Chen Zhao
    Xiao-Shan Gao
    [J]. Quantum Machine Intelligence, 2021, 3
  • [6] QDNN: deep neural networks with quantum layers
    Zhao, Chen
    Gao, Xiao-Shan
    [J]. QUANTUM MACHINE INTELLIGENCE, 2021, 3 (01)
  • [7] LayerOut: Freezing Layers in Deep Neural Networks
    Goutam K.
    Balasubramanian S.
    Gera D.
    Sarma R.R.
    [J]. SN Computer Science, 2020, 1 (5)
  • [8] Output Range Analysis for Deep Feedforward Neural Networks
    Dutta, Souradeep
    Jha, Susmit
    Sankaranarayanan, Sriram
    Tiwari, Ashish
    [J]. NASA FORMAL METHODS, NFM 2018, 2018, 10811 : 121 - 138
  • [9] Deep neural networks regularization for structured output prediction
    Belharbi, Soufiane
    Herault, Romain
    Chatelain, Clement
    Adam, Sebastien
    [J]. NEUROCOMPUTING, 2018, 281 : 169 - 177
  • [10] Understanding the Distributions of Aggregation Layers in Deep Neural Networks
    Ong, Eng-Jon
    Husain, Sameed
    Bober, Miroslaw
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 5536 - 5550