Deep neural networks with a set of node-wise varying activation functions

Cited by: 5
Authors
Jang, Jinhyeok [1]
Cho, Hyunjoong [2]
Kim, Jaehong [1]
Lee, Jaeyeon [1]
Yang, Seungjoon [2]
Affiliations
[1] Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
[2] Ulsan National Institute of Science and Technology (UNIST), School of Electrical and Computer Engineering, Ulsan, South Korea
Keywords
Deep network; Principal component analysis; Pruning; Varying activation;
DOI
10.1016/j.neunet.2020.03.004
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this study, we present deep neural networks with a set of node-wise varying activation functions. The feature-learning abilities of the nodes are affected by the selected activation functions, where nodes with smaller indices become increasingly more sensitive during training. As a result, the features learned by the nodes are sorted by the node indices in order of their importance, such that more sensitive nodes correspond to more important features. The proposed networks learn not only the input features but also the importance of those features. Nodes with lower importance can be pruned to reduce the complexity of the networks, and the pruned networks can be retrained without incurring performance losses. We validated the feature-sorting property of the proposed method using both shallow and deep networks, as well as deep networks transferred from existing networks. (c) 2020 Elsevier Ltd. All rights reserved.
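The mechanism the abstract describes, activation functions whose shape varies with the node index so that learned features come out ordered by importance, can be sketched in a few lines. Below is a minimal PyTorch illustration, assuming a scaled-tanh family f_i(x) = tanh(g_i * x) with per-node gains g_i that decrease with the node index i; the class name, the linear gain schedule, and the layer sizes are illustrative assumptions, not the paper's exact parameterization.

```python
import torch
import torch.nn as nn

class NodeWiseVaryingActivation(nn.Module):
    """One activation function per node, varied by the node index.

    Sketch only: assumes f_i(x) = tanh(g_i * x) with gains g_i that
    decrease with the node index, so nodes with smaller indices are
    more sensitive to their pre-activations. The linear gain schedule
    below is an illustrative assumption.
    """

    def __init__(self, num_nodes: int, g_max: float = 2.0, g_min: float = 0.5):
        super().__init__()
        # Fixed per-node gains, sorted so node 0 gets the largest gain.
        self.register_buffer("gains", torch.linspace(g_max, g_min, num_nodes))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_nodes); the per-node gains broadcast over the batch.
        return torch.tanh(self.gains * x)

# Hypothetical usage: a hidden layer whose learned features tend to be
# ordered by importance, making the trailing nodes pruning candidates.
layer = nn.Sequential(nn.Linear(784, 256), NodeWiseVaryingActivation(256))
out = layer(torch.randn(32, 784))  # shape: (32, 256)
```

After training, dropping the trailing (least-sensitive) nodes and retraining corresponds to the pruning-without-performance-loss step the abstract reports.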
Pages: 118-131
Page count: 14
Related papers
50 records in total
  • [21] Smooth Function Approximation by Deep Neural Networks with General Activation Functions
    Ohn, Ilsang
    Kim, Yongdai
    ENTROPY, 2019, 21 (07)
  • [22] Robust Stability of Neural Networks with Discontinuous Activation Functions and Time-Varying Delays
    Liu, Xiaoyang
    Cao, Jinde
    ASCC: 2009 7TH ASIAN CONTROL CONFERENCE, VOLS 1-3, 2009: 1233-1238
  • [23] Anti-Fragmentation of Resting-State Functional Magnetic Resonance Imaging Connectivity Networks with Node-Wise Thresholding
    Hayasaka, Satoru
    BRAIN CONNECTIVITY, 2017, 7 (08): 504-514
  • [24] Improving the Performance of Deep Neural Networks Using Two Proposed Activation Functions
    Alkhouly, Asmaa A.
    Mohammed, Ammar
    Hefny, Hesham A.
    IEEE ACCESS, 2021, 9: 82249-82271
  • [25] Adaptive Activation Functions for Skin Lesion Classification Using Deep Neural Networks
    Namozov, Abdulaziz
    Ergashev, Dilshod
    Cho, Young Im
    2018 JOINT 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 19TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2018: 232-235
  • [26] Comparison and Evaluation of Activation Functions in Term of Gradient Instability in Deep Neural Networks
    Liu, Xin
    Zhou, Jun
    Qin, Huimin
    PROCEEDINGS OF THE 2019 31ST CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2019), 2019: 3966-3971
  • [27] Rethinking the Role of Activation Functions in Deep Convolutional Neural Networks for Image Classification
    Zheng, Qinghe
    Yang, Mingqiang
    Tian, Xinyu
    Wang, Xiaochen
    Wang, Deqiang
    ENGINEERING LETTERS, 2020, 28 (01): 80-92
  • [28] Non-uniform Piecewise Linear Activation Functions in Deep Neural Networks
    Zhu, Zezhou
    Dong, Yuan
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022: 2107-2113
  • [29] Improving the Accuracy of Deep Neural Networks Through Developing New Activation Functions
    Mercioni, Marina Adriana
    Tat, Angel Marcel
    Holban, Stefan
    2020 IEEE 16TH INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTER COMMUNICATION AND PROCESSING (ICCP 2020), 2020: 385-391
  • [30] The Limits of SEMA on Distinguishing Similar Activation Functions of Embedded Deep Neural Networks
    Takatoi, Go
    Sugawara, Takeshi
    Sakiyama, Kazuo
    Hara-Azumi, Yuko
    Li, Yang
    APPLIED SCIENCES-BASEL, 2022, 12 (09)