Deep neural networks with a set of node-wise varying activation functions

Times Cited: 5
Authors
Jang, Jinhyeok [1]
Cho, Hyunjoong [2]
Kim, Jaehong [1]
Lee, Jaeyeon [1]
Yang, Seungjoon [2]
Affiliations
[1] Electronics and Telecommunications Research Institute (ETRI), Daejeon, South Korea
[2] Ulsan National Institute of Science and Technology (UNIST), School of Electrical and Computer Engineering, Ulsan, South Korea
Keywords
Deep network; Principal component analysis; Pruning; Varying activation
DOI
10.1016/j.neunet.2020.03.004
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In this study, we present deep neural networks with a set of node-wise varying activation functions. The feature-learning ability of each node depends on its assigned activation function, with nodes of smaller indices becoming increasingly more sensitive during training. As a result, the features learned by the nodes are sorted by node index in order of importance, so that the more sensitive nodes correspond to the more important features. The proposed networks thus learn not only the input features but also the importance of those features. Nodes of lower importance can be pruned to reduce the complexity of the networks, and the pruned networks can be retrained without incurring performance losses. We validated the feature-sorting property of the proposed method using both shallow and deep networks, as well as deep networks transferred from existing networks. (c) 2020 Elsevier Ltd. All rights reserved.
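The abstract describes the mechanism only at a high level. The sketch below is a rough Python (PyTorch) illustration of the idea, not the authors' actual formulation: it assigns each node in a layer its own activation slope that decreases with the node index, so lower-index nodes respond more strongly. The linear slope schedule and the class name NodeWiseVaryingActivation are assumptions made here for illustration.

import torch
import torch.nn as nn

class NodeWiseVaryingActivation(nn.Module):
    # Hypothetical sketch: node i gets its own slope s_i, decreasing with i,
    # so low-index nodes are more sensitive and, per the abstract, end up
    # carrying the more important features.
    def __init__(self, width):
        super().__init__()
        # Assumed schedule (not from the paper): slopes fall linearly
        # from 1.0 down to 1/width across the node indices.
        self.register_buffer("slopes", torch.linspace(1.0, 1.0 / width, width))

    def forward(self, x):
        # x: (batch, width); scale each node's pre-activation before a
        # shared ReLU, making low-index nodes respond more strongly.
        return torch.relu(x * self.slopes)

# Usage: the trailing (least-sensitive) nodes contribute least, so they are
# the natural candidates for the pruning-and-retraining step the abstract describes.
layer = nn.Sequential(nn.Linear(32, 16), NodeWiseVaryingActivation(16))
out = layer(torch.randn(8, 32))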
Pages: 118-131
Page count: 14
Related Papers
50 records in total
  • [41] Multistability of Recurrent Neural Networks With Nonmonotonic Activation Functions and Unbounded Time-Varying Delays
    Liu, Peng
    Zeng, Zhigang
    Wang, Jun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2018, 29 (07) : 3000 - 3010
  • [42] Simple activation functions for neural and fuzzy neural networks
    Mendil, Boubekeur
    Benmahammed, K.
    ISCAS '99: PROCEEDINGS OF THE 1999 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL 5: SYSTEMS, POWER ELECTRONICS, AND NEURAL NETWORKS, 1999, : 347 - 350
  • [44] Learning continuous piecewise non-linear activation functions for deep neural networks
    Gao, Xinchen
    Li, Yawei
    Li, Wen
    Duan, Lixin
    Van Gool, Luc
    Benini, Luca
    Magno, Michele
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1835 - 1840
  • [45] Genetic Deep Neural Networks Using Different Activation Functions for Financial Data Mining
    Zhang, Luna M.
    PROCEEDINGS 2015 IEEE INTERNATIONAL CONFERENCE ON BIG DATA, 2015, : 2849 - 2851
  • [46] An Efficient Hardware Implementation of Activation Functions Using Stochastic Computing for Deep Neural Networks
    Nguyen, Van-Tinh
    Luong, Tieu-Khanh
    Le Duc, Han
    Hoang, Van-Phuc
    2018 IEEE 12TH INTERNATIONAL SYMPOSIUM ON EMBEDDED MULTICORE/MANY-CORE SYSTEMS-ON-CHIP (MCSOC 2018), 2018, : 233 - 236
  • [47] Uniform Convergence of Deep Neural Networks With Lipschitz Continuous Activation Functions and Variable Widths
    Xu, Yuesheng
    Zhang, Haizhang
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (10) : 7125 - 7142
  • [48] Adaptive activation functions accelerate convergence in deep and physics-informed neural networks
    Jagtap, Ameya D.
    Kawaguchi, Kenji
    Karniadakis, George Em
    JOURNAL OF COMPUTATIONAL PHYSICS, 2020, 404
  • [49] Fast Approximations of Activation Functions in Deep Neural Networks when using Posit Arithmetic
    Cococcioni, Marco
    Rossi, Federico
    Ruffaldi, Emanuele
    Saponara, Sergio
    SENSORS, 2020, 20 (05)
  • [50] Approximation by neural networks with weights varying on a finite set of directions
    Ismailov, Vugar E.
    JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 2012, 389 (01) : 72 - 83