On the performance of pairings of activation and loss functions in neural networks

Cited by: 0
Authors
Soares, Rodrigo G. F. [1 ]
Pereira, Enieson J. S. [1 ]
Affiliation
[1] Univ Fed Rural Pernambuco, Dept Stat & Informat, Recife, PE, Brazil
Keywords
ALGORITHM; CLASSIFICATION
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline classification codes: 081104; 0812; 0835; 1405
Abstract
The selection of parameters is one of the most important tasks in the training of a neural network. The choice of activation and loss functions is particularly relevant, as the formulation of the training procedure depends strongly on how these functions are paired. However, the few existing works on the effect of different combinations of these functions investigate only a handful of pairings and do not present a comprehensive experimental study on classification. This paper provides an extensive empirical analysis of the selection of such pairings. Our work presents the formulations of Iterative Reweighted Least Squares for nine pairings of the most common activation and loss functions. We investigated the impact of these formulations, including natural pairings, on both binary and multi-class classification, using artificial and real-world datasets. Our results show that, for multi-class classification, one should select an activation and a loss function that form a natural pairing in order to obtain an effective weight update rule. For binary classification, although different pairings produced statistically different mean average precisions, natural pairings were not a significant factor in the generalisation performance of a neural network.
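To make the notion of a natural pairing concrete, the sketch below works through the standard case of the logistic sigmoid activation paired with the binary cross-entropy loss. This is a textbook derivation offered only as illustration; the notation is ours and is not taken from the paper itself.

```latex
% Natural pairing: logistic sigmoid + binary cross-entropy
% (standard derivation shown for illustration; notation assumed, not the paper's).
\[
\hat{y} = \sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
L(y, \hat{y}) = -\, y \log \hat{y} - (1 - y) \log (1 - \hat{y}).
\]
% Chain rule, using \sigma'(z) = \hat{y}(1 - \hat{y}):
\[
\frac{\partial L}{\partial z}
  = \left( -\frac{y}{\hat{y}} + \frac{1 - y}{1 - \hat{y}} \right) \hat{y} (1 - \hat{y})
  = \hat{y} - y.
\]
% The activation's derivative cancels against the loss gradient, so the
% update for a weight w_j with input x_j reduces to the residual rule
% \Delta w_j \propto (\hat{y} - y)\, x_j.
```

The softmax activation paired with the categorical cross-entropy loss simplifies to the same per-class residual form, which is the usual explanation for why natural pairings yield an effective update rule in the multi-class case; under a non-natural pairing such as sigmoid with squared error, the sigma'(z) factor does not cancel and vanishes for saturated units.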
Pages: 326-333
Page count: 8
Related papers (50 in total)
  • [1] Baskerville, Nicholas P.; Keating, Jonathan P.; Mezzadri, Francesco; Najnudel, Joseph. The loss surfaces of neural networks with general activation functions. JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2021, 2021(06).
  • [2] Nandi, Arijit; Jana, Nanda Dulal; Das, Swagatam. Improving the Performance of Neural Networks with an Ensemble of Activation Functions. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020.
  • [3] Herrera, Oscar; Priego, Belem. Wavelets as activation functions in Neural Networks. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42(05): 4345-4355.
  • [4] Mendil, B.; Benmahammed, K. Simple activation functions for neural and fuzzy neural networks. ISCAS '99: PROCEEDINGS OF THE 1999 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL 5: SYSTEMS, POWER ELECTRONICS, AND NEURAL NETWORKS, 1999: 347-350.
  • [5] Alkhouly, Asmaa A.; Mohammed, Ammar; Hefny, Hesham A. Improving the Performance of Deep Neural Networks Using Two Proposed Activation Functions. IEEE ACCESS, 2021, 9: 82249-82271.
  • [6] Rane, Chinmay; Tyagi, Kanishka; Kline, Adrienne; Chugh, Tushar; Manry, Michael. Optimizing performance of feedforward and convolutional neural networks through dynamic activation functions. EVOLUTIONARY INTELLIGENCE, 2024, 17(5-6): 4083-4093.
  • [7] Zhao, Hang; Gallo, Orazio; Frosio, Iuri; Kautz, Jan. Loss Functions for Image Restoration With Neural Networks. IEEE TRANSACTIONS ON COMPUTATIONAL IMAGING, 2017, 3(01): 47-57.
  • [8] Maronese, Marco; Destri, Claudio; Prati, Enrico. Quantum activation functions for quantum neural networks. QUANTUM INFORMATION PROCESSING, 21.
  • [9] Bircanoglu, Cenk; Arica, Nafiz. A Comparison of Activation Functions in Artificial Neural Networks. 2018 26TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2018.
  • [10] Tatar, Nasser-Eddine. Hölder continuous activation functions in neural networks. ADVANCES IN DIFFERENTIAL EQUATIONS AND CONTROL PROCESSES, 2015, 15(02): 93-106.