Periodic Function as Activation Function for Neural Networks

Cited by: 0
Authors
Xu, Ding [1 ]
Guan, Yue [1 ]
Cai, Ping-ping [1 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Software Engn, Wuhan 430074, Peoples R China
Keywords
Machine learning; Neural network; Convolutional neural network; Activation function
DOI
Not available
CLC Number
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we explore periodic functions as an alternative activation function for neural networks. Previously, the sigmoid function served as the standard activation function for neurons, and the rectified linear unit is now widely used; the maxout function can even be learned as a general convex activation function. We explore the possibility in the other direction, using a periodic function as the activation function. We expect that a network with fewer layers and fewer neurons can capture the target distribution. The experiments verify our expectation and show that a periodic function can act as an alternative activation function.
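This record does not include code, so the following is a minimal sketch, assuming PyTorch, of the idea the abstract describes: an element-wise periodic non-linearity (here sin, as an illustrative choice; the specific periodic function, the layer sizes, and the names SineActivation and PeriodicMLP are assumptions, not the authors' implementation) used in place of sigmoid or ReLU in a small fully connected network.

    import torch
    import torch.nn as nn

    class SineActivation(nn.Module):
        # Element-wise periodic activation: x -> sin(x).
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return torch.sin(x)

    class PeriodicMLP(nn.Module):
        # Small fully connected network that swaps ReLU/sigmoid for the
        # periodic unit above; the sizes are illustrative, not from the paper.
        def __init__(self, in_dim: int = 2, hidden: int = 16, out_dim: int = 1):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(in_dim, hidden),
                SineActivation(),  # periodic non-linearity instead of ReLU
                nn.Linear(hidden, out_dim),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.net(x)

    if __name__ == "__main__":
        model = PeriodicMLP()
        x = torch.randn(4, 2)
        print(model(x).shape)  # torch.Size([4, 1])

A compact network of this form, with a bounded and oscillatory unit, is the kind of model the abstract expects to capture the target distribution with fewer layers and neurons.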
Pages: 179-183
Number of pages: 5
Related Papers
50 records in total
  • [21] An adaptive activation function for higher order neural networks
    Xu, SX
    Zhang, M
    [J]. AI 2002: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2002, 2557 : 356 - 362
  • [22] A Quantum Activation Function for Neural Networks: Proposal and Implementation
    Kumar, Saurabh
    Dangwal, Siddharth
    Adhikary, Soumik
    Bhowmik, Debanjan
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [23] Square Root Based Activation Function in Neural Networks
    Yang, Xiaoyu
    Chen, Yufei
    Liang, Haiquan
    [J]. 2018 INTERNATIONAL CONFERENCE ON AUDIO, LANGUAGE AND IMAGE PROCESSING (ICALIP), 2018, : 84 - 89
  • [24] RSigELU: A nonlinear activation function for deep neural networks
    Kilicarslan, Serhat
    Celik, Mete
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2021, 174
  • [25] On the Impact of the Activation Function on Deep Neural Networks Training
    Hayou, Soufiane
    Doucet, Arnaud
    Rousseau, Judith
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [26] Algorithm Research on Improving Activation Function of Convolutional Neural Networks
    Guo, Yanhua
    Sun, Lei
    Zhang, Zhihong
    He, Hong
    [J]. PROCEEDINGS OF THE 2019 31ST CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2019), 2019, : 3582 - 3586
  • [27] Sound synthesis by flexible activation function recurrent neural networks
    Uncini, A
    [J]. NEURAL NETS, 2002, 2486 : 168 - 177
  • [28] An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks
    Chai, Enhui
    Yu, Wei
    Cui, Tianxiang
    Ren, Jianfeng
    Ding, Shusheng
    [J]. SYMMETRY-BASEL, 2022, 14 (05):
  • [29] ReAFM: A Reconfigurable Nonlinear Activation Function Module for Neural Networks
    Wu, Xiao
    Liang, Shuang
    Wang, Meiqi
    Wang, Zhongfeng
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2023, 70 (07) : 2660 - 2664
  • [30] On the Universally Optimal Activation Function for a Class of Residual Neural Networks
    Zhao, Feng
    Huang, Shao-Lun
    [J]. APPLIEDMATH, 2022, 2 (04) : 574 - 584