50 items in total
- [2] KAF + RSigELU: a nonlinear and kernel-based activation function for deep neural networks [J]. Neural Computing and Applications, 2022, 34(16): 13909-13923
- [3] TeLU: A New Activation Function for Deep Learning [J]. 2020 14th International Symposium on Electronics and Telecommunications (ISETC), 2020: 32-35
- [6] Beyond weights adaptation: A new neuron model with trainable activation function and its supervised learning [J]. 1997 IEEE International Conference on Neural Networks, Vols 1-4, 1997: 1152-1157
- [7] An Evaluation of Parametric Activation Functions for Deep Learning [J]. 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 2019: 3006-3011
- [9] A neuron model with trainable activation function (TAF) and its MFNN supervised learning [J]. Science in China Series: Information Sciences, 2001, 44(5): 366-375
- [10] A Universal Activation Function for Deep Learning [J]. CMC-Computers Materials & Continua, 2023, 75(2): 3553-3569