Formal Neuron Based on Adaptive Parametric Rectified Linear Activation Function and its Learning

Cited by: 0
Authors
Bodyanskiy, Yevgeniy [1 ]
Deineko, Anastasiia [2 ]
Pliss, Iryna [1 ]
Slepanska, Valeriia [2 ]
Affiliations
[1] Kharkiv Natl Univ Radio Elect, Control Syst Res Lab, Kharkiv, Ukraine
[2] Kharkiv Natl Univ Radio Elect, Dept Artificial Intelligence, Kharkiv, Ukraine
Keywords
deep neural network; adaptive activation function; delta-rule; synaptic weights; rectified linear unit; learning algorithm
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The paper proposes an adaptive activation function (AdPReLU) for deep neural networks that generalizes the rectified-unit family and differs from it in that its parameters can be tuned online during network learning. A learning algorithm for a formal neuron with this adaptive activation function is developed; it generalizes the delta-rule and, based on error back-propagation, tunes the parameters of the activation function simultaneously with the synaptic weights. The proposed tuning algorithm is optimized for increased operating speed. Computational experiments confirm the effectiveness of the proposed approach.
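The abstract does not give the authors' exact formulas, but the idea can be sketched. Assume an AdPReLU of the form f(u) = αu for u ≥ 0 and f(u) = βu for u < 0, and a single formal neuron trained by a delta-rule-style gradient step in which the slopes α, β are updated simultaneously with the synaptic weights. The class name, slope names, and learning rates below are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

class AdPReLUNeuron:
    """Single formal neuron with an adaptive parametric ReLU (sketch).

    Assumed activation: f(u) = alpha*u if u >= 0 else beta*u,
    where alpha and beta are trained together with the weights.
    """

    def __init__(self, n_inputs, alpha=1.0, beta=0.25, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.1, size=n_inputs)  # synaptic weights
        self.alpha = alpha  # slope on the positive branch
        self.beta = beta    # slope on the negative branch
        self.lr = lr        # shared learning rate (illustrative choice)

    def forward(self, x):
        u = self.w @ x  # pre-activation (weighted sum of inputs)
        y = self.alpha * u if u >= 0 else self.beta * u
        return u, y

    def update(self, x, target):
        """One delta-rule step: weights and slopes tune simultaneously."""
        u, y = self.forward(x)
        e = target - y                               # output error
        slope = self.alpha if u >= 0 else self.beta  # f'(u)
        # Gradient descent on E = e^2/2:
        # dE/dw = -e * f'(u) * x, so w moves along +e * f'(u) * x.
        self.w += self.lr * e * slope * x
        # dE/dalpha = -e*u on the positive branch; dE/dbeta likewise.
        if u >= 0:
            self.alpha += self.lr * e * u
        else:
            self.beta += self.lr * e * u
        return e

# Toy usage: learn a rectified linear mapping max(0, 0.5*x0 - 0.3*x1).
neuron = AdPReLUNeuron(n_inputs=2)
rng = np.random.default_rng(1)
for _ in range(2000):
    x = rng.normal(size=2)
    t = max(0.0, 0.5 * x[0] - 0.3 * x[1])
    neuron.update(x, t)
```

Because the target has zero slope on its negative side, online training should drive β toward zero while α and the weights fit the positive side, which is the kind of in-training slope adaptation the paper's AdPReLU is built for.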
Pages: 14-22 (9 pages)
Related Papers (50 total)
  • [1] Fractional Rectified Linear Unit Activation Function and Its Variants
    Job, Megha S.
    Bhateja, Priyanka H.
    Gupta, Muskan
    Bingi, Kishore
    Prusty, B. Rajanarayan
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [2] PARAMETRIC FLATTEN-T SWISH: AN ADAPTIVE NON-LINEAR ACTIVATION FUNCTION FOR DEEP LEARNING
    Chieng, Hock Hung
    Wahid, Noorhaniza
    Ong, Pauline
    [J]. JOURNAL OF INFORMATION AND COMMUNICATION TECHNOLOGY-MALAYSIA, 2021, 20 (01): 21-39
  • [3] On-line Learning Adaptive Control Based on Linear Neuron
    Li, Chuanqing
    [J]. 2011 9TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION (WCICA 2011), 2011: 254-259
  • [4] Analysis of Function of Rectified Linear Unit Used in Deep learning
    Hara, Kazuyuki
    Saito, Daisuke
    Shouno, Hayaru
    [J]. 2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015
  • [5] Low-Power Hardware Implementation for Parametric Rectified Linear Unit Function
    Wu, Yu-Hsuan
    Lin, Wei-Hung
    Huang, Shih-Hsu
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TAIWAN), 2020
  • [6] Justification of a neuron-adaptive activation function
    Xu, SX
    Zhang, M
    [J]. IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL III, 2000: 465-470
  • [7] Deep Learning with S-Shaped Rectified Linear Activation Units
    Jin, Xiaojie
    Xu, Chunyan
    Feng, Jiashi
    Wei, Yunchao
    Xiong, Junjun
    Yan, Shuicheng
    [J]. THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016: 1737-1743
  • [8] A neuron model with trainable activation function (TAF) and its MFNN supervised learning
    Wu, Youshou
    Zhao, Mingsheng
    [J]. Science China (Information Sciences), 2001, (05): 366-375
  • [9] A neuron model with trainable activation function (TAF) and its MFNN supervised learning
    Youshou Wu
    Mingsheng Zhao
    [J]. Science in China Series: Information Sciences, 2001, 44 (5): 366-375
  • [10] DPReLU: Dynamic Parametric Rectified Linear Unit and Its Proper Weight Initialization Method
    Yang, Donghun
    Ngoc, Kien Mai
    Shin, Iksoo
    Hwang, Myunggwon
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2023, 16 (01)