Adaptive structure feed-forward neural networks using polynomial activation functions

Cited by: 4
Authors
Ma, L [1 ]
Khorasani, K [1 ]
Affiliations
[1] Concordia Univ, Dept Elect & Comp Engn, Montreal, PQ H3G 1M8, Canada
Keywords
adaptive structure nets; functional level adaptation; evolutionary nets; approximation methods; modeling and estimation
DOI
10.1117/12.380560
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In cascade-correlation (CC) and constructive one-hidden-layer networks, structural-level adaptation is achieved by incorporating new hidden units, all with identical activation functions, one at a time into the evolving network. Functional-level adaptation has received far less attention, since selecting the activation functions enlarges the search space considerably and requires a systematic, rigorous search algorithm. In this paper, we present a new strategy, applicable to both fixed-structure and constructive network training, that assigns activation functions of hierarchically increasing degrees of nonlinearity as the constructive learning of a one-hidden-layer feed-forward neural network (FNN) progresses. Specifically, the orthonormal Hermite polynomials are used as the activation functions of the hidden units; these functions have several properties that are beneficial to network training. Simulation results on several noisy regression problems show that our scheme produces FNNs that generalize considerably better than one-hidden-layer constructive FNNs with identical sigmoidal activation functions, particularly on more complicated problems.
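
The scheme the abstract sketches pairs the k-th constructively added hidden unit with a k-th-order orthonormal Hermite activation, so each new unit brings a higher degree of nonlinearity than its predecessors. Below is a minimal Python sketch of that idea, assuming the standard orthonormal Hermite functions psi_n(x) = (2^n n! sqrt(pi))^(-1/2) H_n(x) exp(-x^2/2); the function names and the forward-pass wiring are illustrative and not taken from the paper.

    # Illustrative sketch only; names and wiring are not from Ma & Khorasani.
    import numpy as np
    from math import factorial, sqrt, pi

    def hermite(n, x):
        # Physicists' Hermite polynomial H_n(x) via the three-term
        # recurrence H_{k+1}(x) = 2x H_k(x) - 2k H_{k-1}(x).
        if n == 0:
            return np.ones_like(x)
        h_prev, h = np.ones_like(x), 2.0 * x
        for k in range(1, n):
            h_prev, h = h, 2.0 * x * h - 2.0 * k * h_prev
        return h

    def psi(n, x):
        # Orthonormal Hermite activation: H_n(x) damped by a Gaussian and
        # scaled so the family is orthonormal over the real line.
        c = 1.0 / sqrt(2.0 ** n * factorial(n) * sqrt(pi))
        return c * hermite(n, x) * np.exp(-0.5 * x ** 2)

    def forward(X, W, v, b):
        # One-hidden-layer FNN whose k-th hidden unit applies the
        # k-th-order Hermite activation, so each constructively added
        # unit contributes a higher degree of nonlinearity.
        # X: (N, d) inputs; W: (d, K) input weights; v: (K,) output weights.
        Z = X @ W
        H = np.column_stack([psi(k, Z[:, k]) for k in range(Z.shape[1])])
        return H @ v + b

Because successive psi_n are mutually orthogonal on the real line, a newly added unit tends to capture structure the existing units cannot represent, which is consistent with the training benefits the abstract attributes to these activations.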
Pages: 120-129
Page count: 10
Related papers
50 records in total
  • [1] Feed-forward neural networks using Hermite polynomial activation functions
    Rigatos, Gerasimos G.
    Tzafestas, Spyros G.
    [J]. ADVANCES IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2006, 3955: 323-333
  • [2] Fractional activation functions in feed-forward artificial neural networks
    Ivanov, Alexander
    [J]. 2018 20TH INTERNATIONAL SYMPOSIUM ON ELECTRICAL APPARATUS AND TECHNOLOGIES (SIELA), 2018
  • [3] Feed-forward neural networks
    Bebis, George
    Georgiopoulos, Michael
    [J]. IEEE Potentials, 1994, 13 (04): 27-31
  • [4] Catalysis of neural activation functions: Adaptive feed-forward training for big data applications
    Sarkar, Sagnik
    Agrawal, Shaashwat
    Baker, Thar
    Maddikunta, Praveen Kumar Reddy
    Gadekallu, Thippa Reddy
    [J]. APPLIED INTELLIGENCE, 2022, 52 (12): 13364-13383
  • [5] Feed-forward neural networks for secondary structure prediction
    Barlow, T.W.
    [J]. Journal of Molecular Graphics, 1995, 13 (03)
  • [6] A novel activation function for multilayer feed-forward neural networks
    Njikam, Aboubakar Nasser Samatin
    Zhao, Huan
    [J]. APPLIED INTELLIGENCE, 2016, 45 (01): 75-82
  • [7] Evapotranspiration estimation using feed-forward neural networks
    Kisi, Ozgur
    [J]. NORDIC HYDROLOGY, 2006, 37 (03): 247-260
  • [8] Optimal identification using feed-forward neural networks
    Vergara, V.
    Sinne, S.
    Moraga, C.
    [J]. FROM NATURAL TO ARTIFICIAL NEURAL COMPUTATION, 1995, 930: 1052-1059