Self-Adaptive Layer: An Application of Function Approximation Theory to Enhance Convergence Efficiency in Neural Networks

Cited by: 0
Authors
Chan, Ka-Hou [1]
Im, Sio-Kei [2]
Ke, Wei [2]
Affiliations
[1] Macao Polytech Inst, Sch Appl Sci, Macau, Peoples R China
[2] Macao Polytech Inst, Macau, Peoples R China
Keywords
Function Approximation; Orthogonal Polynomial; Self-Adaptive; Neural Network
DOI
10.1109/icoin48656.2020.9016534
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Neural networks provide a general architecture for modeling complex nonlinear systems, but the source data are often contaminated with noise and irrelevant information. One common way to smooth over this issue during training is to increase the number of neurons or layers. In this paper, a new self-adaptive layer is developed to overcome these problems, achieving faster convergence and avoiding local minima. We incorporate function approximation theory into the arrangement of the layer's elements, so that the training process and the network's approximation properties can be analyzed via linear algebra, with the precision of adaptation controlled by the order of the polynomials used. Experimental results show that the proposed layer converges significantly faster and substantially improves training accuracy. Moreover, the design can be easily implemented and deployed in most current systems.
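The abstract describes a layer whose elements are arranged according to function approximation theory, with orthogonal polynomials whose order controls the precision of adaptation. The paper's exact construction is not given here, so the following is only a minimal illustrative sketch of the general idea: a layer that expands its (scaled) input into a Chebyshev orthogonal-polynomial basis and applies learnable coefficients, so that the mapping is linear in the basis and amenable to linear-algebraic analysis. All names and parameters below are hypothetical.

```python
import numpy as np

class PolynomialBasisLayer:
    """Illustrative sketch (not the paper's implementation): a layer that
    expands inputs into Chebyshev polynomials T_0..T_order and applies a
    learnable linear map over the basis coefficients."""

    def __init__(self, order, out_features, rng=None):
        rng = np.random.default_rng(0) if rng is None else rng
        self.order = order
        # One weight vector per basis function; linear in the coefficients.
        self.W = rng.normal(scale=0.1, size=(order + 1, out_features))

    def basis(self, x):
        # Chebyshev recurrence: T_0 = 1, T_1 = x, T_{n+1} = 2x T_n - T_{n-1}.
        T = [np.ones_like(x), x]
        for _ in range(2, self.order + 1):
            T.append(2 * x * T[-1] - T[-2])
        return np.stack(T[: self.order + 1], axis=-1)  # shape (..., order + 1)

    def forward(self, x):
        # x is assumed to be pre-scaled to [-1, 1], the interval on which
        # Chebyshev polynomials are orthogonal.
        return self.basis(x) @ self.W

# Raising `order` refines the approximation, mirroring the abstract's claim
# that polynomial order controls the precision of adaptation.
layer = PolynomialBasisLayer(order=3, out_features=2)
y = layer.forward(np.linspace(-1.0, 1.0, 5))
print(y.shape)  # (5, 2)
```

Because the output is linear in the basis coefficients, the fit over a fixed input batch reduces to a linear least-squares problem, which is the sense in which such a layer can be "investigated via linear algebra."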
Pages: 447-452
Page count: 6