Training Single Hidden Layer Feedforward Neural Networks by Singular Value Decomposition

Cited by: 0
Authors:
Hieu Trung Huynh [1]
Won, Yonggwan [1]
Affiliations:
[1] Chonnam Natl Univ, Dept Comp Engn, Kwangju 500757, South Korea
Keywords:
neural network; SLFNs; training algorithms; SVD; SVD-neural classifier; EXTREME LEARNING-MACHINE; NUMBER;
DOI:
none available
Chinese Library Classification:
TP18 [Artificial Intelligence Theory];
Discipline codes:
081104; 0812; 0835; 1405;
Abstract:
Training neural networks has attracted researchers for a long time, and many training algorithms and refinements have been proposed. Nevertheless, improving the performance of training algorithms for neural networks remains a challenge. In this paper, we investigate a new training method for single hidden layer feedforward neural networks (SLFNs) that use the 'tansig' activation function. The proposed training algorithm uses SVD (Singular Value Decomposition) to calculate the network parameters. It is simple and has low computational complexity. Experimental results show that the proposed approach can achieve good performance with a compact network that has a small number of hidden units.
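The abstract states only that SVD is used to compute the network parameters, without giving the procedure. The sketch below is therefore an assumption-laden illustration, not the paper's exact algorithm: it takes an ELM-style SLFN (random input weights, as the keyword "EXTREME LEARNING-MACHINE" suggests), forms the hidden-layer output matrix with the 'tansig' (hyperbolic tangent) activation, and solves for the output weights with an SVD-based pseudoinverse. The data, network size, and weight initialization are all placeholders.

```python
import numpy as np

# Hedged sketch (not the paper's exact method): ELM-style SLFN whose output
# weights are obtained via an SVD-based least-squares solve. Toy data and the
# random input weights are assumptions for illustration only.
rng = np.random.default_rng(0)

X = rng.standard_normal((100, 4))             # 100 samples, 4 features (toy data)
T = (X.sum(axis=1, keepdims=True) > 0) * 1.0  # toy binary targets

L = 10                                        # number of hidden units (assumed)
W = rng.standard_normal((4, L))               # input weights (random, ELM-style)
b = rng.standard_normal(L)                    # hidden biases

# 'tansig' is the hyperbolic tangent sigmoid, i.e. tanh.
H = np.tanh(X @ W + b)                        # hidden-layer output matrix (100 x L)

# Least-squares output weights beta solving H @ beta ~ T, via SVD:
U, s, Vt = np.linalg.svd(H, full_matrices=False)
tol = 1e-10 * s.max()
s_inv = np.zeros_like(s)
s_inv[s > tol] = 1.0 / s[s > tol]             # guard against tiny singular values
beta = Vt.T @ (s_inv[:, None] * (U.T @ T))    # beta = pinv(H) @ T

Y = H @ beta                                  # network outputs on the training set
```

Discarding singular values below a tolerance is the standard way to keep the pseudoinverse numerically stable when the hidden-layer matrix is ill-conditioned; it is the same behavior `np.linalg.pinv` provides through its `rcond` parameter.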
Pages: 1300 - 1304 (5 pages)
Related papers (50 items):
  • [41] Hidden layer evaluation method for feedforward neural networks based on subspace analysis
    Piao, Xiang-Fan
    Cui, Rong-Yi
    Hong, Bing-Rong
    Li, Bai-Ya
    [J]. Journal of Hunan University of Science and Technology, 2004, 19 (04):
  • [42] Approximation capability of two hidden layer feedforward neural networks with fixed weights
    Guliyev, Namig J.
    Ismailov, Vugar E.
    [J]. NEUROCOMPUTING, 2018, 316 : 262 - 269
  • [43] Decoding Cognitive States from fMRI Data Using Single Hidden-Layer Feedforward Neural Networks
    Huynh, Hieu Trung
    Won, Yonggwan
    [J]. NCM 2008 : 4TH INTERNATIONAL CONFERENCE ON NETWORKED COMPUTING AND ADVANCED INFORMATION MANAGEMENT, VOL 1, PROCEEDINGS, 2008, : 256 - 260
  • [44] Simultaneous Approximation Algorithm Using a Feedforward Neural Network with a Single Hidden Layer
    Hahm, Nahmwoo
    Hong, Bum Il
    [J]. JOURNAL OF THE KOREAN PHYSICAL SOCIETY, 2009, 54 (06) : 2219 - 2224
  • [45] Optimal hidden structure for feedforward neural networks
    Bachiller, P.
    Perez, R. M.
    Martinez, P.
    Aguilar, P. L.
    Diaz, P.
    [J]. COMPUTATIONAL INTELLIGENCE: THEORY AND APPLICATIONS, 1999, 1625 : 684 - 685
  • [46] Training neural networks by marginalizing out hidden layer noise
    Yanjun Li
    Ping Guo
    [J]. Neural Computing and Applications, 2018, 29 : 401 - 412
  • [47] Training neural networks by marginalizing out hidden layer noise
    Li, Yanjun
    Guo, Ping
    [J]. NEURAL COMPUTING & APPLICATIONS, 2018, 29 (09): 401 - 412
  • [48] Rates of convergence for adaptive regression estimates with multiple hidden layer feedforward neural networks
    Kohler, M
    Krzyzak, A
    [J]. 2005 IEEE International Symposium on Information Theory (ISIT), Vols 1 and 2, 2005, : 1436 - 1440
  • [49] Accelerated Optimal Topology Search for Two-Hidden-Layer Feedforward Neural Networks
    Thomas, Alan J.
    Walters, Simon D.
    Petridis, Miltos
    Gheytassi, Saeed Malekshahi
    Morgan, Robert E.
    [J]. ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2016, 2016, 629 : 253 - 266
  • [50] Multi-criteria decision making based architecture selection for single-hidden layer feedforward neural networks
    Ran Wang
    Haoran Xie
    Jiqiang Feng
    Fu Lee Wang
    Chen Xu
    [J]. International Journal of Machine Learning and Cybernetics, 2019, 10 : 655 - 666