Total least squares for block training of neural networks

Cited by: 0
Authors
Navia-Vázquez, A [1]
Figueiras-Vidal, AR [1]
Affiliation
[1] Univ Carlos III Madrid, ATSC DTC, Leganés, Madrid 28911, Spain
Keywords
perceptron; noise; total least-squares; block training
DOI
10.1016/S0925-2312(99)00107-1
CLC classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper is intended to be a contribution to the better understanding and to the improvement of training methods for neural networks. Instead of the classical gradient descent approach, we adopt another point of view in terms of block least-squares minimizations, finally leading to the inclusion of total least-squares methods into the learning framework. We propose a training method for multilayer perceptrons which combines a reduced computational cost (attributed to block methods in general), a procedure for correcting the well-known sensitivity problems of these approaches, and the layer-wise application of a total least-squares algorithm (high resistance against noise in the data). The new method, which we call reduced sensitivity total least-squares (RS-TLS) training, demonstrates good performance in practical applications. (C) 1999 Elsevier Science B.V. All rights reserved.
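The layer-wise total least-squares step mentioned in the abstract can be illustrated with the standard SVD-based TLS solution of an overdetermined linear system. The sketch below is only a generic TLS solve, not the authors' RS-TLS training procedure; the function name `tls_solve` and the toy data are illustrative assumptions.

```python
# Minimal sketch of a total least-squares (TLS) solve via the SVD, assuming
# an overdetermined linear system A @ x ~= b with noise in both A and b.
# This is a generic TLS building block, not the paper's RS-TLS algorithm.
import numpy as np

def tls_solve(A, b):
    """Return the classical closed-form TLS estimate for A @ x ~= b."""
    m, n = A.shape
    C = np.hstack([A, b.reshape(-1, 1)])   # augmented matrix [A | b]
    _, _, Vt = np.linalg.svd(C)            # right singular vectors of [A | b]
    V = Vt.T
    if np.isclose(V[n, n], 0.0):
        raise np.linalg.LinAlgError("TLS solution does not exist (V[n, n] ~ 0)")
    # Closed-form TLS solution from the singular vector of the smallest singular value.
    return -V[:n, n] / V[n, n]

# Toy usage: both the regressors and the targets are observed with noise.
rng = np.random.default_rng(0)
A_clean = rng.normal(size=(100, 3))
x_true = np.array([1.0, -2.0, 0.5])
A_noisy = A_clean + 0.05 * rng.normal(size=A_clean.shape)
b_noisy = A_clean @ x_true + 0.05 * rng.normal(size=100)

print("TLS estimate:", tls_solve(A_noisy, b_noisy))
print("OLS estimate:", np.linalg.lstsq(A_noisy, b_noisy, rcond=None)[0])
```

Unlike ordinary least squares, which assumes an exact input matrix, TLS accounts for errors in both the inputs and the targets; this is the property the abstract credits for the method's resistance to noise in the data.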
Pages: 213-217
Number of pages: 5