Total least squares for block training of neural networks

Citations: 0
Authors
Navia-Vázquez, A [1 ]
Figueiras-Vidal, AR [1 ]
Affiliations
[1] Univ Carlos III Madrid, ATSC DTC, Leganes Madrid 28911, Spain
Keywords
perceptron; noise; total least-squares; block training
DOI
10.1016/S0925-2312(99)00107-1
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This paper is intended to be a contribution to the better understanding and to the improvement of training methods for neural networks. Instead of the classical gradient descent approach, we adopt another point of view in terms of block least-squares minimizations, finally leading to the inclusion of total least-squares methods into the learning framework. We propose a training method for multilayer perceptrons which combines a reduced computational cost (attributed to block methods in general), a procedure for correcting the well-known sensitivity problems of these approaches, and the layer-wise application of a total least-squares algorithm (high resistance against noise in the data). The new method, which we call reduced sensitivity total least-squares (RS-TLS) training, demonstrates good performance in practical applications. (C) 1999 Elsevier Science B.V. All rights reserved.
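The RS-TLS algorithm itself is detailed in the paper; as background for the abstract, the core total least-squares step it builds on can be sketched via the classical SVD construction. The function name `tls_solve` below is illustrative, not from the paper, and this is a minimal sketch of generic TLS rather than the authors' layer-wise training procedure.

```python
import numpy as np

def tls_solve(A, b):
    """Total least-squares solution of A x ≈ b via SVD.

    Unlike ordinary least squares, TLS accounts for noise in A as
    well as in b: it perturbs the augmented matrix [A | b] to the
    nearest rank-deficient matrix and reads the solution off the
    right singular vector of the smallest singular value.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float).reshape(-1, 1)
    n = A.shape[1]
    # SVD of the augmented data matrix [A | b]
    _, _, Vt = np.linalg.svd(np.hstack([A, b]))
    # Right singular vector associated with the smallest singular value
    v = Vt[-1, :]
    if np.isclose(v[n], 0.0):
        raise np.linalg.LinAlgError("TLS solution does not exist")
    # Normalize so the last component equals -1: [x; -1] spans the null direction
    return -v[:n] / v[n]
```

On noise-free data TLS coincides with ordinary least squares; its advantage, which the abstract attributes to the layer-wise TLS step, appears when the input matrix is itself noisy.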
Pages: 213-217
Page count: 5
Related papers (50 total)
  • [1] An efficient recursive total least squares algorithm for training multilayer feedforward neural networks
    Choi, NJ
    Lim, JS
    Sung, KM
    [J]. ADVANCES IN NEURAL NETWORKS - ISNN 2005, PT 1, PROCEEDINGS, 2005, 3496 : 558 - 565
  • [2] Orthogonal Least Squares Algorithm for Training Cascade Neural Networks
    Huang, Gao
    Song, Shiji
    Wu, Cheng
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2012, 59 (11) : 2629 - 2637
  • [3] Recursive least squares method for training and pruning convolutional neural networks
    Yu, Tianzong
    Zhang, Chunyuan
    Ma, Meng
    Wang, Yuan
    [J]. APPLIED INTELLIGENCE, 2023, 53 (20) : 24603 - 24618
  • [4] A local linearized least squares algorithm for training feedforward neural networks
    Stan, O
    Kamen, E
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2000, 11 (02): : 487 - 495
  • [6] On least trimmed squares neural networks
    Lin, Yih-Lon
    Hsieh, Jer-Guang
    Jeng, Jyh-Horng
    Cheng, Wen-Chin
    [J]. NEUROCOMPUTING, 2015, 161 : 107 - 112
  • [7] Total least squares versus RBF neural networks in static calibration of transducers
    Kluk, P
    Misiurski, G
    Morawski, RZ
    [J]. IMTC/97 - IEEE INSTRUMENTATION & MEASUREMENT TECHNOLOGY CONFERENCE: SENSING, PROCESSING, NETWORKING, PROCEEDINGS VOLS 1 AND 2, 1997, : 424 - 427
  • [8] AN ADAPTIVE LEAST-SQUARES ALGORITHM FOR THE EFFICIENT TRAINING OF ARTIFICIAL NEURAL NETWORKS
    KOLLIAS, S
    ANASTASSIOU, D
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS, 1989, 36 (08): : 1092 - 1101
  • [9] A new local linearized least squares algorithm for training feedforward neural networks
    Stan, O
    Kamen, EW
    [J]. 1997 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, 1997, : 1873 - 1878
  • [10] ACTIVE NEURON LEAST SQUARES: A TRAINING METHOD FOR MULTIVARIATE RECTIFIED NEURAL NETWORKS
    Ainsworth, Mark
    Shin, Yeonjong
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44 (04): : A2253 - A2275