PRINCIPAL COMPONENTS, MINOR COMPONENTS, AND LINEAR NEURAL NETWORKS

Cited: 553
Authors: OJA, E
Keywords: NEURAL NETWORKS; GENERALIZED HEBBIAN ALGORITHM; STOCHASTIC GRADIENT ASCENT; SUBSPACE NETWORK; MINOR COMPONENTS; EIGENVECTOR
DOI: 10.1016/S0893-6080(05)80089-9
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Many neural network realizations have recently been proposed for the statistical technique of Principal Component Analysis (PCA). Explicit connections between numerical constrained adaptive algorithms and neural networks with constrained Hebbian learning rules are reviewed. The Stochastic Gradient Ascent (SGA) neural network is proposed and shown to be closely related to the Generalized Hebbian Algorithm (GHA); the SGA behaves better when extracting the less dominant eigenvectors. The SGA algorithm is further extended to the case of learning minor components. The symmetric Subspace Network is known to give a rotated basis of the dominant eigenvector subspace, but usually not the true eigenvectors themselves. Two extensions are proposed: in the first, each neuron has a scalar parameter that breaks the symmetry, so the true eigenvectors are obtained with a local and fully parallel learning rule; in the second, an arbitrary number of parallel neurons is considered, not necessarily fewer than the input vector dimension.
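The constrained Hebbian rules reviewed in the abstract build on Oja's one-unit rule, in which a single linear neuron's weight vector converges to the dominant eigenvector of the input covariance. A minimal NumPy sketch (illustrative only, not the paper's pseudocode; the data, learning rate `eta`, and variable names are assumptions):

```python
# Oja's one-unit constrained Hebbian rule:
#   y = w.T x;  w <- w + eta * y * (x - y * w)
# The Hebbian term eta*y*x grows w along the principal direction;
# the -eta*y^2*w term implicitly keeps ||w|| near 1.
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean synthetic data with a clearly dominant covariance eigenvector.
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
X = rng.standard_normal((20000, 2)) @ np.linalg.cholesky(C).T

w = rng.standard_normal(2)
w /= np.linalg.norm(w)
eta = 0.01
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# Compare with the top eigenvector of the sample covariance.
_, eigvecs = np.linalg.eigh(np.cov(X.T))
v1 = eigvecs[:, -1]                 # eigh sorts eigenvalues ascending
alignment = abs(w @ v1) / np.linalg.norm(w)
```

SGA and GHA extend this rule to several neurons by subtracting, from each neuron's input, the components already captured by the preceding neurons, so that successive weight vectors converge to successive eigenvectors rather than all collapsing onto the dominant one.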
Pages: 927 - 935
Page count: 9
Related papers
50 results in total
  • [1] Feedforward neural networks for principal components extraction
    Nicole, S
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2000, 33(04): 425-437
  • [2] Neural networks for seismic principal components analysis
    Huang, KY
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 1999, 37(01): 297-311
  • [3] New algorithm for principal components and minor components extraction
    Chen, TP
    Lin, Q
    PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2, 1998: 560-563
  • [4] A symmetric linear neural network that learns principal components and their variances
    Peper, F
    Noda, H
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1996, 7(04): 1042-1047
  • [5] Multiple linear regression and artificial neural networks based on principal components to predict ozone concentrations
    Sousa, S. I. V.
    Martins, F. G.
    Alvim-Ferraz, M. C. M.
    Pereira, M. C.
    ENVIRONMENTAL MODELLING & SOFTWARE, 2007, 22(01): 97-103
  • [6] Speech Recognition Using Principal Components Analysis and Neural Networks
    Shabani, Shaham
    Norouzi, Yaser
    2016 IEEE 8TH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS (IS), 2016: 90-95
  • [7] A unified algorithm for principal and minor components extraction
    Chen, TP
    Amari, SI
    Lin, Q
    NEURAL NETWORKS, 1998, 11(03): 385-390
  • [8] Principal Components Bias in Over-parameterized Linear Models, and its Manifestation in Deep Neural Networks
    Hacohen, Guy
    Weinshall, Daphna
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [10] Dynamic selection of model parameters in principal components analysis neural networks
    López-Rubio, E
    Ortiz-de-Lazcano-Lobato, JM
    Vargas-González, MDC
    López-Rubio, JM
    ECAI 2004: 16TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2004, 110: 618-622