PRINCIPAL COMPONENTS, MINOR COMPONENTS, AND LINEAR NEURAL NETWORKS

Cited by: 553
Author
OJA, E
Keywords
NEURAL NETWORKS; GENERALIZED HEBBIAN ALGORITHM; STOCHASTIC GRADIENT ASCENT; SUBSPACE NETWORK; MINOR COMPONENTS; EIGENVECTOR
DOI
10.1016/S0893-6080(05)80089-9
CLC classification
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Many neural network realizations of the statistical technique of Principal Component Analysis (PCA) have recently been proposed. Explicit connections between numerical constrained adaptive algorithms and neural networks with constrained Hebbian learning rules are reviewed. The Stochastic Gradient Ascent (SGA) neural network is proposed and shown to be closely related to the Generalized Hebbian Algorithm (GHA); the SGA behaves better when extracting the less dominant eigenvectors. The SGA algorithm is further extended to the learning of minor components. The symmetric Subspace Network is known to yield a rotated basis of the dominant eigenvector subspace, but usually not the true eigenvectors themselves. Two extensions are proposed: in the first, each neuron has a scalar parameter that breaks the symmetry, and the true eigenvectors are obtained by a local and fully parallel learning rule. In the second, an arbitrary number of parallel neurons is considered, not necessarily fewer than the input vector dimension.
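As a concrete illustration of the kind of constrained Hebbian rule the abstract discusses, the following NumPy sketch implements the GHA (Sanger) update, whose lower-triangular feedback term breaks the symmetry of the Subspace Network so that the rows converge to the true dominant eigenvectors rather than a rotated basis. The covariance spectrum, learning rate, and sample count are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 5, 2
# synthetic covariance with a known, well-separated spectrum (demo assumption)
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))
eigvals = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
C = Q @ np.diag(eigvals) @ Q.T

def gha_step(W, x, lr):
    """One Generalized Hebbian Algorithm (Sanger) update.

    Rows of W estimate the leading eigenvectors of E[x x^T]."""
    y = W @ x
    # tril(y y^T) keeps only the feedback terms with j <= i; this asymmetric
    # deflation is what distinguishes GHA from the symmetric Subspace rule
    return W + lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

W = 0.1 * rng.normal(size=(k, d))
for x in rng.multivariate_normal(np.zeros(d), C, size=30000):
    W = gha_step(W, x, lr=1e-3)

top = Q[:, :k].T  # true dominant eigenvectors (columns of Q, sorted by eigenvalue)
for i in range(k):
    cos = abs(W[i] @ top[i]) / np.linalg.norm(W[i])
    print(f"neuron {i}: |cos| with eigenvector {i} = {cos:.3f}")
```

After training, each row has roughly unit norm and aligns (up to sign) with the corresponding eigenvector; with the symmetric Subspace rule the rows would instead span the same subspace without individually matching the eigenvectors.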
Pages: 927-935 (9 pages)