Cascade Principal Component Least Squares Neural Network Learning Algorithm

Cited: 0
Authors
Khan, Waqar Ahmed [1 ]
Chung, Sai-Ho [1 ]
Chan, Ching Yuen [1 ]
Affiliations
[1] Hong Kong Polytech Univ, Dept Ind & Syst Engn, Hong Kong, Peoples R China
Keywords
cascading correlation learning; connection weights; principal component analysis; ordinary least squares; cascade principal component least squares
DOI: not available
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Subject Classification Code: 0812
Abstract
Cascading correlation learning (CasCor) is a constructive algorithm that determines its own network size and topology by adding hidden units one at a time based on their covariance with the output error. Its generalization performance and computational time depend on the cascade architecture and the iterative tuning of the connection weights. CasCor was developed to address the slowness of backpropagation (BP); however, recent studies have concluded that in many applications CasCor's generalization performance is not guaranteed to be optimal. Like BP, CasCor's learning can also be considered slow because the connection weights are tuned iteratively by numerical optimization techniques. This paper therefore addresses the CasCor bottlenecks and introduces a new algorithm with an improved cascade architecture and tuning-free learning, aiming at better generalization performance and fast learning. The proposed algorithm determines the input connection weights by orthogonally transforming a set of correlated input units into uncorrelated hidden units, and the output connection weights by treating the hidden units and the output units as being in a linear relationship. This work is unique in that it does not require random generation of connection weights. A comparative study on nonlinear classification and regression tasks demonstrates that the proposed algorithm achieves better generalization performance and learns many times faster than CasCor.
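As a rough illustration of the tuning-free idea described in the abstract, the sketch below derives the input connection weights from a principal component analysis of the inputs (an orthogonal transform of correlated inputs into uncorrelated hidden units) and the output connection weights from an ordinary least squares fit of those hidden units to the targets. This is only a minimal, single-block sketch under assumed simplifications (purely linear hidden units, no cascading of further units onto the residual error); the names fit_cpcls and predict_cpcls are hypothetical and do not reproduce the authors' implementation.

```python
import numpy as np

def fit_cpcls(X, Y, n_components=None):
    """Sketch: PCA for input weights, ordinary least squares for output weights."""
    # Center the inputs so the orthogonal transform reflects their covariance.
    x_mean = X.mean(axis=0)
    Xc = X - x_mean

    # PCA via SVD: the columns of Vt.T are the principal directions, used here
    # as fixed input-to-hidden connection weights (no iterative tuning).
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W_in = Vt.T if n_components is None else Vt.T[:, :n_components]

    # Hidden units: mutually uncorrelated principal component scores.
    H = Xc @ W_in

    # Output weights by ordinary least squares, with a bias column appended.
    H1 = np.hstack([H, np.ones((H.shape[0], 1))])
    W_out, *_ = np.linalg.lstsq(H1, Y, rcond=None)
    return x_mean, W_in, W_out

def predict_cpcls(X, x_mean, W_in, W_out):
    H = (X - x_mean) @ W_in
    H1 = np.hstack([H, np.ones((H.shape[0], 1))])
    return H1 @ W_out

# Tiny usage example on synthetic regression data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(200, 1))
params = fit_cpcls(X, Y, n_components=4)
print("training MSE:", float(np.mean((Y - predict_cpcls(X, *params)) ** 2)))
```

The property this sketch shares with the algorithm described above is that both sets of connection weights are obtained in closed form (an SVD and a least squares solve) rather than by iterative numerical optimization.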
Pages: 56-61 (6 pages)