Regularization for the Kernel Recursive Least Squares CMAC

Cited by: 0
Authors
Laufer, C. [1 ]
Coghill, G. [1 ]
Institutions
[1] Univ Auckland, Elect & Elect Engn Dept, Auckland 1, New Zealand
Keywords
ALGORITHM;
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory biologically inspired by the cerebellum, found in the brains of animals. In recent works, the kernel recursive least squares CMAC (KRLS-CMAC) was proposed as a superior alternative to the standard CMAC: it converges faster, requires no tuning of a learning rate parameter, and models functions far more accurately. The KRLS-CMAC, however, still suffered from the learning interference problem. Learning interference was addressed in the standard CMAC by regularization. Previous works have also applied regularization to kernelized CMACs, but those approaches were not computationally feasible at large resolutions and dimensionalities. This paper brings the regularization technique to the KRLS-CMAC in a way that allows it to be used efficiently in multiple dimensions with infinite-resolution kernel functions.
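The core ingredients named in the abstract, recursive least squares in a kernel space plus a regularization term, can be illustrated with a generic toy sketch. This is *not* the paper's KRLS-CMAC algorithm: it is plain regularized kernel recursive least squares (kernel ridge regression updated one sample at a time via the block-matrix inverse), and the Gaussian kernel, the `lam` ridge parameter, and the class name are all illustrative assumptions.

```python
import numpy as np

def gauss_kernel(x, y, gamma=1.0):
    # Gaussian RBF kernel; an illustrative stand-in for the CMAC's kernel.
    return np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))

class RegularizedKRLS:
    """Toy regularized KRLS: f(x) = sum_i alpha_i * k(x_i, x).

    Maintains Q = (K + lam*I)^{-1} and grows it with the block-matrix
    inverse formula as each sample arrives, so no full re-inversion is
    needed; lam is the ridge (regularization) parameter.
    """
    def __init__(self, lam=0.1, gamma=1.0):
        self.lam = lam
        self.gamma = gamma
        self.X, self.y = [], []
        self.Q = None      # inverse of the regularized kernel matrix
        self.alpha = None  # dual weights

    def update(self, x, y):
        x = np.asarray(x, dtype=float)
        if self.Q is None:
            # First sample: 1x1 regularized kernel matrix.
            k00 = gauss_kernel(x, x, self.gamma) + self.lam
            self.Q = np.array([[1.0 / k00]])
        else:
            # Block-inverse update for the bordered matrix
            # [[K + lam*I, k], [k^T, k(x,x) + lam]].
            k = np.array([gauss_kernel(xi, x, self.gamma) for xi in self.X])
            ktt = gauss_kernel(x, x, self.gamma) + self.lam
            z = self.Q @ k
            r = 1.0 / (ktt - k @ z)        # Schur complement, kept > 0 by lam
            n = len(self.X)
            Q = np.empty((n + 1, n + 1))
            Q[:n, :n] = self.Q + r * np.outer(z, z)
            Q[:n, n] = -r * z
            Q[n, :n] = -r * z
            Q[n, n] = r
            self.Q = Q
        self.X.append(x)
        self.y.append(float(y))
        self.alpha = self.Q @ np.array(self.y)

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        k = np.array([gauss_kernel(xi, x, self.gamma) for xi in self.X])
        return float(k @ self.alpha)
```

The ridge term `lam` plays the role the abstract assigns to regularization in the standard CMAC: it damps the dual weights so that a new training sample cannot swing the model arbitrarily far, which is the mechanism behind reducing learning interference.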
Pages: 8
Related Papers
50 items total
  • [31] Multivariate online anomaly detection using kernel recursive least squares
    Ahmed, Tarem
    Coates, Mark
    Lakhina, Anukool
    INFOCOM 2007, VOLS 1-5, 2007, : 625 - +
  • [32] Extended Kalman Filter Using a Kernel Recursive Least Squares Observer
    Zhu, Pingping
    Chen, Badong
    Principe, Jose C.
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 1402 - 1408
  • [33] A Microcoded Kernel Recursive Least Squares Processor Using FPGA Technology
    Pang, Yeyong
    Wang, Shaojun
    Peng, Yu
    Peng, Xiyuan
    Fraser, Nicholas J.
    Leong, Philip H. W.
    ACM TRANSACTIONS ON RECONFIGURABLE TECHNOLOGY AND SYSTEMS, 2016, 10 (01)
  • [34] Kernel Recursive Least Squares With Multiple Feedback and Its Convergence Analysis
    Wang, Shiyuan
    Wang, Wanli
    Duan, Shukai
    Wang, Lidan
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2017, 64 (10) : 1237 - 1241
  • [35] Kernel Recursive Least Squares-Type Neuron for Nonlinear Equalization
    Tehrani, Mohammed Naseri
    Shakhsi, Majid
    Khoshbin, Hossein
    2013 21ST IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2013,
  • [36] Convergence and performance analysis of kernel regularized robust recursive least squares
    Sadigh, Alireza Naeimi
    Yazdi, Hadi Sadoghi
    Harati, Ahad
    ISA TRANSACTIONS, 2020, 105 (105) : 396 - 405
  • [37] SPARSE KERNEL RECURSIVE LEAST SQUARES USING L1 REGULARIZATION AND A FIXED-POINT SUB-ITERATION
    Chen, Badong
    Zheng, Nanning
    Principe, Jose C.
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,
  • [38] A recursive least squares algorithm with l1 regularization for sparse representation
    Liu, Di
    Baldi, Simone
    Liu, Quan
    Yu, Wenwu
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (02)
  • [39] An Array Recursive Least-Squares Algorithm With Generic Nonfading Regularization Matrix
    Tsakiris, Manolis C.
    Lopes, Cassio G.
    Nascimento, Vitor H.
    IEEE SIGNAL PROCESSING LETTERS, 2010, 17 (12) : 1001 - 1004
  • [40] Recursive Least Squares With Minimax Concave Penalty Regularization for Adaptive System Identification
    Li, Bowen
    Wu, Suya
    Tripp, Erin E.
    Pezeshki, Ali
    Tarokh, Vahid
    IEEE ACCESS, 2024, 12 : 66993 - 67004