The Kernel Conjugate Gradient Algorithms

Cited: 36
Authors
Zhang, Ming [1 ]
Wang, Xiaojian [1 ]
Chen, Xiaoming [1 ]
Zhang, Anxue [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Elect & Informat Engn, Xian 710049, Shaanxi, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Reproducing kernel Hilbert space; nonlinear processing; conjugate gradient algorithm; online sparsification; regularization; LEAST-SQUARES ALGORITHM; COMPONENT ANALYSIS; COMPLEX KERNEL; FILTER;
DOI
10.1109/TSP.2018.2853109
CLC classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology];
Subject classification codes
0808; 0809
Abstract
Kernel methods have been successfully applied to nonlinear problems in machine learning and signal processing, and various kernel-based algorithms have been proposed over the last two decades. In this paper, we investigate kernel conjugate gradient (KCG) algorithms in both batch and online modes. By expressing the solution vector of the CG algorithm as a linear combination of the input vectors and applying the kernel trick, we develop the KCG algorithm for batch mode. Because the CG algorithm is iterative in nature, it can greatly reduce computation through reduced-rank processing; moreover, reduced-rank processing provides robustness against overlearning. The online KCG algorithm is also derived; it converges as fast as the kernel recursive least squares (KRLS) algorithm, but at only a quarter of the KRLS algorithm's computational cost. Another attractive feature of the online KCG algorithm compared with other kernel adaptive algorithms is that it requires no user-defined parameters. To control the growth of the data size in online applications, a simple sparsification criterion based on the angles among elements in the reproducing kernel Hilbert space is proposed. The angle criterion is equivalent to the coherence criterion but does not require the kernel to be unit norm. Finally, numerical experiments are provided to illustrate the effectiveness of the proposed algorithms.
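The batch-mode construction described in the abstract, expressing the CG solution as a linear combination of the inputs and applying the kernel trick, amounts to running CG on a regularized Gram-matrix system in the dual coefficients. The sketch below is not the paper's exact formulation; the Gaussian kernel, the ridge parameter `lam`, and the toy sine-fitting data are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_cg(K, y, lam=1e-3, tol=1e-8, max_iter=500):
    # Solve (K + lam*I) alpha = y by conjugate gradients in the dual.
    n = len(y)
    A = K + lam * np.eye(n)
    alpha = np.zeros(n)
    r = y - A @ alpha            # residual
    p = r.copy()                 # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        step = rs / (p @ Ap)     # exact line search along p
        alpha += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break                # truncating early gives reduced-rank processing
        p = r + (rs_new / rs) * p  # new A-conjugate direction
        rs = rs_new
    return alpha

# toy usage: fit a noisy sine in the RKHS
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
K = gaussian_kernel(X, X)
alpha = kernel_cg(K, y)
Xt = np.linspace(-3, 3, 100)[:, None]
y_hat = gaussian_kernel(Xt, X) @ alpha   # f(x) = sum_i alpha_i k(x_i, x)
```

Stopping CG after a few iterations is what the abstract calls reduced-rank processing: the early-terminated solution lives in a low-dimensional Krylov subspace, which also acts as a regularizer against overlearning.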
Pages: 4377-4387
Page count: 11
Related papers (50 records)
  • [41] Conjugate gradient algorithms and the Galerkin boundary element method
    Ademoyero, OO
    Bartholomew-Biggs, MC
    Davies, AJ
    Parkhurst, SC
    [J]. COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2004, 48 (3-4) : 399 - 410
  • [42] CHANGING THE NORM IN CONJUGATE-GRADIENT TYPE ALGORITHMS
    GUTKNECHT, MH
    [J]. SIAM JOURNAL ON NUMERICAL ANALYSIS, 1993, 30 (01) : 40 - 56
  • [43] TERMINATION AND EQUIVALENCE RESULTS FOR CONJUGATE-GRADIENT ALGORITHMS
    BUCKLEY, A
    [J]. MATHEMATICAL PROGRAMMING, 1984, 29 (01) : 64 - 76
  • [44] Distributed Kernel-Based Gradient Descent Algorithms
    Lin, Shao-Bo
    Zhou, Ding-Xuan
    [J]. CONSTRUCTIVE APPROXIMATION, 2018, 47 (02) : 249 - 276
  • [46] Cayley-transform-based gradient and conjugate gradient algorithms on Grassmann manifolds
    Zhu, Xiaojing
    Sato, Hiroyuki
    [J]. ADVANCES IN COMPUTATIONAL MATHEMATICS, 2021, 47 (04)
  • [49] Kernel Mixture Correntropy Conjugate Gradient Algorithm for Time Series Prediction
    Xue, Nan
    Luo, Xiong
    Gao, Yang
    Wang, Weiping
    Wang, Long
    Huang, Chao
    Zhao, Wenbing
    [J]. ENTROPY, 2019, 21 (08)
  • [50] Preconditioned conjugate gradient algorithms for nonconvex problems with box constraints
    Pytlak, R.
    Tarnawski, T.
    [J]. NUMERISCHE MATHEMATIK, 2010, 116 : 149 - 175