Direct simplification for kernel regression machines

Cited: 0
Authors
He, Wenwu [1 ]
Wang, Zhizhong [2 ]
Affiliations
[1] Fujian Univ Technol, Dept Math & Phys, Fuzhou 350108, Fujian, Peoples R China
[2] Cent S Univ, Sch Math Sci & Comp Technol, Changsha 410075, Peoples R China
Keywords
Kernel regression machines; Direct simplification; Cholesky factorization; Newton method;
DOI
10.1016/j.neucom.2008.05.004
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Kernel machines have been widely used in learning. However, standard algorithms are often time consuming. To this end, we propose a new method, direct simplification (DS), for imposing sparsity on kernel regression machines. Unlike existing sparse methods, DS performs approximation and optimization in a unified framework by incrementally finding a set of basis functions that directly minimizes the primal risk function. The main advantage of our method lies in its ability to form very good approximations of kernel regression machines while keeping explicit control over both the computational complexity and the training time. Experiments on two real time series and two benchmarks demonstrate the feasibility of our method and show that DS obtains better performance with fewer basis functions than a two-step sparse method. (C) 2008 Elsevier B.V. All rights reserved.
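The incremental basis-selection idea described in the abstract can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's algorithm: it uses a Gaussian kernel, squared loss, and a naive exhaustive greedy search in place of the Cholesky-factorization and Newton-method machinery the paper relies on; all names (`direct_simplification`, `gaussian_kernel`, `lam`, `gamma`) are ours.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """Pairwise Gaussian kernel exp(-gamma * ||x - y||^2) between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def direct_simplification(X, y, n_basis=10, lam=1e-3, gamma=1.0):
    """Greedily select basis centres from the training set, at each step
    adding the candidate that most reduces the (regularised) primal risk
    of the reduced kernel regression model fitted on the selected set."""
    n = len(X)
    selected, remaining = [], list(range(n))
    best_a = None
    for _ in range(n_basis):
        best_err, best_j = np.inf, None
        for j in remaining:
            S = selected + [j]
            K_nS = gaussian_kernel(X, X[S], gamma)   # n x |S| design matrix
            K_SS = K_nS[S]                            # |S| x |S| Gram block
            # Coefficients of the reduced ridge problem:
            # min_a ||y - K_nS a||^2 + lam * a' K_SS a
            A = K_nS.T @ K_nS + lam * K_SS + 1e-10 * np.eye(len(S))
            a = np.linalg.solve(A, K_nS.T @ y)
            err = np.mean((y - K_nS @ a) ** 2)
            if err < best_err:
                best_err, best_j, best_a = err, j, a
        selected.append(best_j)
        remaining.remove(best_j)
    return np.array(selected), best_a
```

Each outer iteration re-solves a small |S| x |S| system per candidate, so the model size (and hence training cost) is controlled directly by `n_basis`; the paper achieves the same effect far more cheaply with incremental Cholesky updates.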
Pages: 3602-3606
Number of pages: 5