Fast Sparse Least Squares Support Vector Machines by Block Addition

Cited by: 2
Authors
Ebuchi, Fumito [1 ]
Kitamura, Takuya [1 ]
Affiliations
[1] Natl Inst Technol, Toyama Coll, 13 Hongo Machi, Toyama, Toyama, Japan
Keywords
Empirical feature space; Least squares support vector machine; Pattern recognition
DOI
10.1007/978-3-319-59072-1_8
CLC classification number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose two fast feature selection methods for training sparse least squares support vector machines (LS-SVMs) in a reduced empirical feature space. In the first method, we select training vectors as basis vectors of the empirical feature space based on their similarity. The computational complexity of this selection can be lower than that of the conventional methods because it uses only the inner products of the training vectors, without the linear discriminant analysis or Cholesky factorization that the conventional methods require. In the second method, basis vectors are chosen by forward selection with block addition, which is a wrapper method. This method reduces the size of the kernel matrix in the optimization problem; because the complexity of selecting basis vectors depends on the kernel matrix size, the selection time can be shorter than that of the conventional methods. We demonstrate the effectiveness of the proposed methods on benchmark datasets.
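The abstract's first method, selecting basis vectors from kernel (inner-product) values alone, can be illustrated with a minimal sketch. This is not the authors' implementation: the greedy rule below (keep a training vector as a basis vector only when its maximum kernel similarity to the already-selected basis vectors falls below a threshold), the `threshold` parameter, and the plain ridge-style solve for the LS-SVM in the resulting empirical feature space are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel values K(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_basis_by_similarity(X, gamma=1.0, threshold=0.8):
    # Greedy sketch (assumed rule, not the paper's exact criterion):
    # a training vector becomes a basis vector only if its largest kernel
    # similarity to the basis vectors chosen so far is below `threshold`.
    # Only kernel (inner-product) values are used -- no LDA, no Cholesky.
    basis_idx = [0]
    for i in range(1, len(X)):
        sims = rbf_kernel(X[i:i + 1], X[basis_idx], gamma)[0]
        if sims.max() < threshold:
            basis_idx.append(i)
    return np.array(basis_idx)

def train_lssvm_empirical(X, y, basis_idx, gamma=1.0, C=10.0):
    # LS-SVM in the empirical feature space spanned by the M selected
    # basis vectors b_j: each x is mapped to h(x) = (K(x, b_1), ..., K(x, b_M)).
    # The regularized least-squares primal reduces to an (M+1)x(M+1)
    # linear system (weights plus bias), instead of an (N+1)x(N+1) one.
    H = rbf_kernel(X, X[basis_idx], gamma)       # N x M empirical features
    N, M = H.shape
    A = np.hstack([H, np.ones((N, 1))])          # append bias column
    G = A.T @ A + np.eye(M + 1) / C
    return np.linalg.solve(G, A.T @ y)           # M feature weights + bias

# Toy usage: two Gaussian blobs labeled -1 / +1
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.hstack([-np.ones(50), np.ones(50)])
idx = select_basis_by_similarity(X, gamma=0.5, threshold=0.8)
w = train_lssvm_empirical(X, y, idx, gamma=0.5)
pred = np.sign(np.hstack([rbf_kernel(X, X[idx], 0.5),
                          np.ones((len(X), 1))]) @ w)
acc = (pred == y).mean()
```

The sparsity comes from the mapping itself: the decision function depends only on the M selected basis vectors, and both the selection loop and the final solve scale with M rather than with the full training-set size N.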
Pages: 60-70
Page count: 11
Related papers
50 records in total
  • [1] A Novel Sparse Least Squares Support Vector Machines
    Xia, Xiao-Lei
    Jiao, Weidong
    Li, Kang
    Irwin, George
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2013, 2013
  • [2] A hybrid approach for sparse Least Squares Support Vector Machines
    de Carvalho, BPR
    Lacerda, WS
    Braga, AP
    [J]. HIS 2005: 5TH INTERNATIONAL CONFERENCE ON HYBRID INTELLIGENT SYSTEMS, PROCEEDINGS, 2005, : 323 - 328
  • [3] Improved sparse least-squares support vector machines
    Cawley, GC
    Talbot, NLC
    [J]. NEUROCOMPUTING, 2002, 48 : 1025 - 1031
  • [5] Active Learning for Sparse Least Squares Support Vector Machines
    Zou, Junjie
    Yu, Zhengtao
    Zong, Huanyun
    Zhao, Xing
    [J]. ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT II, 2011, 7003 : 672 - +
  • [6] Sparse approximation using least squares support vector machines
    Suykens, JAK
    Lukas, L
    Vandewalle, J
    [J]. ISCAS 2000: IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS - PROCEEDINGS, VOL II: EMERGING TECHNOLOGIES FOR THE 21ST CENTURY, 2000, : 757 - 760
  • [7] Efficient Sparse Least Squares Support Vector Machines for Regression
    Si Gangquan
    Shi Jianquan
    Guo Zhang
    Zhao Weili
    [J]. 2014 33RD CHINESE CONTROL CONFERENCE (CCC), 2014, : 5173 - 5178
  • [8] Fast sparse approximation for least squares support vector machine
    Jiao, Licheng
    Bo, Liefeng
    Wang, Ling
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (03): : 685 - 697
  • [9] Training sparse least squares support vector machines by the QR decomposition
    Xia, Xiao-Lei
    [J]. NEURAL NETWORKS, 2018, 106 : 175 - 184
  • [10] A comparison of pruning algorithms for sparse least squares support vector machines
    Hoegaerts, L
    Suykens, JAK
    Vandewalle, J
    De Moor, B
    [J]. NEURAL INFORMATION PROCESSING, 2004, 3316 : 1247 - 1253