Improved adaptive pruning algorithm for least squares support vector regression

Cited by: 3
Authors:
Gao, Runpeng [1]
San, Ye [1]
Affiliation:
[1] Harbin Inst Technol, Control & Simulat Ctr, Harbin 150001, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
least squares support vector regression machine (LS-SVRM); pruning; leave-one-out (LOO) error; incremental learning; decremental learning; ERROR MINIMIZATION; SCHEME;
DOI:
10.1109/JSEE.2012.00055
Chinese Library Classification (CLC):
TP [Automation Technology, Computer Technology];
Discipline code:
0812;
Abstract
Because the solution of the least squares support vector regression machine (LS-SVRM) is not sparse, prediction is slow and its applications are limited. The existing adaptive pruning algorithm for LS-SVRM suffers from slow training and unsatisfactory generalization performance, especially on large-scale problems. Hence an improved algorithm is proposed. To accelerate training, the pruned data point and the fast leave-one-out (LOO) error are employed to validate the temporary model obtained after decremental learning. To improve generalization, a novel objective function in the termination condition, which involves the constraints generated by all training data points, is combined with three pruning strategies. The effectiveness of the proposed algorithm is tested on six benchmark datasets. The resulting sparse LS-SVRM model has a faster training speed and better generalization performance.
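The abstract describes pruning LS-SVRM support vectors by validating a temporary model with a fast LOO error after decremental learning. The sketch below is only illustrative and is not the authors' algorithm: it combines the classical closed-form LS-SVR LOO residual r_i = alpha_i / (A^{-1})_{ii} (A being the LS-SVR system matrix) with a simple greedy pruning loop. The RBF kernel, the regularization value gamma, the fixed target model size, and the full re-solve at every step (instead of true incremental/decremental inverse updates) are all assumptions made for brevity.

```python
# Minimal sketch (not the authors' exact algorithm): greedy LS-SVR pruning
# driven by the classical fast leave-one-out residual formula.
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svr(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVR system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    A_inv = np.linalg.inv(A)          # explicit inverse makes the LOO residuals cheap
    sol = A_inv @ rhs
    return sol[0], sol[1:], A_inv     # b, alpha, A^{-1}

def fast_loo_residuals(alpha, A_inv):
    """Closed-form LOO residuals r_i = alpha_i / (A^{-1})_{ii}; the +1 offset skips
    the bias row/column, which sits at position 0 of the system matrix."""
    return alpha / np.diag(A_inv)[1:]

def prune_ls_svr(X, y, target_size, gamma=10.0, sigma=1.0):
    """Greedy pruning loop: repeatedly drop the point with the smallest |LOO residual|
    (a heuristic for 'contributes least'); re-solve instead of decremental updates."""
    idx = np.arange(len(y))
    while len(idx) > target_size:
        _, alpha, A_inv = train_ls_svr(X[idx], y[idx], gamma, sigma)
        loo = fast_loo_residuals(alpha, A_inv)
        idx = np.delete(idx, np.argmin(np.abs(loo)))
    b, alpha, _ = train_ls_svr(X[idx], y[idx], gamma, sigma)
    return idx, b, alpha

def predict(X_train, idx, b, alpha, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train[idx], sigma) @ alpha + b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)
    idx, b, alpha = prune_ls_svr(X, y, target_size=40)
    y_hat = predict(X, idx, b, alpha, X)
    print("support vectors kept:", len(idx), " RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

In the paper's setting the inverse would be updated incrementally/decrementally rather than recomputed at each step, and the stopping rule would use the termination condition involving all training constraints instead of a fixed target size; those details are what the sketch deliberately omits.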
Pages: 438-444 (7 pages)
Related papers (showing items [31]-[40] of 50)
  • [31] Complete subset least squares support vector regression
    Qiu, Yue
    [J]. ECONOMICS LETTERS, 2021, 200
  • [32] Primal least squares twin support vector regression
    Hua-juan Huang
    Shi-fei Ding
    Zhong-zhi Shi
    [J]. Journal of Zhejiang University SCIENCE C, 2013, 14 : 722 - 732
  • [33] Mapped least squares support vector machine regression
    Zheng, S
    Sun, YQ
    Tian, JW
    Liu, J
    [J]. INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2005, 19 (03) : 459 - 475
  • [34] Recursive reduced least squares support vector regression
    Zhao, Yongping
    Sun, Jianguo
    [J]. PATTERN RECOGNITION, 2009, 42 (05) : 837 - 842
  • [36] An Incremental Learning Algorithm for Improved Least Squares Twin Support Vector Machine
    Yang, Ling
    Liu, Kai
    Liang, Xiaodong
    Ma, Tao
    [J]. 2012 IEEE FIFTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2012, : 463 - 467
  • [37] Least squares support vector machines based on improved clonal selection algorithm
    Xu, Yong
    Zhang, Guang-Hui
    Qian, Feng
    [J]. Huadong Ligong Daxue Xuebao /Journal of East China University of Science and Technology, 2008, 34 (05): : 729 - 733
  • [38] A robust weighted least squares support vector regression based on least trimmed squares
    Chen, Chuanfa
    Yan, Changqing
    Li, Yanyan
    [J]. NEUROCOMPUTING, 2015, 168 : 941 - 946
  • [39] Adaptive and Iterative Least Squares Support Vector Regression Based on Quadratic Renyi Entropy
    Jiang, Jingqing
    Song, Chuyi
    Zhao, Haiyan
    Wu, Chunguo
    Liang, Yanchun
    [J]. 2008 IEEE INTERNATIONAL CONFERENCE ON GRANULAR COMPUTING, VOLS 1 AND 2, 2008, : 340 - +
  • [40] Multi-output Online Adaptive Least Squares Support Vector Regression Learning
    Chen, Yao
    Liu, Xianhui
    Zhao, Weidong
    [J]. INTERNATIONAL CONFERENCE ON COMPUTATIONAL AND INFORMATION SCIENCES (ICCIS 2014), 2014, : 718 - 723