A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking

Cited: 7
Authors
Hu Lei [1]
Yi Guoxing [1]
Huang Chao [1]
Affiliation
[1] Harbin Inst Technol, Sch Astronaut, Harbin 150001, Peoples R China
Keywords
least square support vector regression (LSSVR); global representative point ranking (GRPR); initial training dataset; pruning strategy; sparsity; regression accuracy; CROSS-VALIDATION;
DOI
10.23919/JSEE.2021.000014
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812
Abstract
Least square support vector regression (LSSVR) is a method for function approximation whose solutions are typically non-sparse, which limits its applicability in scenarios requiring fast prediction. In this paper, a sparse algorithm for adaptively pruning LSSVR based on global representative point ranking (GRPR-AP-LSSVR) is proposed. First, the global representative point ranking (GRPR) algorithm is presented, and a data analysis experiment is carried out to illustrate the importance ranking of the data points. Next, a pruning strategy that removes two samples in each decremental learning step is designed to accelerate training and ensure sparsity; the removed data points are used to test the temporary learning model, which preserves regression accuracy. Finally, the proposed algorithm is verified on artificial datasets and UCI regression datasets. The experimental results indicate that, compared with several benchmark algorithms, GRPR-AP-LSSVR achieves excellent sparsity and prediction speed without impairing generalization performance.
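The abstract does not specify the GRPR scoring rule or the exact stopping criterion, so the following is only a minimal sketch of the decremental loop it describes: an LSSVR solved in closed form from its KKT system, samples ranked by an externally supplied representativeness score, two samples pruned per step, and the removed points reused to check that the temporary model's accuracy has not degraded. The function names (fit_lssvr, prune_lssvr), the RBF kernel, and the tolerance-based stopping rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix between row-sample matrices A and B."""
    d2 = np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-d2 / (2.0 * sigma ** 2))


def fit_lssvr(X, y, gamma=10.0, sigma=1.0):
    """Standard LSSVR dual solution: solve the KKT linear system for bias b and dual coefficients alpha."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K + np.eye(n) / gamma]])
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]


def predict(X_train, b, alpha, X_test, sigma=1.0):
    return rbf_kernel(X_test, X_train, sigma) @ alpha + b


def prune_lssvr(X, y, score, n_remove=2, tol=1.05, min_size=10, gamma=10.0, sigma=1.0):
    """Decremental pruning sketch (assumed details): repeatedly drop the n_remove
    least representative samples, retrain on the rest, and evaluate the temporary
    model on all points removed so far; stop when that error exceeds tol times the
    error observed at the first pruning step."""
    keep = list(np.argsort(score)[::-1])       # most representative points first
    b, alpha = fit_lssvr(X[keep], y[keep], gamma, sigma)
    removed = []                               # indices pruned so far
    base_err = None
    while len(keep) - n_remove >= min_size:
        candidate = keep[:-n_remove]           # tentatively drop the tail of the ranking
        removed_now = removed + keep[-n_remove:]
        b_new, a_new = fit_lssvr(X[candidate], y[candidate], gamma, sigma)
        err = np.mean((predict(X[candidate], b_new, a_new, X[removed_now], sigma)
                       - y[removed_now]) ** 2)
        if base_err is None:
            base_err = err
        if err > tol * base_err:               # regression accuracy impaired: keep previous model
            break
        keep, removed, b, alpha = candidate, removed_now, b_new, a_new
    return np.array(keep), b, alpha


if __name__ == "__main__":
    # Toy usage: the representativeness score here is a crude stand-in for GRPR.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(200)
    score = -np.abs(X[:, 0] - np.median(X))
    support, b, alpha = prune_lssvr(X, y, score)
    print(len(support), "support vectors retained")
```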
Pages: 151 - 162
Page count: 12
Related Papers
50 records in total
  • [1] A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking
    HU Lei
    YI Guoxing
    HUANG Chao
    [J]. Journal of Systems Engineering and Electronics, 2021, 32 (01) : 151 - 162
  • [2] A fast sparse algorithm for least squares support vector machine based on global representative points
    Ma Y.-F.
    Liang X.
    Zhou X.-P.
    [J]. Zidonghua Xuebao/Acta Automatica Sinica, 2017, 43 (01): : 132 - 141
  • [3] Adaptive and iterative training algorithm of least square support vector machine regression
    Yang, Bin
    Yang, Xiao-Wei
    Huang, Lan
    Liang, Yan-Chun
    Zhou, Chun-Guang
    Wu, Chun-Guo
    [J]. Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2010, 38 (07): : 1621 - 1625
  • [4] Sparse least square support vector machine via coupled compressive pruning
    Yang, Lixia
    Yang, Shuyuan
    Zhang, Rui
    Jin, HongHong
    [J]. NEUROCOMPUTING, 2014, 131 : 77 - 86
  • [5] Improved pruning algorithms for sparse least squares support vector regression machine
    Zhao, Yong-Ping
    Sun, Jian-Guo
    [J]. Xitong Gongcheng Lilun yu Shijian/System Engineering Theory and Practice, 2009, 29 (06): : 166 - 171
  • [6] Improved adaptive pruning algorithm for least squares support vector regression
    Gao, Runpeng
    San, Ye (Control and Simulation Center)
    [J]. Journal of Systems Engineering and Electronics, 2012, 23 (03) : 438 - 444
  • [7] Improved adaptive pruning algorithm for least squares support vector regression
    Gao, Runpeng
    San, Ye
    [J]. JOURNAL OF SYSTEMS ENGINEERING AND ELECTRONICS, 2012, 23 (03) : 438 - 444
  • [8] Sparse least square twin support vector machine with adaptive norm
    Zhiqiang Zhang
    Ling Zhen
    Naiyang Deng
    Junyan Tan
    [J]. Applied Intelligence, 2014, 41 : 1097 - 1107
  • [9] Sparse least square twin support vector machine with adaptive norm
    Zhang, Zhiqiang
    Zhen, Ling
    Deng, Naiyang
    Tan, Junyan
    [J]. APPLIED INTELLIGENCE, 2014, 41 (04) : 1097 - 1107
  • [10] Adaptive pruning algorithm for least squares support vector machine classifier
    Xiaowei Yang
    Jie Lu
    Guangquan Zhang
    [J]. Soft Computing, 2010, 14 : 667 - 680