Recursive robust least squares support vector regression based on maximum correntropy criterion

Cited by: 67
Authors
Chen, Xiaobo [1 ,2 ]
Yang, Jian [2 ]
Liang, Jun [3 ]
Ye, Qiaolin [2 ]
Affiliations
[1] Jiangsu Univ, Sch Comp Sci & Telecommun Engn, Zhenjiang 212013, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Technol, Nanjing 210094, Jiangsu, Peoples R China
[3] Jiangsu Univ, Automot Engn Res Inst, Zhenjiang 212013, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Support vector machine; Correntropy; Robust regression; Particle swarm optimization;
DOI
10.1016/j.neucom.2012.05.004
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Least squares support vector machine for regression (LSSVR) is an efficient method for function estimation problems. However, its solution is sensitive to large noise and outliers because it minimizes the sum of squared errors (SSE) over the training samples. To tackle this problem, this paper proposes a novel regression model, termed recursive robust LSSVR (R²LSSVR), to obtain robust estimates for data contaminated by outliers. The idea is to build a regression model in the kernel space based on the maximum correntropy criterion and a regularization technique. An iterative algorithm derived from half-quadratic optimization is further developed to solve R²LSSVR with theoretically guaranteed convergence. The analysis also reveals that R²LSSVR is closely related to the original LSSVR, since it essentially solves an adaptively weighted LSSVR at each iteration. Furthermore, a hyperparameter selection method for R²LSSVR based on particle swarm optimization (PSO) is presented, so that its multiple hyperparameters can be estimated effectively for better performance. The feasibility of the method is examined on simulated and benchmark datasets, and the experimental results demonstrate its good robustness. (C) 2012 Elsevier B.V. All rights reserved.
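The abstract describes an alternating scheme: solve a weighted LSSVR, then re-weight each sample by the Gaussian (correntropy) function of its residual. The sketch below illustrates that idea only; the RBF kernel choice, function names, and hyperparameter values are illustrative assumptions rather than the paper's implementation, and the PSO hyperparameter tuning step is omitted.

```python
import numpy as np

def rbf_kernel(X, Y, sigma_k):
    # Gaussian (RBF) kernel matrix between row-sample matrices X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma_k**2))

def weighted_lssvr(K, y, gamma, v):
    # Solve the weighted LSSVR linear system
    #   [[0, 1^T], [1, K + diag(1/(gamma*v))]] [b; alpha] = [0; y],
    # i.e. kernel ridge regression with per-sample weights v.
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def robust_lssvr(X, y, gamma=10.0, sigma_k=0.5, sigma_c=1.0, iters=20):
    # Half-quadratic-style iteration: alternate a weighted LSSVR solve with
    # the correntropy-induced weight update v_i = exp(-e_i^2 / (2*sigma_c^2)),
    # which drives the weights of large-residual (outlying) samples toward 0.
    K = rbf_kernel(X, X, sigma_k)
    v = np.ones(len(y))
    for _ in range(iters):
        b, alpha = weighted_lssvr(K, y, gamma, v)
        resid = y - (K @ alpha + b)
        v = np.maximum(np.exp(-resid**2 / (2.0 * sigma_c**2)), 1e-8)
    return alpha, b

# Demo: noisy sinc with one large outlier injected at index 5.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 60)[:, None]
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(60)
y[5] += 3.0
alpha, b = robust_lssvr(X, y)
pred = rbf_kernel(X, X, 0.5) @ alpha + b
```

Because the outlier's weight shrinks toward zero across iterations, its effective regularization term grows and the fitted curve is pulled back toward the clean sinc signal near that point, unlike a plain SSE-based LSSVR.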
Pages: 63 - 73
Page count: 11
Related papers
50 records in total
  • [1] Robust Proximal Support Vector Regression Based on Maximum Correntropy Criterion
    Wang, Kuaini
    Pei, Huimin
    Ding, Xiaoshuai
    Zhong, Ping
    [J]. SCIENTIFIC PROGRAMMING, 2019, 2019
  • [2] Recursive least squares support vector regression
    Li, Lijuan
    Su, Hongye
    Chu, Jian
    [J]. DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2006, 13E : 2671 - 2675
  • [3] Maximum correntropy criterion partial least squares
    Mou, Yi
    Zhou, Long
    Chen, Weizhen
    Fan, Jijun
    Zhao, Xu
    [J]. OPTIK, 2018, 165 : 137 - 147
  • [4] Recursive reduced least squares support vector regression
    Zhao, Yongping
    Sun, Jianguo
    [J]. PATTERN RECOGNITION, 2009, 42 (05) : 837 - 842
  • [5] A robust weighted least squares support vector regression based on least trimmed squares
    Chen, Chuanfa
    Yan, Changqing
    Li, Yanyan
    [J]. NEUROCOMPUTING, 2015, 168 : 941 - 946
  • [6] An improved recursive reduced least squares support vector regression
    Zhao, Yong-Ping
    Sun, Jian-Guo
    Du, Zhong-Hua
    Zhang, Zhi-An
    Zhang, Yu-Chen
    Zhang, Hai-Bo
    [J]. NEUROCOMPUTING, 2012, 87 : 1 - 9
  • [7] Robust least squares support vector machine based on recursive outlier elimination
    Wen, Wen
    Hao, Zhifeng
    Yang, Xiaowei
    [J]. SOFT COMPUTING, 2010, 14 (11) : 1241 - 1251
  • [8] Quantized generalized maximum correntropy criterion based kernel recursive least squares for online time series prediction
    Shen, Tianyu
    Ren, Weijie
    Han, Min
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2020, 95
  • [9] Multikernel correntropy based robust least squares one-class support vector machine
    Zheng, Yunfei
    Wang, Shiyuan
    Chen, Badong
    [J]. NEUROCOMPUTING, 2023, 545