Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model

Cited by: 12
Authors
Wang, Xunxian [1 ]
Lowe, David [1 ]
Chen, Sheng [2 ]
Harris, Chris J. [2 ]
Affiliations
[1] Aston Univ, Neural Comp Res Grp, Birmingham B4 7ET, W Midlands, England
[2] Univ Southampton, Sch Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
Keywords
generalised kernel model; least squares support vector machine; orthogonal least squares forward selection; regression; sparse modelling;
DOI
10.1504/IJMIC.2006.012612
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has an individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure appends the regressors one by one. Once the model structure has been determined, that is, once an appropriate number of regressors has been selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing kernel Hilbert space nor Mercer decomposition concepts. Because the regressors are not restricted to lie at the training input points and each regressor carries its own diagonal covariance matrix, a very sparse representation with excellent generalisation capability can be obtained. Experimental results on two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
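The abstract's selection procedure can be illustrated with a simplified sketch. This is not the authors' implementation: it assumes Gaussian kernels, a fixed pool of candidate centres and diagonal covariances (the paper tunes each regressor's centre and covariance individually), and it recovers the final weights with an ordinary least squares solve rather than the Lagrange dual. The function names (`gaussian_kernel`, `ols_forward_select`) are hypothetical.

```python
import numpy as np

def gaussian_kernel(X, centre, diag_cov):
    # Generalised kernel regressor: its own centre vector and diagonal covariance
    d = (X - centre) ** 2 / diag_cov
    return np.exp(-0.5 * d.sum(axis=1))

def ols_forward_select(X, y, centres, diag_covs, n_terms):
    """Greedy orthogonal least squares forward selection (sketch):
    append regressors one by one, each time choosing the candidate
    with the largest error-reduction ratio on the current residual."""
    cand = np.column_stack(
        [gaussian_kernel(X, c, s) for c, s in zip(centres, diag_covs)]
    )
    selected, Q = [], []                     # chosen indices, orthogonalised columns
    residual = y.astype(float).copy()
    for _ in range(n_terms):
        best, best_err, best_w = None, -np.inf, None
        for j in range(cand.shape[1]):
            if j in selected:
                continue
            w = cand[:, j].copy()
            for q in Q:                      # Gram-Schmidt against selected columns
                w -= (q @ w) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:                # candidate is (near-)dependent; skip
                continue
            err = (w @ residual) ** 2 / denom
            if err > best_err:
                best, best_err, best_w = j, err, w
        selected.append(best)
        Q.append(best_w)
        g = (best_w @ residual) / (best_w @ best_w)
        residual = residual - g * best_w     # deflate the residual
    # After structure selection, solve for the weights of the chosen regressors
    P = cand[:, selected]
    weights, *_ = np.linalg.lstsq(P, y, rcond=None)
    return selected, weights
```

In this sketch the sparsity comes from stopping at `n_terms` regressors; the error-reduction ratio plays the role of the selection criterion used by OLS forward selection.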
Pages: 245-256
Page count: 12
Related papers
50 records in total
  • [31] Primal least squares twin support vector regression
    Huang, Hua-juan
    Ding, Shi-fei
    Shi, Zhong-zhi
    Journal of Zhejiang University SCIENCE C, 2013, 14: 722-732
  • [32] Recursive reduced least squares support vector regression
    Zhao, Yongping
    Sun, Jianguo
    PATTERN RECOGNITION, 2009, 42 (05) : 837 - 842
  • [34] Robust Lp-norm least squares support vector regression with feature selection
    Ye, Ya-Fen
    Shao, Yuan-Hai
    Deng, Nai-Yang
    Li, Chun-Na
    Hua, Xiang-Yu
    APPLIED MATHEMATICS AND COMPUTATION, 2017, 305 : 32 - 52
  • [35] A robust weighted least squares support vector regression based on least trimmed squares
    Chen, Chuanfa
    Yan, Changqing
    Li, Yanyan
    NEUROCOMPUTING, 2015, 168 : 941 - 946
  • [36] Twin Least Squares Support Vector Regression of Heteroscedastic Gaussian Noise Model
    Zhang, Shiguang
    Liu, Chao
    Zhou, Ting
    Sun, Lin
    IEEE ACCESS, 2020, 8 : 94076 - 94088
  • [37] Construction of Upper Boundary Model Based on Least Squares Support Vector Regression
    Liu, Xiaoyong
    Zeng, Chengbin
    Liu, Yun
    He, Guofeng
    Yan, Genglong
    Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2024, 52 (12): 139-150
  • [38] Direction of Arrival Based on the Multioutput Least Squares Support Vector Regression Model
    Huang, Kai
    You, Ming-Yi
    Ye, Yun-Xia
    Jiang, Bin
    Lu, An-Nan
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2020, 2020
  • [39] A Novel Least Squares Support Vector Machine Kernel for Approximation
    Mu, Xiangyang
    Gao, Weixin
    Tang, Nan
    Zhou, Yatong
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008, : 4510 - +
  • [40] Unbiased least squares support vector machine with polynomial kernel
    Zhang, Meng
    Fu, Lihua
    2006 8TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, VOLS 1-4, 2006, : 1943 - +