Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model

Cited by: 12
Authors
Wang, Xunxian [1 ]
Lowe, David [1 ]
Chen, Sheng [2 ]
Harris, Chris J. [2 ]
Affiliations
[1] Aston Univ, Neural Comp Res Grp, Birmingham B4 7ET, W Midlands, England
[2] Univ Southampton, Sch Elect & Comp Sci, Southampton SO17 1BJ, Hants, England
Keywords
generalised kernel model; least squares support vector machine; orthogonal least squares forward selection; regression; sparse modelling;
DOI
10.1504/IJMIC.2006.012612
Chinese Library Classification (CLC)
TP [automation technology; computer technology];
Subject Classification Code
0812;
Abstract
A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the model structure is determined, namely an appropriate number of regressors has been selected, the model weight parameters are calculated from the Lagrange dual of the original least squares problem. Unlike least squares support vector regression, this modelling procedure involves neither reproducing-kernel Hilbert space nor Mercer decomposition concepts. Because the regressors are not restricted to be positioned at training input points and each regressor has its own diagonal covariance matrix, a very sparse representation can be obtained with excellent generalisation capability. Experimental results on two real data sets demonstrate the effectiveness of the proposed regression modelling approach.
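The orthogonal least squares forward selection step mentioned in the abstract can be sketched as follows. This is a minimal illustrative implementation over a fixed candidate regressor matrix, assuming the classical error-reduction-ratio criterion with Gram-Schmidt orthogonalisation; the function name and this simplified setting are assumptions for illustration. The paper's full method additionally tunes each regressor's centre and diagonal covariance and obtains the final weights from a Lagrange dual problem, which this sketch omits.

```python
import numpy as np

def ols_forward_selection(Phi, y, n_terms):
    """Illustrative sketch of greedy orthogonal least squares forward selection.

    Phi : (N, M) matrix of candidate regressor columns.
    y   : (N,) target vector.
    Returns the indices of the selected columns, chosen one at a time by the
    largest squared-error reduction after orthogonalising each candidate
    against the regressors already selected (classical Gram-Schmidt).
    """
    N, M = Phi.shape
    selected = []   # indices of chosen regressors
    Q = []          # orthogonalised versions of the chosen columns
    for _ in range(n_terms):
        best_err, best_idx, best_q = -np.inf, None, None
        for j in range(M):
            if j in selected:
                continue
            w = Phi[:, j].astype(float).copy()
            for q in Q:                        # orthogonalise against selected basis
                w -= (q @ Phi[:, j]) / (q @ q) * q
            denom = w @ w
            if denom < 1e-12:                  # numerically dependent candidate; skip
                continue
            err = (w @ y) ** 2 / denom         # error reduction this candidate offers
            if err > best_err:
                best_err, best_idx, best_q = err, j, w
        selected.append(best_idx)
        Q.append(best_q)
    return selected
```

As a usage example, if `y` is built from two columns of a random `Phi`, the procedure recovers exactly those two column indices; in the paper's setting each "column" would instead be a Gaussian regressor with its own tuned centre and diagonal covariance.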
Pages: 245-256 (12 pages)
Related papers
50 items in total
  • [21] Parsimonious Extreme Learning Machine Using Recursive Orthogonal Least Squares
    Wang, Ning
    Er, Meng Joo
    Han, Min
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (10) : 1828 - 1841
  • [22] Least Squares Support Vector Machine Regression Based on Sparse Samples and Mixture Kernel Learning
    Ma, Wenlu
    Liu, Han
    INFORMATION TECHNOLOGY AND CONTROL, 2021, 50 (02): : 319 - 331
  • [23] LEAST SQUARES TWIN PROJECTION SUPPORT VECTOR REGRESSION
    Gu, Binjie
    Shen, Geliang
    Pan, Feng
    Chen, Hao
    INTERNATIONAL JOURNAL OF INNOVATIVE COMPUTING INFORMATION AND CONTROL, 2019, 15 (06): : 2275 - 2288
  • [24] Revenue forecasting using a least-squares support vector regression model in a fuzzy environment
    Lin, Kuo-Ping
    Pai, Ping-Feng
    Lu, Yu-Ming
    Chang, Ping-Teng
    INFORMATION SCIENCES, 2013, 220 : 196 - 209
  • [25] Primal least squares twin support vector regression
    Huang, Hua-juan
    Ding, Shi-fei
    Shi, Zhong-zhi
    JOURNAL OF ZHEJIANG UNIVERSITY-SCIENCE C-COMPUTERS & ELECTRONICS, 2013, 14 (09): : 722 - 732
  • [27] Intuitionistic fuzzy C-regression by using least squares support vector regression
    Lin, Kuo-Ping
    Chang, Hao-Feng
    Chen, Tung-Lian
    Lu, Yu-Ming
    Wang, Ching-Hsin
    EXPERT SYSTEMS WITH APPLICATIONS, 2016, 64 : 296 - 304
  • [28] Complete subset least squares support vector regression
    Qiu, Yue
    ECONOMICS LETTERS, 2021, 200
  • [30] Mapped least squares support vector machine regression
    Zheng, S
    Sun, YQ
    Tian, JW
    Liu, J
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2005, 19 (03) : 459 - 475