Opening the kernel of kernel partial least squares and support vector machines

Cited by: 46
Authors
Postma, G. J. [1 ]
Krooshof, P. W. T. [1 ]
Buydens, L. M. C. [1 ]
Affiliations
[1] Radboud Univ Nijmegen, Inst Mol & Mat, NL-6500 GL Nijmegen, Netherlands
Keywords
Kernel partial least squares; Support vector regression; Kernel transformation; Variable selection; Pseudo-samples; Trajectories; REGRESSION; PLS; CLASSIFICATION; PREDICTION; TOOL;
DOI
10.1016/j.aca.2011.04.025
CLC classification number
O65 [Analytical Chemistry];
Discipline classification code
070302 ; 081704 ;
Abstract
Kernel partial least squares (KPLS) and support vector regression (SVR) have become popular techniques for regression of complex non-linear data sets. The modeling is performed by mapping the data into a higher-dimensional feature space through the kernel transformation. The disadvantage of such a transformation, however, is that information about the contribution of the original variables to the regression is lost. In this paper we introduce a method that retrieves and visualizes the contribution of the original variables to the regression model, and the manner in which they contribute, for complex data sets. The method is based on the visualization of trajectories using so-called pseudo samples that represent the original variables in the data. We test and illustrate the proposed method on several synthetic and real benchmark data sets. The results show that, for both linear and non-linear regression models, the important variables were identified with correspondingly linear or non-linear trajectories. The results were verified by comparison with ordinary PLS regression and by rebuilding models using only the variables indicated as important. (C) 2011 Elsevier B.V. All rights reserved.
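The pseudo-sample idea described in the abstract can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation: it approximates KPLS by running ordinary PLS on an RBF kernel matrix (scikit-learn's PLSRegression and rbf_kernel), and the helper names fit_kpls and pseudo_trajectory are invented for this example. A pseudo sample sweeps one original variable over its observed range while holding all other variables at their mean; the trajectory of model predictions over that sweep visualizes the variable's (possibly non-linear) contribution.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics.pairwise import rbf_kernel

    def fit_kpls(X, y, gamma=0.1, n_components=3):
        # Simple KPLS surrogate: ordinary PLS regression on the RBF kernel matrix.
        K = rbf_kernel(X, X, gamma=gamma)
        return PLSRegression(n_components=n_components).fit(K, y)

    def pseudo_trajectory(model, X_train, var_idx, gamma=0.1, n_points=25):
        # Pseudo samples: sweep one variable over its range, hold the rest at their mean.
        grid = np.linspace(X_train[:, var_idx].min(), X_train[:, var_idx].max(), n_points)
        pseudo = np.tile(X_train.mean(axis=0), (n_points, 1))
        pseudo[:, var_idx] = grid
        # Map the pseudo samples into the same kernel feature space as the training data.
        K_pseudo = rbf_kernel(pseudo, X_train, gamma=gamma)
        return grid, model.predict(K_pseudo).ravel()

    # Toy check: only variable 0 drives the response, so its trajectory is curved
    # while the trajectories of the other variables stay essentially flat.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)
    model = fit_kpls(X, y)
    grid, traj = pseudo_trajectory(model, X, var_idx=0)

A flat trajectory suggests a variable contributes little to the model, a straight sloped trajectory indicates a linear contribution, and a curved trajectory indicates a non-linear contribution, which is how the paper distinguishes linear from non-linear variable effects.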
Pages: 123-134
Page count: 12
Related papers
50 records in total
  • [1] Randomized kernel methods for least-squares support vector machines
    Andrecut, M.
    INTERNATIONAL JOURNAL OF MODERN PHYSICS C, 2017, 28 (02):
  • [2] Kernel canonical correlation analysis and least squares Support Vector Machines
    Van Gestel, T
    Suykens, JAK
    De Brabanter, J
    De Moor, B
    Vandewalle, J
    ARTIFICIAL NEURAL NETWORKS-ICANN 2001, PROCEEDINGS, 2001, 2130 : 384 - 389
  • [3] Direct kernel least-squares support vector machines with heuristic regularization
    Embrechts, MJ
    2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 687 - 692
  • [4] Application of a scaling kernel in signal approximation of least squares support vector machines
    Mu, Xiangyang
    Zhang, Taiyi
    Zhou, Yatong
    Hsi-An Chiao Tung Ta Hsueh/Journal of Xi'an Jiaotong University, 2008, 42 (12): 1464 - 1467
  • [5] Reduced least squares support vector based on kernel partial least squares and its application research
    Song Haiying
    Gui Weihua
    Yang Chunhua
    PROCEEDINGS OF THE 26TH CHINESE CONTROL CONFERENCE, VOL 3, 2007, : 207 - +
  • [6] A Novel Kernel for Least Squares Support Vector Machine
    冯伟
    赵永平
    杜忠华
    李德才
    王立峰
    Defence Technology, 2012, (04) : 240 - 247
  • [7] Selection of kernel function for least squares support vector machines in downburst wind speed forecasting
    Li, Zhou
    Li, Chunxiang
    2018 11TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID), VOL 2, 2018, : 337 - 341
  • [8] A Novel Least Squares Support Vector Machine Kernel for Approximation
    Mu, Xiangyang
    Gao, Weixin
    Tang, Nan
    Zhou, Yatong
    2008 7TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-23, 2008, : 4510 - +
  • [9] Unbiased least squares support vector machine with polynomial kernel
    Zhang, Meng
    Fu, Lihua
    2006 8TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING, VOLS 1-4, 2006, : 1943 - +
  • [10] Approximate kernel partial least squares
    Xiling Liu
    Shuisheng Zhou
    Annals of Mathematics and Artificial Intelligence, 2020, 88 : 973 - 986