Opening the kernel of kernel partial least squares and support vector machines

Cited by: 46
Authors
Postma, G. J. [1 ]
Krooshof, P. W. T. [1 ]
Buydens, L. M. C. [1 ]
Affiliations
[1] Radboud Univ Nijmegen, Inst Mol & Mat, NL-6500 GL Nijmegen, Netherlands
Keywords
Kernel partial least squares; Support vector regression; Kernel transformation; Variable selection; Pseudo-samples; Trajectories; REGRESSION; PLS; CLASSIFICATION; PREDICTION; TOOL
DOI
10.1016/j.aca.2011.04.025
Chinese Library Classification
O65 [Analytical chemistry]
Discipline codes
070302; 081704
Abstract
Kernel partial least squares (KPLS) and support vector regression (SVR) have become popular techniques for the regression of complex non-linear data sets. The modeling is performed by mapping the data into a higher-dimensional feature space through the kernel transformation. The disadvantage of such a transformation is, however, that information about the contribution of the original variables to the regression is lost. In this paper we introduce a method which can retrieve and visualize the contribution of the variables to the regression model and the way in which they contribute to the regression of complex data sets. The method is based on the visualization of trajectories using so-called pseudo-samples that represent the original variables in the data. We test and illustrate the proposed method on several synthetic and real benchmark data sets. The results show that for both linear and non-linear regression models the important variables were identified, with correspondingly linear or non-linear trajectories. The results were verified by comparison with ordinary PLS regression and by rebuilding the model using only the variables indicated as important. (C) 2011 Elsevier B.V. All rights reserved.
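The pseudo-sample idea described in the abstract can be sketched briefly: for each original variable, construct artificial samples that vary only that variable over its observed range while the others are held at their mean, and trace the model's predictions along that grid. The resulting curve is the variable's trajectory; a flat trajectory suggests little contribution. The sketch below is an illustration under stated assumptions, not the paper's implementation: it uses scikit-learn's `SVR` in place of the authors' KPLS/SVR models, and a small synthetic data set (`X`, `y`) invented for the example.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical synthetic data (not the paper's): y depends non-linearly
# on x0, linearly on x1, and not at all on x2.
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.05 * rng.normal(size=200)

# A kernel regression model; the paper's method applies equally to KPLS.
model = SVR(kernel="rbf", C=10.0).fit(X, y)

def pseudo_sample_trajectory(model, X, var, n_points=25):
    """Vary one variable across its observed range while holding the
    others at their mean, and return the grid and the predictions
    along it (the variable's trajectory)."""
    grid = np.linspace(X[:, var].min(), X[:, var].max(), n_points)
    pseudo = np.tile(X.mean(axis=0), (n_points, 1))
    pseudo[:, var] = grid
    return grid, model.predict(pseudo)

# The span of each trajectory gives a rough importance measure:
# wide span -> strong contribution, near-flat -> unimportant variable.
ranges = []
for var in range(X.shape[1]):
    _, preds = pseudo_sample_trajectory(model, X, var)
    ranges.append(preds.max() - preds.min())
```

In this toy setting the trajectory for `x2` comes out nearly flat, while the trajectories for `x0` and `x1` span a wide prediction range, and their shapes (curved for `x0`, straight for `x1`) reflect the non-linear versus linear contributions the abstract describes.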
Pages: 123-134 (12 pages)
Related papers (50 records)
  • [31] An efficient Kernel-based matrixized least squares support vector machine
    Zhe Wang
    Xisheng He
    Daqi Gao
    Xiangyang Xue
    Neural Computing and Applications, 2013, 22 : 143 - 150
  • [32] Least squares support vector machine on Gaussian wavelet kernel function set
    Wu, Fangfang
    Zhao, Yinliang
    ADVANCES IN NEURAL NETWORKS - ISNN 2006, PT 1, 2006, 3971 : 936 - 941
  • [33] Using partial least squares and support vector machines for bankruptcy prediction
    Yang, Zijiang
    You, Wenjie
    Ji, Guoli
    EXPERT SYSTEMS WITH APPLICATIONS, 2011, 38 (07) : 8336 - 8342
  • [34] ACTION RECOGNITION USING PARTIAL LEAST SQUARES AND SUPPORT VECTOR MACHINES
    Ramadan, Samah
    Davis, Larry
    2011 18TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2011, : 533 - 536
  • [35] The Kernel Recursive Least Squares CMAC with Vector Eligibility
    Carl Laufer
    Nitish Patel
    George Coghill
    Neural Processing Letters, 2014, 39 : 269 - 284
  • [36] The Kernel Recursive Least Squares CMAC with Vector Eligibility
    Laufer, Carl
    Patel, Nitish
    Coghill, George
    NEURAL PROCESSING LETTERS, 2014, 39 (03) : 269 - 284
  • [37] A Novel Extension of Kernel Partial Least Squares Regression
    Jia, Jin-Ming
    Zhong, Wei-Jun
    Journal of Donghua University(English Edition), 2009, 26 (04) : 438 - 442
  • [38] Optimal Learning Rates for Kernel Partial Least Squares
    Shao-Bo Lin
    Ding-Xuan Zhou
    Journal of Fourier Analysis and Applications, 2018, 24 : 908 - 933
  • [39] A novel extension of kernel partial least squares regression
    Jia, Jin-Ming
    Zhong, Wei-Jun
    Journal of Donghua University (English Edition), 2009, 26 (04) : 438 - 442
  • [40] Prediction for Biodegradability of Chemicals by Kernel Partial Least Squares
    Hiromatsu, Koichi
    Takahara, Jun-ichi
    Nishihara, Tsutomu
    Okamoto, Kousuke
    Yasunaga, Teruo
    Ohmayu, Yoshihiro
    Takagi, Tatsuya
    Nakazono, Kingo
    JOURNAL OF COMPUTER AIDED CHEMISTRY, 2009, 10 : 1 - 9