Support Vector Regression for the simultaneous learning of a multivariate function and its derivatives

Cited: 25
Authors
Lázaro, M
Santamaría, I
Pérez-Cruz, F
Artés-Rodríguez, A
Affiliations
[1] Univ Carlos III Madrid, Dept Teor Senal & Comun, Madrid 28911, Spain
[2] Univ Cantabria, Dept Ingn Comun, E-39005 Santander, Spain
[3] UCL, Gatsby Computat Neurosci Unit, London WC1N 3AR, England
Keywords
SVM; IRWLS
DOI
10.1016/j.neucom.2005.02.013
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405
Abstract
In this paper, the problem of simultaneously approximating a function and its derivatives is formulated within the Support Vector Machine (SVM) framework. First, the problem is solved for a one-dimensional input space by using the epsilon-insensitive loss function and introducing additional constraints in the approximation of the derivative. Then, we extend the method to multi-dimensional input spaces by a multidimensional regression algorithm. In both cases, to optimize the regression estimation problem, we have derived an iterative reweighted least squares (IRWLS) procedure that works fast for moderate-size problems. The proposed method shows that using the information about derivatives significantly improves the reconstruction of the function. (c) 2005 Elsevier B.V. All rights reserved.
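The idea of constraining the fit with derivative observations can be illustrated with a simplified sketch. This is not the paper's epsilon-insensitive SVR or its IRWLS solver; it is a regularized least-squares stand-in that stacks function and derivative equations for a Gaussian-kernel expansion (all names, the kernel width `sigma`, and the ridge parameter `lam` are illustrative choices, not taken from the paper):

```python
import numpy as np

def rbf_kernel(xa, xb, sigma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = k(xa[i], xb[j])."""
    d = xa[:, None] - xb[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def rbf_kernel_dx(xa, xb, sigma=1.0):
    """Derivative of the kernel with respect to its first argument,
    so rows give the derivative of the fitted model at xa."""
    d = xa[:, None] - xb[None, :]
    return -(d / sigma**2) * np.exp(-d**2 / (2 * sigma**2))

def fit_with_derivatives(x, y, dy, sigma=1.0, lam=1e-6):
    """Solve a ridge-regularized least-squares problem over the
    stacked function equations (K alpha = y) and derivative
    equations (K' alpha = dy), sharing one coefficient vector."""
    A = np.vstack([rbf_kernel(x, x, sigma), rbf_kernel_dx(x, x, sigma)])
    b = np.concatenate([y, dy])
    return np.linalg.solve(A.T @ A + lam * np.eye(len(x)), A.T @ b)

# Toy one-dimensional data: samples of sin(x) and its derivative cos(x).
x = np.linspace(0, 2 * np.pi, 15)
alpha = fit_with_derivatives(x, np.sin(x), np.cos(x))

xt = np.linspace(0, 2 * np.pi, 100)
f_hat = rbf_kernel(xt, x) @ alpha       # reconstructed function
df_hat = rbf_kernel_dx(xt, x) @ alpha   # reconstructed derivative
```

Because the same coefficients must explain both the samples and the slopes, the derivative rows act as the extra constraints described in the abstract; dropping them leaves the model free to oscillate between sample points.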
Pages: 42 - 61 (20 pages)
Related papers
50 records in total
  • [1] Twin support vector regression for the simultaneous learning of a function and its derivatives
    Khemchandani, Reshma
    Karpatne, Anuj
    Chandra, Suresh
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2013, 4 (01) : 51 - 63
  • [2] Regularized least squares support vector regression for the simultaneous learning of a function and its derivatives
    Jayadeva
    Khemchandani, Reshma
    Chandra, Suresh
    [J]. INFORMATION SCIENCES, 2008, 178 (17) : 3402 - 3414
  • [3] Generalized eigenvalue proximal support vector regressor for the simultaneous learning of a function and its derivatives
    Khemchandani, Reshma
    Goyal, Keshav
    Chandra, Suresh
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2018, 9 (12) : 2059 - 2070
  • [4] Least Square Support Vector Machine for the Simultaneous Learning of a Function and Its Derivative
    Zhang, Rui
    Liu, Guozhen
    [J]. ADVANCED RESEARCH ON ELECTRONIC COMMERCE, WEB APPLICATION, AND COMMUNICATION, PT 1, 2011, 143 : 427 - +
  • [5] Simultaneous approximation of a multivariate function and its derivatives by multilinear splines
    Anderson, Ryan
    Babenko, Yuliya
    Leskevyeh, Tetiana
    [J]. JOURNAL OF APPROXIMATION THEORY, 2014, 183 : 82 - 97
  • [6] Support vector machine for the simultaneous approximation of a function and its derivative
    Lázaro, M
    Santamaría, I
    Pérez-Cruz, F
    Artés-Rodríguez, A
    [J]. 2003 IEEE XIII WORKSHOP ON NEURAL NETWORKS FOR SIGNAL PROCESSING - NNSP'03, 2003, : 189 - 198
  • [7] Construction on orthogonal multiwavelets for derivatives of multivariate vector scaling function with compact support
    Feng, XX
    Cheng, ZX
    [J]. WAVELET ANALYSIS AND ITS APPLICATIONS (WAA), VOLS 1 AND 2, 2003, : 701 - 706
  • [8] Support vector regression in sum space for multivariate calibration
    Peng, Jiangtao
    Li, Luoqing
    [J]. CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2014, 130 : 14 - 19