A regularization on Lagrangian twin support vector regression

Cited by: 34
Authors
Tanveer, M. [1]
Shubham, K. [2]
Affiliations
[1] LNM Inst Informat Technol, Dept Comp Sci & Engn, Jaipur 302031, Rajasthan, India
[2] LNM Inst Informat Technol, Dept Elect & Commun Engn, Jaipur 302031, Rajasthan, India
Keywords
Machine learning; Lagrangian support vector machines; Twin support vector regression; Iterative method; Machine; Classification
DOI
10.1007/s13042-015-0361-6
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Twin support vector regression (TSVR), Lagrangian TSVR (LTSVR) and ε-TSVR achieve good generalization and faster computation by solving a pair of smaller-sized quadratic programming problems (QPPs) rather than the single large QPP of support vector regression (SVR). In this paper, a simple and linearly convergent Lagrangian support vector machine algorithm for the dual of the ε-TSVR is proposed. The contributions of our formulation are as follows: (1) we use the square of the 2-norm of the vector of slack variables instead of the usual 1-norm, which makes the objective functions strongly convex. (2) We solve the regression problem with just two systems of linear equations, as opposed to two QPPs in ε-TSVR and TSVR or one large QPP in SVR, which leads to an extremely simple and fast algorithm. (3) A significant advantage of our proposed method is the implementation of the structural risk minimization principle, whereas the primal problems of TSVR and LTSVR consider only empirical risk owing to their complex structure, and may therefore overfit or yield suboptimal solutions in some cases. (4) Experimental results on several artificial and benchmark datasets show the effectiveness of the proposed formulation.
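The two-linear-systems idea in point (2) can be loosely sketched as follows. This is a deliberately simplified illustration (ridge-style regularized normal equations for the down- and up-bound functions, with the inequality constraints and the Lagrangian iteration of the actual algorithm omitted), not the authors' exact formulation; the function and parameter names (`tsvr_bounds_sketch`, `eps1`, `eps2`, `c`) are illustrative.

```python
import numpy as np

def tsvr_bounds_sketch(A, y, eps1=0.1, eps2=0.1, c=1.0):
    """Illustrative sketch: fit the down-bound f1 and up-bound f2 of a
    TSVR-style regressor by solving two regularized linear systems,
    then return their average as the final regressor. The inequality
    constraints of the full formulation are omitted here."""
    m, n = A.shape
    G = np.hstack([A, np.ones((m, 1))])        # augmented matrix [A, e]
    # Regularized normal equations: (G^T G + c I) z = G^T target
    H = G.T @ G + c * np.eye(n + 1)
    z1 = np.linalg.solve(H, G.T @ (y - eps1))  # down-bound function f1
    z2 = np.linalg.solve(H, G.T @ (y + eps2))  # up-bound function f2
    w = (z1[:n] + z2[:n]) / 2                  # final regressor is the
    b = (z1[n] + z2[n]) / 2                    # mean of f1 and f2
    return w, b

# Usage: recover the noiseless linear relation y = 2x + 1;
# the symmetric eps shifts cancel in the averaged regressor.
A = np.linspace(0, 1, 50).reshape(-1, 1)
y = 2 * A.ravel() + 1
w, b = tsvr_bounds_sketch(A, y, c=1e-6)
```

Because each subproblem reduces to a small positive-definite linear system, it can be solved directly (or, in the paper's setting, by a linearly convergent iterative scheme) instead of a QPP solver.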
Published in: International Journal of Machine Learning and Cybernetics, 2017, 8: 807-821 (15 pages)