Iteratively reweighted least square for asymmetric L2-loss support vector regression

Cited: 1
Authors
Zheng, Songfeng [1]
Affiliation
[1] Missouri State Univ, Dept Math, Springfield, MO 65897 USA
Keywords
Support vector regression; Squared ε-insensitive loss function; Weighted least square; Quadratic programming; MACHINE; ROBUSTNESS;
DOI
10.1080/03610918.2019.1599016
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
In the support vector regression (SVR) model, using the squared ε-insensitive loss function makes the objective function of the optimization problem strictly convex and yields a more concise solution. However, the formulation leads to a quadratic programming problem, which is expensive to solve. This paper reformulates the optimization problem by absorbing the constraints into the objective function, and the new formulation shares similarity with the weighted least squares regression problem. Based on this formulation, we propose an iteratively reweighted least square approach to train the L2-loss SVR, for both linear and nonlinear models. The proposed approach is easy to implement, without requiring any additional computing package other than basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach can achieve similar prediction accuracy with substantially higher time efficiency.
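
To make the reformulation concrete, below is a minimal numpy sketch of one way such an iteratively reweighted least square loop can be organized for the linear model: points outside the ε-tube receive a (possibly asymmetric) weight, the targets are shifted by ε toward the current fit, and each update reduces to a small regularized weighted least-squares solve. The function name irls_l2_svr, the asymmetry parameter p, the weight/shift rule, and the stopping test are illustrative assumptions, not the paper's exact algorithm.

    # Sketch only: linear SVR with a squared (asymmetric) eps-insensitive loss,
    #   min_{w,b}  0.5*||w||^2 + C * sum_i a_i * max(0, |y_i - w.x_i - b| - eps)^2,
    # with assumed weights a_i = p above the tube and 1 - p below it.
    import numpy as np

    def irls_l2_svr(X, y, C=1.0, eps=0.1, p=0.5, max_iter=100, tol=1e-6):
        n, d = X.shape
        Xb = np.hstack([X, np.ones((n, 1))])   # append a column of ones for the intercept b
        beta = np.zeros(d + 1)                 # beta = (w, b)
        R = np.eye(d + 1)
        R[-1, -1] = 1e-8                       # penalize w only; tiny ridge on b keeps the system nonsingular
        for _ in range(max_iter):
            r = y - Xb @ beta                  # current residuals
            # Weight points outside the eps-tube; points inside contribute nothing.
            wts = np.where(r > eps, p, np.where(r < -eps, 1.0 - p, 0.0))
            y_shift = y - eps * np.sign(r)     # shift targets so the penalty is a weighted squared error
            # Regularized weighted least-squares update in closed form.
            A = R + 2.0 * C * Xb.T @ (wts[:, None] * Xb)
            rhs = 2.0 * C * Xb.T @ (wts * y_shift)
            beta_new = np.linalg.solve(A, rhs)
            if np.linalg.norm(beta_new - beta) < tol * (1.0 + np.linalg.norm(beta)):
                beta = beta_new
                break
            beta = beta_new
        return beta[:-1], beta[-1]             # learned (w, b)

Because only points outside the ε-tube carry nonzero weight, every pass is a single (d+1)-dimensional linear solve, i.e. the kind of basic linear algebra operation the abstract refers to, rather than a quadratic program.
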
Pages: 2151-2167
Page count: 17
Related Papers
50 records in total
  • [1] Iteratively reweighted least square for kernel expectile regression with random features
    Cui, Yue
    Zheng, Songfeng
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2023, 93 (14) : 2370 - 2389
  • [2] NonConvex Iteratively Reweighted Least Square Optimization in Compressive Sensing
    Chakraborty, Madhuparna
    Barik, Alaka
    Nath, Ravinder
    Dutta, Victor
    MATERIAL AND MANUFACTURING TECHNOLOGY II, PTS 1 AND 2, 2012, 341-342: 629+
  • [3] Improved Convergence for l∞ and l1 Regression via Iteratively Reweighted Least Squares
    Ene, Alina
    Vladu, Adrian
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [4] INCREMENTAL LOCALIZATION ALGORITHM BASED ON REGULARIZED ITERATIVELY REWEIGHTED LEAST SQUARE
    Yan, Xiaoyong
    Yang, Zhong
    Liu, Yu
    Xu, Xiaoduo
    Li, Huijun
    FOUNDATIONS OF COMPUTING AND DECISION SCIENCES, 2016, 41 (03) : 183 - 196
  • [5] Incremental localization algorithm based on regularized iteratively reweighted least square
    Yan, Xiaoyong
    Song, Aiguo
    Liu, Yu
    He, Jian
    Zhu, Ronghui
    2015 IEEE INTERNATIONAL CONFERENCE ON SMART CITY/SOCIALCOM/SUSTAINCOM (SMARTCITY), 2015, : 729 - 733
  • [6] A Novel Least Square Twin Support Vector Regression
    Zhang, Zhiqiang
    Lv, Tongling
    Wang, Hui
    Liu, Liming
    Tan, Junyan
    NEURAL PROCESSING LETTERS, 2018, 48 (02) : 1187 - 1200
  • [7] Fuzzy least square support vector machines for regression
    Wu, Qing
    Liu, San-Yang
    Du, Zhe
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2007, 34 (05): : 773 - 778
  • [9] Total Least Square Support Vector Machine for Regression
    Fu, Guanghui
    Hu, Guanghua
    INTERNATIONAL CONFERENCE ON INTELLIGENT COMPUTATION TECHNOLOGY AND AUTOMATION, VOL 1, PROCEEDINGS, 2008, : 271 - 275
  • [10] ROBUST REGRESSION COMPUTATION USING ITERATIVELY REWEIGHTED LEAST-SQUARES
    O'Leary, D. P.
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 1990, 11 (03) : 466 - 480