Iteratively reweighted least square for asymmetric L2-loss support vector regression

Cited by: 1
Authors
Zheng, Songfeng [1 ]
Institution
[1] Missouri State Univ, Dept Math, Springfield, MO 65897 USA
Keywords
Support vector regression; Squared ε-insensitive loss function; Weighted least square; Quadratic programming; MACHINE; ROBUSTNESS
DOI
10.1080/03610918.2019.1599016
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
In the support vector regression (SVR) model, using the squared ε-insensitive loss function makes the objective function of the optimization problem strictly convex and yields a more concise solution. However, the formulation leads to a quadratic programming problem, which is expensive to solve. This paper reformulates the optimization problem by absorbing the constraints into the objective function, and the new formulation shares similarity with the weighted least square regression problem. Based on this formulation, we propose an iteratively reweighted least square approach to train the L2-loss SVR, for both linear and nonlinear models. The proposed approach is easy to implement, without requiring any additional computing package other than basic linear algebra operations. Numerical studies on real-world datasets show that, compared to the alternatives, the proposed approach can achieve similar prediction accuracy with substantially higher time efficiency.
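The abstract's idea can be illustrated with a minimal sketch: for the squared ε-insensitive loss, samples outside the ε-tube behave like squared residuals against targets shifted by ε, so each IRLS step reduces to a ridge-type least-squares solve needing only basic linear algebra. This is a hypothetical illustration of the symmetric linear special case, not the paper's actual algorithm; the function name `irls_l2_svr` and the choice to regularize the bias term are assumptions of the sketch.

```python
import numpy as np

def irls_l2_svr(X, y, C=1.0, eps=0.1, max_iter=100, tol=1e-6):
    """Sketch of IRLS for linear SVR with squared eps-insensitive loss.

    Minimizes 0.5*||w||^2 + C * sum_i max(|y_i - w.x_i - b| - eps, 0)^2.
    Regularizing the bias alongside w is a simplifying assumption.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])   # column of ones absorbs the bias b
    beta = np.zeros(d + 1)
    for _ in range(max_iter):
        r = y - Xb @ beta
        active = np.abs(r) > eps            # samples outside the eps-tube
        if not active.any():                # all residuals inside the tube: loss is zero
            break
        # On the active set, (|r| - eps)^2 == (r - sign(r)*eps)^2, so the
        # step is a weighted-least-squares / ridge solve with targets
        # shifted toward the current model by eps.
        y_shift = y[active] - np.sign(r[active]) * eps
        Xa = Xb[active]
        A = np.eye(d + 1) / (2.0 * C) + Xa.T @ Xa
        beta_new = np.linalg.solve(A, Xa.T @ y_shift)
        if np.linalg.norm(beta_new - beta) <= tol * (1.0 + np.linalg.norm(beta)):
            beta = beta_new
            break
        beta = beta_new
    return beta[:-1], beta[-1]              # (weights, bias)
```

Because the loss is convex and differentiable, the fixed point of this reweighting scheme satisfies the stationarity condition of the original objective, which is consistent with the abstract's claim that only basic linear algebra operations are required.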
Pages: 2151-2167 (17 pages)