Extended twin parametric margin support vector regression

Cited: 0
Authors
Sahleh, Ali [1 ]
Salahi, Maziar [1 ]
Eskandari, Sadegh [2 ]
Khodamoradi, Tahereh [1 ]
Affiliations
[1] Univ Guilan, Fac Math Sci, Dept Appl Math, Namjoo Blvd, Rasht 41938-33697, Guilan, Iran
[2] Univ Guilan, Fac Math Sci, Dept Comp Sci, Namjoo Blvd, Rasht 41938-33697, Guilan, Iran
Keywords
Support vector regression; Twin support vector regression; Hinge-loss function; Stock market prediction; FEATURE-EXTRACTION; CLASSIFICATION; MACHINE
DOI
10.1007/s12597-024-00829-2
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline codes
070105; 12; 1201; 1202; 120202
Abstract
Support Vector Regression (SVR) and its extensions have demonstrated effectiveness in addressing regression problems, yet they face challenges, including high computational costs for large-scale datasets and sensitivity to outliers. To mitigate these limitations, techniques such as twin SVR (TWSVR) and robust TWSVR (RTWSVR) have been proposed. However, existing approaches may suffer from issues such as a lack of alignment of the final regressor with the dataset during the learning process. In this paper, we introduce an extended twin parametric margin SVR (ETPMSVR) model inspired by the principles of robust geometric TPMSVM (RGTPSVM). The ETPMSVR addresses these challenges by integrating the average of the ε-insensitive upper and lower bounds of the regressor into the objective function and constraints, ensuring alignment with the dataset; the final regressor is thus found together with the two boundary hyperplanes during training by solving a single quadratic programming problem. Additionally, a hinge-loss function is incorporated to enhance robustness against outliers. We derive the dual formulation to improve computational efficiency. Experiments on a diverse range of datasets, 10 UCI datasets and 8 S&P index datasets from the financial market, demonstrate the efficacy of the proposed model in comparison to various benchmarks, including TWSVR, RTWSVR, a multi-layer perceptron, and a long short-term memory network.
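The abstract names three building blocks: an ε-insensitive tube around the regressor, a hinge loss for outlier robustness, and a final regressor taken as the average of the upper and lower bounds. A minimal pure-Python sketch of these pieces follows; it is not the authors' ETPMSVR (their model is a quadratic program not reproduced in this record), and the function names and the default `eps` value are illustrative assumptions.

```python
# Sketch of the loss ingredients described in the abstract.
# Not the ETPMSVR formulation itself; names and defaults are illustrative.

def eps_insensitive_loss(y_true: float, y_pred: float, eps: float = 0.1) -> float:
    """Errors inside the eps-tube cost nothing; outside it, the cost
    grows linearly with the distance to the tube boundary."""
    return max(abs(y_true - y_pred) - eps, 0.0)

def hinge_loss(margin: float) -> float:
    """Zero once the margin reaches 1; grows linearly below it.
    The abstract uses a hinge loss to reduce sensitivity to outliers."""
    return max(0.0, 1.0 - margin)

def average_of_bounds(upper: float, lower: float) -> float:
    """The abstract forms the final regressor as the average of the
    eps-insensitive upper and lower bounds; this mirrors that step."""
    return 0.5 * (upper + lower)
```

In the paper's setting, the two bounds correspond to the two boundary hyperplanes learned jointly in a single quadratic program; here the averaging is shown pointwise for clarity.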
Pages: 24
Related papers
50 items total
  • [41] Twin Support Vector Machine Based Regression
    Khemchandani, Reshma
    Goyal, Keshav
    Chandra, Suresh
    2015 EIGHTH INTERNATIONAL CONFERENCE ON ADVANCES IN PATTERN RECOGNITION (ICAPR), 2015, : 18 - +
  • [42] A rough ν-twin support vector regression machine
    Xue, Zhenxia
    Zhang, Roxin
    Qin, Chuandong
    Zeng, Xiaoqing
    APPLIED INTELLIGENCE, 2018, 48 (11) : 4023 - 4046
  • [43] A regularization on Lagrangian twin support vector regression
    Tanveer, M.
    Shubham, K.
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2017, 8 (03) : 807 - 821
  • [45] Linear Programming Twin Support Vector Regression
    Tanveer, M.
    FILOMAT, 2017, 31 (07) : 2123 - 2142
  • [46] Twin least squares support vector regression
    Zhao, Yong-Ping
    Zhao, Jing
    Zhao, Min
    NEUROCOMPUTING, 2013, 118 : 225 - 236
  • [48] Weighted Lagrange ε-twin support vector regression
    Ye, Ya-Fen
    Bai, Lan
    Hua, Xiang-Yu
    Shao, Yuan-Hai
    Wang, Zhen
    Deng, Nai-Yang
    NEUROCOMPUTING, 2016, 197 : 53 - 68
  • [49] A GA-based model selection for smooth twin parametric-margin support vector machine
    Wang, Zhen
    Shao, Yuan-Hai
    Wu, Tie-Ru
    PATTERN RECOGNITION, 2013, 46 (08) : 2267 - 2277
  • [50] Regression of survival data via twin support vector regression
    Ma, Guangzhi
    Zhao, Xuejing
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2022, 51 (09) : 5126 - 5138