Large-Scale Least Squares Twin SVMs

Cited by: 10
Authors
Tanveer, M. [1 ]
Sharma, S. [1 ]
Muhammad, K. [2 ]
Affiliations
[1] Indian Inst Technol Indore, Dept Math, Indore 453552, Madhya Pradesh, India
[2] Sejong Univ, Dept Software, Seoul 143747, South Korea
Keywords
Machine learning; support vector machines (SVMs); large scale SVMs; least squares twin SVM; SUPPORT VECTOR MACHINES; SMO ALGORITHM; CLASSIFICATION; IMPROVEMENTS; CLASSIFIERS; ROBUST
DOI
10.1145/3398379
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In the last decade, twin support vector machine (TWSVM) classifiers have received considerable attention in pattern classification tasks. However, the TWSVM formulation still suffers from two shortcomings: (1) TWSVM requires inverse matrix calculations in its Wolfe-dual problems, which is intractable for large-scale datasets with numerous features and samples, and (2) TWSVM minimizes the empirical risk instead of the structural risk in its formulation. With the huge amounts of data available today, these disadvantages render TWSVM an ineffective choice for pattern classification tasks. In this article, we propose an efficient large-scale least squares twin support vector machine (LS-LSTSVM) for pattern classification that rectifies both of these shortcomings. The proposed LS-LSTSVM introduces different Lagrangian functions to eliminate the need for calculating inverse matrices. It also does not employ kernel-generated surfaces for the non-linear case, using the kernel trick directly instead; this makes the proposed LS-LSTSVM model superior to the original TWSVM and LSTSVM. Lastly, LS-LSTSVM minimizes the structural risk, which embodies the essence of statistical learning theory and consequently improves classification accuracy on datasets. The proposed LS-LSTSVM is solved using the sequential minimal optimization (SMO) technique, making it suitable for large-scale problems. We further prove the convergence of the proposed LS-LSTSVM. Exhaustive experiments on several real-world benchmarks and NDC-based large-scale datasets demonstrate that the proposed LS-LSTSVM is feasible for large datasets and, in most cases, outperforms existing algorithms.
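Shortcoming (1) above, the inverse matrix calculation, is easiest to see in the classical linear LSTSVM that the paper builds on: each of the two nonparallel hyperplanes comes from solving a small dense linear system over the augmented data. The sketch below illustrates that classical formulation (Kumar-Gopal style), not the proposed SMO-based LS-LSTSVM; the parameter values `c1`, `c2`, the ridge term `eps`, and the helper names are illustrative assumptions.

```python
import numpy as np

def lstsvm_fit(A, B, c1=1.0, c2=1.0, eps=1e-8):
    """Classical linear LSTSVM: each hyperplane solves a (d+1)x(d+1)
    linear system -- the explicit matrix inversion that the SMO-based
    LS-LSTSVM is designed to avoid. A holds class +1 rows, B class -1."""
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    E = np.hstack([A, e1])           # augmented +1 class data [A, 1]
    F = np.hstack([B, e2])           # augmented -1 class data [B, 1]
    R = eps * np.eye(E.shape[1])     # tiny ridge for numerical stability
    # Plane 1: close to class +1, at distance >= 1 from class -1
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * E.T @ E + R, F.T @ e2)
    # Plane 2: close to class -1, at distance >= 1 from class +1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * F.T @ F + R, E.T @ e1)
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def lstsvm_predict(X, planes):
    """Assign each row of X to the class whose hyperplane is nearer."""
    (w1, b1), (w2, b2) = planes
    d1 = np.abs(X @ w1 + b1).ravel() / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2).ravel() / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)

# Toy usage: two well-separated 2-D clusters
A = np.array([[2.0, 2.0], [2.5, 2.0], [2.0, 2.5], [3.0, 3.0]])
B = -A
planes = lstsvm_fit(A, B)
pred = lstsvm_predict(np.vstack([A, B]), planes)
```

The `np.linalg.solve` calls are cheap for a handful of features, but for datasets with many features (or in the kernel case, many samples) these systems become the bottleneck the abstract refers to.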
Pages: 19
Related Papers (50 total)
  • [1] Large-Scale Fuzzy Least Squares Twin SVMs for Class Imbalance Learning
    Ganaie, M. A.
    Tanveer, M.
    Lin, Chin-Teng
    [J]. IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2022, 30 (11) : 4815 - 4827
  • [2] Intuitionistic Fuzzy Weighted Least Squares Twin SVMs
    Tanveer, M.
    Ganaie, M. A.
    Bhattacharjee, A.
    Lin, C. T.
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (07) : 4400 - 4409
  • [3] Polycentric intuitionistic fuzzy weighted least squares twin SVMs
    Liu, Liang
    Li, Shuaiyong
    Zhang, Xu
    Dai, Zhengxu
    Zhu, Yongqiang
    [J]. NEUROCOMPUTING, 2024, 609
  • [4] LARGE-SCALE DUAL REGULARIZED TOTAL LEAST SQUARES
    Lampe, Joerg
    Voss, Heinrich
    [J]. ELECTRONIC TRANSACTIONS ON NUMERICAL ANALYSIS, 2014, 42 : 13 - 40
  • [5] ON LARGE-SCALE NONLINEAR LEAST-SQUARES CALCULATIONS
    TOINT, PL
    [J]. SIAM JOURNAL ON SCIENTIFIC AND STATISTICAL COMPUTING, 1987, 8 (03) : 416 - 435
  • [6] Large-scale Tikhonov regularization of total least squares
    Lampe, Joerg
    Voss, Heinrich
    [J]. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2012, 238 : 95 - 108
  • [7] On solving large-scale weighted least squares problems
    Baryamureeba, V
    [J]. NUMERICAL ANALYSIS AND ITS APPLICATIONS, 2001, 1988 : 59 - 67
  • [8] Model reduction of large-scale systems by least squares
    Gugercin, Serkan
    Antoulas, Athanasios C.
    [J]. LINEAR ALGEBRA AND ITS APPLICATIONS, 2006, 415 (2-3) : 290 - 321
  • [9] Core-elements for large-scale least squares estimation
    Li, Mengyu
    Yu, Jun
    Li, Tao
    Meng, Cheng
    [J]. STATISTICS AND COMPUTING, 2024, 34 (06)
  • [10] A Universal Analysis of Large-Scale Regularized Least Squares Solutions
    Panahi, Ashkan
    Hassibi, Babak
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30