Deep Additive Least Squares Support Vector Machines for Classification With Model Transfer

Cited: 0
Authors
Wang, Guanjin [1 ,2 ]
Zhang, Guangquan [1 ]
Choi, Kup-Sze [2 ]
Lu, Jie [1 ]
Affiliations
[1] Univ Technol Sydney, Sch Software, Fac Engn & Informat Technol, Ctr Artificial Intelligence, Broadway, NSW 2007, Australia
[2] Hong Kong Polytech Univ, Sch Nursing, Ctr Smart Hlth, Hong Kong, Peoples R China
Funding
Australian Research Council;
Keywords
Classification; deep architectures; support vector machine (SVM); transfer learning; STATISTICAL COMPARISONS; REGRESSION APPROACH; MULTIPLE; CLASSIFIERS; VALIDATION;
DOI
10.1109/TSMC.2017.2759090
Chinese Library Classification
TP [Automation & Computer Technology];
Discipline Code
0812;
Abstract
The additive kernel least squares support vector machine (AK-LS-SVM) has been widely used in classification tasks because of its inherent advantages. For example, additive kernels work extremely well for certain tasks, such as computer vision classification, medical research, and other specialized scenarios. Moreover, the analytical solution of AK-LS-SVM allows the leave-one-out cross-validation error estimate to be formulated in closed form for parameter tuning, which drastically reduces the computational cost and helps guarantee generalization performance, especially on small and medium datasets. However, AK-LS-SVM still faces two main challenges: 1) improving its classification performance and 2) reducing the time spent on grid search for model selection. Inspired by the stacked generalization principle and the transfer learning mechanism, a layer-by-layer combination of AK-LS-SVM classifiers embedded with transfer learning is proposed in this paper. This new classifier, called the deep transfer additive kernel least squares support vector machine (DTA-LS-SVM), overcomes both challenges. In addition, since many real-world scenarios, particularly medical data analysis, involve imbalanced datasets, the deep-transfer element is extended to compensate for this imbalance, leading to the development of another new classifier, iDTA-LS-SVM. In the hierarchical structure of both DTA-LS-SVM and iDTA-LS-SVM, each layer contains an AK-LS-SVM, and the predictions from the previous layer act as an additional input feature for the current layer. Importantly, transfer learning is also embedded to guarantee generalization consistency between adjacent layers. Moreover, both DTA-LS-SVM and iDTA-LS-SVM ensure the minimal leave-one-out error by applying the proposed fast leave-one-out cross-validation strategy to the training set in each layer.
We compared the proposed classifiers DTA-LS-SVM and iDTA-LS-SVM with the traditional LS-SVM and SVM using additive kernels on seven public UCI datasets and one real-world dataset. The experimental results show that both DTA-LS-SVM and iDTA-LS-SVM exhibit better generalization performance and faster learning speed.
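The machinery described in the abstract (closed-form LS-SVM training, closed-form leave-one-out residuals, and layer-by-layer stacking in which each layer's prediction becomes an extra input feature) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the histogram-intersection kernel, the sigmoid squashing of the stacked feature, the function names, and the omission of the inter-layer transfer regularizer and the class-imbalance compensation are all assumptions of this sketch.

```python
import numpy as np

def additive_kernel(X1, X2):
    """Histogram-intersection kernel, one common additive kernel:
    K(x, z) = sum_d min(x_d, z_d). Assumes nonnegative features."""
    return np.minimum(X1[:, None, :], X2[None, :, :]).sum(axis=2)

def kkt_matrix(X, y, gamma):
    """Bordered KKT matrix of the LS-SVM dual problem:
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = additive_kernel(X, X) + np.eye(n) / gamma
    return A

def train_ls_svm(X, y, gamma=1.0):
    """Closed-form LS-SVM training: a single linear solve."""
    sol = np.linalg.solve(kkt_matrix(X, y, gamma), np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def predict_ls_svm(X_train, b, alpha, X_test):
    return additive_kernel(X_test, X_train) @ alpha + b

def loo_residuals(X, y, gamma=1.0):
    """Exact leave-one-out residuals y_i - f_{-i}(x_i) without retraining:
    r_i = alpha_i / (A^{-1})_{i+1, i+1}, a standard identity obtained from
    the Schur complement of the bordered KKT matrix. This is the kind of
    fast LOO model selection the abstract refers to."""
    A_inv = np.linalg.inv(kkt_matrix(X, y, gamma))
    alpha = (A_inv @ np.concatenate(([0.0], y)))[1:]
    return alpha / np.diag(A_inv)[1:]

def deep_stack_predict(X, y, X_test, n_layers=3, gamma=1.0):
    """Stacked generalization: each layer's prediction is appended as an
    extra input feature for the next layer. The paper additionally couples
    adjacent layers with a transfer-learning term, omitted here."""
    squash = lambda p: 1.0 / (1.0 + np.exp(-p))  # keep the appended
    Xtr, Xte = X.copy(), X_test.copy()           # feature nonnegative
    for _ in range(n_layers):
        b, alpha = train_ls_svm(Xtr, y, gamma)
        p_tr = predict_ls_svm(Xtr, b, alpha, Xtr)
        p_te = predict_ls_svm(Xtr, b, alpha, Xte)
        Xtr = np.hstack([Xtr, squash(p_tr)[:, None]])
        Xte = np.hstack([Xte, squash(p_te)[:, None]])
    return np.sign(p_te)  # labels from the deepest layer
```

Because each layer is trained in closed form, a grid search over `gamma` per layer only costs one matrix inversion per candidate value, with `loo_residuals` providing the exact leave-one-out error for that candidate.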
Pages: 1527-1540
Page count: 14
Related Papers
50 records total
  • [1] Deep Additive Least Squares Support Vector Machines for Classification with Model Transfer
    Wang, Guanjin
    Zhang, Guangquan
    Choi, Kup-Sze
    Lu, Jie
    [J]. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 2019, 49 (07) : 1527 - 1540
  • [2] Additive survival least-squares support vector machines
    Van Belle, V.
    Pelckmans, K.
    Suykens, J. A. K.
    Van Huffel, S.
    [J]. STATISTICS IN MEDICINE, 2010, 29 (02) : 296 - 308
  • [3] Least squares twin support vector machines for pattern classification
    Kumar, M. Arun
    Gopal, M.
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2009, 36 (04) : 7535 - 7543
  • [4] Multi-View Least Squares Support Vector Machines Classification
    Houthuys, Lynn
    Langone, Rocco
    Suykens, Johan A. K.
    [J]. NEUROCOMPUTING, 2018, 282 : 78 - 88
  • [5] Subspace Based Least Squares Support Vector Machines for Pattern Classification
    Kitamura, Takuya
    Abe, Shigeo
    Fukui, Kazuhiro
    [J]. IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009, : 1275 - +
  • [6] Efficient sparse least squares support vector machines for pattern classification
    Tian, Yingjie
    Ju, Xuchan
    Qi, Zhiquan
    Shi, Yong
    [J]. COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2013, 66 (10) : 1935 - 1947
  • [7] A kind of fuzzy least squares support vector machines for pattern classification
    Chen, SW
    Xu, Y
    [J]. APPLIED COMPUTATIONAL INTELLIGENCE, 2004, : 308 - 313
  • [8] Weighted Least Squares Twin Support Vector Machines for Pattern Classification
    Chen, Jing
    Ji, Guangrong
    [J]. 2010 2ND INTERNATIONAL CONFERENCE ON COMPUTER AND AUTOMATION ENGINEERING (ICCAE 2010), VOL 2, 2010, : 242 - 246
  • [9] Fuzzy least squares support vector machines
    Tsujinishi, D
    Abe, S
    [J]. PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS 2003, VOLS 1-4, 2003, : 1599 - 1604
  • [10] Digital Least Squares Support Vector Machines
    Davide Anguita
    Andrea Boni
    [J]. Neural Processing Letters, 2003, 18 : 65 - 72