The aLS-SVM based multi-task learning classifiers

Cited by: 0
Authors
Liyun Lu
Qiang Lin
Huimin Pei
Ping Zhong
Affiliation
[1] China Agricultural University,College of Science
Source
Applied Intelligence | 2018 / Vol. 48
Keywords
Multi-task learning; Support vector machine; Asymmetric least squared loss;
DOI
Not available
Abstract
Multi-task learning support vector machines (SVMs) have recently attracted considerable attention, since conventional single-task learning methods usually ignore the relatedness among multiple related tasks and train them separately. Unlike single-task learning, multi-task learning methods can capture the correlation among tasks and achieve improved performance by training all tasks simultaneously. In this paper, we make two assumptions on the relatedness among tasks: first, that the normal vectors of the related tasks share a certain common parameter value; second, that the models of the related tasks are close enough to share a common model. Under these assumptions, we propose two multi-task learning methods for binary classification, named MTL-aLS-SVM I and MTL-aLS-SVM II, which take full advantage of multi-task learning and the asymmetric least squared loss. MTL-aLS-SVM I seeks a trade-off between the maximal expectile distance for each task model and the closeness of each task model to the averaged model. MTL-aLS-SVM II extends MTL-aLS-SVM I and can use different kernel functions for different tasks. Both can be easily implemented by solving quadratic programming problems. In addition, we develop their special cases, which include the L2-SVM based multi-task learning methods (MTL-L2-SVM I and MTL-L2-SVM II) and the least squares SVM (LS-SVM) based multi-task learning methods (MTL-LS-SVM I and MTL-LS-SVM II). Although MTL-L2-SVM II and MTL-LS-SVM II appear as special cases, they are proposed here for the first time. Experimental results show that the proposed methods are very encouraging.
Pages: 2393-2407
Page count: 14
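The asymmetric least squared (expectile) loss named in the abstract can be illustrated with a minimal sketch. The weighting scheme below, with expectile parameter `p` weighting positive residuals by `p` and negative ones by `1 - p`, follows the general expectile-loss form from the literature; it is an assumption for illustration, not the paper's exact formulation.

```python
import numpy as np

def asymmetric_ls_loss(u, p=0.7):
    """Asymmetric least squared (expectile) loss sketch.

    Positive residuals are penalized with weight p, negative residuals
    with weight 1 - p; p = 0.5 recovers the symmetric squared loss used
    by LS-SVM. Illustrative only -- see the paper for the exact loss.
    """
    u = np.asarray(u, dtype=float)
    weights = np.where(u >= 0, p, 1.0 - p)
    return weights * u**2

# Example residuals, e.g. u = 1 - y * f(x) in a classification setting
losses = asymmetric_ls_loss([-1.0, 0.5, 2.0], p=0.7)
print(losses)
```

With `p > 0.5`, violations on the positive side of the margin are penalized more heavily, which is what gives the loss its expectile (rather than mean) interpretation.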