Solving large-scale support vector ordinal regression with asynchronous parallel coordinate descent algorithms

Cited by: 8
Authors
Gu, Bin [1 ,2 ]
Geng, Xiang [1 ]
Shi, Wanli [1 ]
Shan, Yingying [1 ]
Huang, Yufang [3 ]
Wang, Zhijie [4 ]
Zheng, Guansheng [1 ]
Institutions
[1] Nanjing Univ Informat Sci & Technol, Sch Comp & Software, Nanjing, Peoples R China
[2] JD Finance Amer Corp, East Hampton, NY USA
[3] eBay Inc, Shanghai, Peoples R China
[4] Invis AI, Toronto, ON, Canada
Keywords
Asynchronous parallel; Coordinate descent; Support vector; Ordinal regression; Kernel; Machine
DOI
10.1016/j.patcog.2020.107592
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Ordinal regression is one of the most influential tasks in supervised learning. Support vector ordinal regression (SVOR) is an appealing method for tackling ordinal regression problems. However, owing to the complexity of the SVOR formulation and the high cost of kernel computation, traditional SVOR solvers are inefficient for large-scale training. To address this problem, we first highlight a special SVOR formulation whose thresholds are described implicitly, so that its concise dual formulation can be solved by a state-of-the-art asynchronous parallel coordinate descent algorithm such as AsyGCD. To further accelerate SVOR training, we propose two novel asynchronous parallel coordinate descent algorithms, called AsyACGD and AsyORGCD. AsyACGD is an accelerated extension of AsyGCD that uses an active-set strategy. AsyORGCD is designed specifically for SVOR: it keeps the thresholds ordered throughout training and thus achieves good performance in less time. Experimental results on several large-scale ordinal regression datasets demonstrate the superiority of the proposed algorithms. (C) 2020 Elsevier Ltd. All rights reserved.
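The asynchronous parallel coordinate descent idea underlying these solvers can be illustrated on a generic box-constrained quadratic dual of the kind that arises for support vector machines. The sketch below is a minimal, Hogwild-style illustration under that assumption; it is not the paper's AsyGCD, AsyACGD, or AsyORGCD, and the function and parameter names are hypothetical. Multiple threads repeatedly pick a coordinate, compute its gradient from the (possibly stale) shared iterate, and write back the projected exact coordinate minimizer without locks.

```python
import numpy as np
from threading import Thread

def async_coordinate_descent(Q, C=1.0, n_threads=4, iters_per_thread=2000, seed=0):
    """Hogwild-style asynchronous coordinate descent on a toy
    box-constrained quadratic dual:
        min_a  0.5 * a^T Q a - sum(a)   s.t.  0 <= a_i <= C
    Illustrative sketch only -- not the paper's AsyGCD/AsyACGD/AsyORGCD.
    """
    n = Q.shape[0]
    alpha = np.zeros(n)  # shared iterate, updated without locks

    def worker(tid):
        rng = np.random.default_rng(seed + tid)
        for _ in range(iters_per_thread):
            i = rng.integers(n)
            # gradient of the objective w.r.t. coordinate i,
            # computed from a possibly stale view of alpha
            g = Q[i] @ alpha - 1.0
            # exact coordinate minimizer, projected onto the box [0, C]
            alpha[i] = min(max(alpha[i] - g / Q[i, i], 0.0), C)

    threads = [Thread(target=worker, args=(t,)) for t in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return alpha

# Toy positive-definite Q with positive diagonal (stands in for a kernel matrix).
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
Q = X @ X.T + np.eye(20)
alpha = async_coordinate_descent(Q)
obj = 0.5 * alpha @ Q @ alpha - alpha.sum()
```

Because coordinate updates touch only one entry of the shared iterate, lock-free execution with stale gradients still converges in practice on well-conditioned problems; the paper's contribution is making this style of update efficient and order-preserving for the SVOR dual specifically.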
Pages: 12
Related Papers
50 records in total
  • [1] Large-Scale Linear Support Vector Ordinal Regression Solver
    Shi, Yong
    Wang, Huadong
    Niu, Lingfeng
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOP (ICDMW), 2015, : 1177 - 1183
  • [2] Large-scale Nonparallel Support Vector Ordinal Regression Solver
    Wang, Huadong
    Miao, Jianyu
    Bamakan, Seyed Mojtaba Hosseini
    Niu, Lingfeng
    Shi, Yong
    [J]. INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE (ICCS 2017), 2017, 108 : 1261 - 1270
  • [3] Coordinate descent algorithms for large-scale SVDD
    Tao, Q.
    [J]. Science Press, 1600, (25)
  • [4] Large-scale support vector regression with budgeted stochastic gradient descent
    Xie, Zongxia
    Li, Yingda
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2019, 10 (06) : 1529 - 1541
  • [6] Asynchronous Parallel Large-Scale Gaussian Process Regression
    Dang, Zhiyuan
    Gu, Bin
    Deng, Cheng
    Huang, Heng
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (06) : 8683 - 8694
  • [7] Large-scale Linear Support Vector Regression
    Ho, Chia-Hua
    Lin, Chih-Jen
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 3323 - 3348
  • [8] L2-loss Large-scale Linear Nonparallel Support Vector Ordinal Regression
    Shi Y.
    Li P.-J.
    Wang H.-D.
    [J]. Zidonghua Xuebao/Acta Automatica Sinica, 2019, 45 (03): : 505 - 517
  • [9] Block coordinate descent algorithms for large-scale sparse multiclass classification
    Blondel, Mathieu
    Seki, Kazuhiro
    Uehara, Kuniaki
    [J]. MACHINE LEARNING, 2013, 93 (01) : 31 - 52