ONLINE MULTI-LABEL LEARNING WITH ACCELERATED NONSMOOTH STOCHASTIC GRADIENT DESCENT

Cited: 0
Authors
Park, Sunho [1 ]
Choi, Seungjin [1 ]
Affiliations
[1] POSTECH, Dept Comp Sci & Engn, Pohang, South Korea
Keywords
Label ranking; multi-label learning; Nesterov's method; nonsmooth minimization; stochastic gradient descent; algorithms
DOI: not available
Chinese Library Classification: O42 [Acoustics]
Discipline codes: 070206; 082403
Abstract
Multi-label learning refers to methods that learn a set of functions assigning a set of relevant labels to each instance. One popular approach to multi-label learning is label ranking, in which a set of ranking functions is learned to order all labels so that relevant labels are ranked higher than irrelevant ones. Rank-SVM is a representative label-ranking method in which the ranking loss is minimized in the max-margin framework. However, the dual form of Rank-SVM involves a quadratic program that is generally solved in time cubic in the size of the training data. The primal form is appealing for developing online learning but involves a nonsmooth convex loss function. In this paper we present a method for online multi-label learning in which we minimize the primal form using accelerated nonsmooth stochastic gradient descent, which was recently developed to extend Nesterov's smoothing method to the stochastic setting. Numerical experiments on several large-scale datasets demonstrate the computational efficiency and fast convergence of our proposed method compared to existing methods, including subgradient-based algorithms.
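To make the objective in the abstract concrete, here is a minimal sketch of the pairwise hinge ranking loss that Rank-SVM minimizes, optimized with a plain stochastic subgradient baseline rather than the paper's accelerated nonsmooth SGD (which additionally applies Nesterov-style smoothing). All function names, the learning rate, and the regularization constant are hypothetical choices for illustration, not the authors' implementation.

```python
import numpy as np

def pairwise_ranking_loss_grad(W, x, relevant, n_labels):
    """Subgradient of the Rank-SVM pairwise hinge ranking loss for one instance.
    W: (n_labels, dim) per-label weight vectors; relevant: set of relevant labels.
    The loss averages max(0, 1 - (w_p - w_q) . x) over relevant/irrelevant pairs (p, q)."""
    irrelevant = [q for q in range(n_labels) if q not in relevant]
    G = np.zeros_like(W)
    norm = max(len(relevant) * len(irrelevant), 1)
    loss = 0.0
    for p in relevant:
        for q in irrelevant:
            margin = 1.0 - (W[p] - W[q]) @ x
            if margin > 0:          # hinge active: nonzero subgradient
                loss += margin
                G[p] -= x
                G[q] += x
    return loss / norm, G / norm

def sgd_rank(X, Y_sets, n_labels, lr=0.1, epochs=5, lam=1e-3, seed=0):
    """Online pass over instances with L2-regularized subgradient steps."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n_labels, X.shape[1]))
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            _, G = pairwise_ranking_loss_grad(W, X[i], Y_sets[i], n_labels)
            W -= lr * (G + lam * W)
    return W
```

On separable toy data this drives relevant-label scores above irrelevant-label scores; the paper's contribution is replacing this O(1/sqrt(T))-rate subgradient step with an accelerated scheme for the same nonsmooth primal objective.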
Pages: 3322-3326 (5 pages)
Related papers (showing 10 of 50)
  • [1] Online Metric Learning for Multi-Label Classification
    Gong, Xiuwen
    Yuan, Dong
    Bao, Wei
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4012 - 4019
  • [2] Learning Gradient Boosted Multi-label Classification Rules
    Rapp, Michael
    Mencia, Eneldo Loza
    Fuernkranz, Johannes
    Nguyen, Vu-Linh
    Huellermeier, Eyke
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT III, 2021, 12459 : 124 - 140
  • [3] Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning
    Yang, Zhenhuan
    Lei, Yunwen
    Wang, Puyu
    Yang, Tianbao
    Ying, Yiming
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] Multi-label Disengagement and Behavior Prediction in Online Learning
    Verma, Manisha
    Nakashima, Yuta
    Takemura, Noriko
    Nagahara, Hajime
    [J]. ARTIFICIAL INTELLIGENCE IN EDUCATION, PT I, 2022, 13355 : 633 - 639
  • [5] Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses
    Bassily, Raef
    Feldman, Vitaly
    Guzman, Cristobal
    Talwar, Kunal
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [6] Accelerated Randomized Coordinate Descent Algorithms for Stochastic Optimization and Online Learning
    Bhandari, Akshita
    Singh, Chandramani
    [J]. LEARNING AND INTELLIGENT OPTIMIZATION, LION 12, 2019, 11353 : 1 - 15
  • [7] LEARNING BY ONLINE GRADIENT DESCENT
    BIEHL, M
    SCHWARZE, H
    [J]. JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 1995, 28 (03): 643 - 656
  • [8] Conditional Accelerated Lazy Stochastic Gradient Descent
    Lan, Guanghui
    Pokutta, Sebastian
    Zhou, Yi
    Zink, Daniel
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [9] Asynchronous Decentralized Accelerated Stochastic Gradient Descent
    Lan, Guanghui
    Zhou, Yi
    [J]. Institute of Electrical and Electronics Engineers Inc. (02): 802 - 811
  • [10] Learning Label Correlations for Multi-Label Online Passive Aggressive Classification Algorithm
    Zhang, Yongwei
    [J]. Wuhan University Journal of Natural Sciences, 2024, 29 (01) : 51 - 58