A Self-Paced Regularization Framework for Partial-Label Learning

Cited by: 29
Authors
Lyu, Gengyu [1 ,2 ]
Feng, Songhe [1 ,2 ]
Wang, Tao [1 ,2 ]
Lang, Congyan [1 ,2 ]
Affiliations
[1] Beijing Jiaotong Univ, Beijing Key Lab Traff Data Anal & Min, Beijing 100044, Peoples R China
[2] Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China;
Keywords
Phase locked loops; Training; Optimization; Complexity theory; Training data; Silicon; Disambiguation; maximum margin; partial-label learning (PLL); self-paced regime;
DOI
10.1109/TCYB.2020.2990908
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Partial-label learning (PLL) aims to solve the problem where each training instance is associated with a set of candidate labels, only one of which is the correct label. Most PLL algorithms try to disambiguate the candidate label set, either by simply treating each candidate label equally or by iteratively identifying the true label. Nonetheless, existing algorithms usually treat all labels and instances equally and do not take the varying complexity of labels and instances into account during the learning stage. Inspired by the successful application of the self-paced learning strategy in the machine-learning field, we integrate the self-paced regime into the PLL framework and propose a novel self-paced PLL (SP-PLL) algorithm, which controls the learning process by ranking the priorities of the training examples together with their candidate labels during each learning iteration. Extensive experiments and comparisons with other baseline methods demonstrate the effectiveness and robustness of the proposed method.
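To make the self-paced regime concrete, the following Python snippet is a minimal sketch of the general idea, not the paper's exact SP-PLL formulation: it assumes a linear squared-loss model, a hard self-paced regularizer (an example participates in an update only when its current loss falls below an "age" threshold that grows each round), and a simple max-score disambiguation step inside each candidate set. The function name self_paced_pll and the parameters lam, growth, and lr are illustrative, not from the paper.

import numpy as np

def self_paced_pll(X, candidate_sets, n_labels, n_iters=20,
                   lam=1.0, growth=1.3, lr=0.01, seed=0):
    # X:              (n_samples, n_features) feature matrix
    # candidate_sets: list of candidate-label index lists, one per example
    # lam:            self-paced "age" parameter (loss threshold), grown each round
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = 0.01 * rng.standard_normal((d, n_labels))  # linear model, one column per class

    for _ in range(n_iters):
        scores = X @ W                             # (n, n_labels) class scores

        # Disambiguation: within each candidate set, take the currently
        # best-scoring label as the tentative ground truth.
        targets = np.array([cands[int(np.argmax(scores[i, cands]))]
                            for i, cands in enumerate(candidate_sets)])
        Y = np.zeros((n, n_labels))
        Y[np.arange(n), targets] = 1.0
        losses = ((scores - Y) ** 2).sum(axis=1)   # per-example squared loss

        # Hard self-paced weights: only "easy" examples (loss below the
        # age parameter) take part in this round's model update.
        v = (losses < lam).astype(float)

        # Gradient step on the example-weighted loss.
        grad = X.T @ (v[:, None] * (scores - Y)) / max(v.sum(), 1.0)
        W -= lr * grad

        # Relax the threshold so harder examples enter in later rounds.
        lam *= growth

    return W

Each outer iteration thus alternates between disambiguating the candidate sets, selecting easy examples via the current threshold, and updating the model, with the threshold relaxed so that harder examples are admitted as training proceeds.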
Pages: 899-911
Page count: 13
Related Papers
50 records in total
  • [31] SELF-PACED LEARNING IN CIVIL ET
    SHARPLES, K
    ENGINEERING EDUCATION, 1977, 67 (08): 797 - 798
  • [32] SELF-PACED LEARNING AND STUDENT MOTIVATION
    MCCOLLOM, KA
    ENGINEERING EDUCATION, 1974, 64 (06): 427 - 429
  • [33] Towards Effective Visual Representations for Partial-Label Learning
    Xia, Shiyu
    Lv, Jiaqi
    Xu, Ning
    Niu, Gang
    Geng, Xin
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023: 15589 - 15598
  • [34] A bi-level metric learning framework via self-paced learning weighting
    Yan, Jing
    Wei, Wei
    Guo, Xinyao
    Dang, Chuangyin
    Liang, Jiye
    PATTERN RECOGNITION, 2023, 139
  • [35] Adaptive graph nonnegative matrix factorization with the self-paced regularization
    Yang, Xuanhao
    Che, Hangjun
    Leung, Man-Fai
    Liu, Cheng
    APPLIED INTELLIGENCE, 2023, 53 (12) : 15818 - 15835
  • [36] Can Label-Specific Features Help Partial-Label Learning?
    Dong, Ruo-Jing
    Hang, Jun-Yi
    Wei, Tong
    Zhang, Min-Ling
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 6, 2023: 7432 - 7440
  • [37] SelectNet: Self-paced learning for high-dimensional partial differential equations
    Gu, Yiqi
    Yang, Haizhao
    Zhou, Chao
    JOURNAL OF COMPUTATIONAL PHYSICS, 2021, 441
  • [38] Adaptive graph nonnegative matrix factorization with the self-paced regularization
    Yang, Xuanhao
    Che, Hangjun
    Leung, Man-Fai
    Liu, Cheng
    APPLIED INTELLIGENCE, 2023, 53 (12) : 15818 - 15835
  • [39] Cost-Sensitive Self-Paced Learning With Adaptive Regularization for Classification of Image Time Series
    Li, Hao
    Li, Jianzhao
    Zhao, Yue
    Gong, Maoguo
    Zhang, Yujing
    Liu, Tongfei
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2021, 14 : 11713 - 11727
  • [40] The Efficacy of Self-Paced Study in Multitrial Learning
    de Jonge, Mario
    Tabbers, Huib K.
    Pecher, Diane
    Jang, Yoonhee
    Zeelenberg, Rene
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-LEARNING MEMORY AND COGNITION, 2015, 41 (03) : 851 - 858