Self-distillation and self-supervision for partial label learning

Cited by: 6
Authors
Yu, Xiaotong [1 ,3 ,4 ]
Sun, Shiding [1 ,3 ,4 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] UCAS, MOE Social Sci Lab Digital Econ Forecasts & Polic, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge distillation; Self-supervised learning; Partial label learning; Machine learning;
DOI
10.1016/j.patcog.2023.110016
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL) copes with the situation where each sample corresponds to a set of ambiguous candidate labels containing the unknown true label. The primary difficulty of PLL lies in label ambiguity; most existing research focuses on knowledge from individual instances while ignoring the importance of cross-sample knowledge. To circumvent this difficulty, an innovative multi-task framework is proposed in this work that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is used to refine ensembled soft targets that supervise the distillation operation, without resorting to multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides additional supervisory signals for feature learning. Overall, training supervision is constructed not only from the input data itself but also from other instances within the same batch. Empirical results on benchmark datasets show that the method is effective in learning from partially labeled data.
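The abstract describes a three-term training objective: a classification loss over the candidate label set, a self-distillation loss toward soft targets refined with batch-level cross-sample knowledge, and an auxiliary rotation-prediction loss. The sketch below is illustrative only and is not the authors' implementation; the helper names, the 0.5 blending weight in the refinement rule, and the temperature `t` are all assumptions:

```python
import torch
import torch.nn.functional as F

def pll_loss(logits, candidate_mask):
    """Classification loss restricted to the candidate set
    (candidate_mask is 1 for candidate labels, 0 otherwise)."""
    probs = torch.softmax(logits, dim=1) * candidate_mask
    return -torch.log(probs.sum(dim=1).clamp_min(1e-8)).mean()

def batch_refined_targets(logits, candidate_mask):
    """Toy cross-sample refinement: blend each sample's prediction with
    the batch mean, then renormalize over the candidate labels."""
    probs = torch.softmax(logits, dim=1)
    mixed = 0.5 * probs + 0.5 * probs.mean(dim=0, keepdim=True)
    mixed = mixed * candidate_mask
    return mixed / mixed.sum(dim=1, keepdim=True).clamp_min(1e-8)

def total_loss(logits, rot_logits, rot_labels, candidate_mask, t=2.0):
    """PLL loss + self-distillation toward batch-refined soft targets
    + auxiliary rotation-prediction cross-entropy."""
    targets = batch_refined_targets(logits.detach(), candidate_mask)
    distill = -(targets * F.log_softmax(logits / t, dim=1)).sum(dim=1).mean()
    rotation = F.cross_entropy(rot_logits, rot_labels)
    return pll_loss(logits, candidate_mask) + distill + rotation
```

Detaching the logits before building the soft targets keeps the "teacher" side of the distillation fixed within each step, so only the "student" branch receives gradients, which is the usual convention in single-network self-distillation.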
Pages: 11