Self-distillation and self-supervision for partial label learning

Cited: 6
Authors
Yu, Xiaotong [1 ,3 ,4 ]
Sun, Shiding [1 ,3 ,4 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] UCAS, MOE Social Sci Lab Digital Econ Forecasts & Polic, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge distillation; Self-supervised learning; Partial label learning; Machine learning;
DOI
10.1016/j.patcog.2023.110016
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL) copes with the situation where each sample is associated with a set of ambiguous candidate labels that contains the unknown true label. The primary difficulty of PLL lies in label ambiguity; most existing research focuses on knowledge from individual instances while ignoring the importance of cross-sample knowledge. To circumvent this difficulty, a multi-task framework is proposed in this work that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is used to refine the ensembled soft targets that supervise the distillation, without resorting to multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides an additional supervisory signal for feature learning. Overall, training supervision is constructed not only from the input data itself but also from other instances in the same batch. Empirical results on benchmark datasets show that the method is effective at learning from partially labeled data.
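The abstract's three ingredients can be sketched roughly as follows: a candidate-set loss for PLL, batch-wise ensembled soft targets for self-distillation, and a four-way rotation-prediction loss for self-supervision. This is a minimal NumPy illustration under stated assumptions — the similarity weighting, mixing coefficient, temperature, and function names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def pll_loss(logits, candidate_mask):
    """Partial-label loss: maximize the probability mass that the model
    assigns to each sample's candidate-label set (binary mask, shape B x C)."""
    p = softmax(logits)
    p_cand = (p * candidate_mask).sum(axis=1)
    return -np.log(p_cand + 1e-12).mean()

def batch_soft_targets(logits, candidate_mask, temperature=2.0):
    """Cross-sample soft targets: mix each sample's tempered prediction with
    a similarity-weighted average of the other samples' predictions in the
    same batch, then renormalize over the candidate set. No extra teacher
    network is needed -- the batch itself supplies the ensemble."""
    p = softmax(logits / temperature)
    sim = p @ p.T                      # crude prediction-similarity matrix
    np.fill_diagonal(sim, 0.0)         # exclude self-similarity
    w = sim / (sim.sum(axis=1, keepdims=True) + 1e-12)
    mixed = 0.5 * p + 0.5 * (w @ p)    # own knowledge + cross-sample knowledge
    mixed = mixed * candidate_mask     # keep mass on candidate labels only
    return mixed / (mixed.sum(axis=1, keepdims=True) + 1e-12)

def distill_loss(student_logits, soft_targets):
    """Cross-entropy of the student's prediction against the refined targets."""
    logp = np.log(softmax(student_logits) + 1e-12)
    return -(soft_targets * logp).sum(axis=1).mean()

def rotation_loss(rot_logits, rot_labels):
    """Auxiliary self-supervision: 4-way classification of the rotation
    (0/90/180/270 degrees) applied to each image."""
    p = softmax(rot_logits)
    return -np.log(p[np.arange(len(rot_labels)), rot_labels] + 1e-12).mean()
```

A training step would then minimize a weighted sum such as `pll_loss + lam_d * distill_loss + lam_r * rotation_loss`, with the trade-off weights `lam_d` and `lam_r` treated as hyperparameters.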
Pages: 11