Self-distillation and self-supervision for partial label learning

Cited by: 6
Authors
Yu, Xiaotong [1 ,3 ,4 ]
Sun, Shiding [1 ,3 ,4 ]
Tian, Yingjie [2 ,3 ,4 ,5 ]
Affiliations
[1] Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
[2] Univ Chinese Acad Sci, Sch Econ & Management, Beijing 100190, Peoples R China
[3] Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
[4] Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
[5] UCAS, MOE Social Sci Lab Digital Econ Forecasts & Polic, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Knowledge distillation; Self-supervised learning; Partial label learning; Machine learning;
DOI
10.1016/j.patcog.2023.110016
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL) copes with the situation where each sample corresponds to a set of ambiguous candidate labels containing the unknown true label. The primary difficulty of PLL lies in this label ambiguity; most existing research focuses on individual-instance knowledge while ignoring the importance of cross-sample knowledge. To circumvent this difficulty, an innovative multi-task framework is proposed in this work that integrates self-supervision and self-distillation to tackle the PLL problem. Specifically, in the self-distillation task, cross-sample knowledge within the same batch is used to refine ensembled soft targets that supervise the distillation operation, without resorting to multiple networks. An auxiliary self-supervised task of recognizing rotation transformations of images provides an additional supervisory signal for feature learning. Overall, training supervision is constructed not only from the input data itself but also from other instances within the same batch. Empirical results on benchmark datasets show that this method is effective in learning from partially labeled data.
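The abstract describes a three-part training objective: a partial-label loss over the candidate set, a self-distillation loss against soft targets ensembled from other samples in the batch, and a rotation-prediction self-supervised loss. The sketch below illustrates how such a combination could be wired together in PyTorch. It is a minimal sketch under stated assumptions, not the authors' implementation: the model interface (a shared encoder with a classification head and a 4-way rotation head), the helper rotate_batch, the candidate-overlap weighting used to build the batch-level soft targets, and the 0.5/0.1 loss weights are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def rotate_batch(images):
    """Apply a random 0/90/180/270-degree rotation to each (C, H, W) image.

    Assumes square images; returns the rotated batch and the rotation labels.
    """
    ks = torch.randint(0, 4, (images.size(0),), device=images.device)
    rotated = torch.stack([torch.rot90(img, int(k), dims=(1, 2))
                           for img, k in zip(images, ks)])
    return rotated, ks


def pll_multitask_loss(model, images, candidate_mask, temperature=4.0):
    """Hypothetical combined loss: PLL + batch-level self-distillation + rotation SSL.

    `model(x)` is assumed to return (class_logits, rotation_logits);
    `candidate_mask` is a (batch, num_classes) 0/1 tensor marking candidate labels.
    """
    logits, _ = model(images)
    probs = F.softmax(logits, dim=1)

    # 1) Partial-label loss: push probability mass onto the candidate label set.
    cand_prob = (probs * candidate_mask).sum(dim=1).clamp_min(1e-8)
    loss_pll = -cand_prob.log().mean()

    # 2) Self-distillation: soft targets ensembled from other samples in the
    #    same batch, weighted here by candidate-set overlap (an assumption).
    with torch.no_grad():
        sim = candidate_mask.float() @ candidate_mask.float().t()
        sim.fill_diagonal_(0)
        weights = sim / sim.sum(dim=1, keepdim=True).clamp_min(1e-8)
        soft_targets = F.softmax((weights @ logits) / temperature, dim=1)
    loss_distill = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                            soft_targets, reduction="batchmean") * temperature ** 2

    # 3) Self-supervision: predict which of the 4 rotations was applied.
    rot_images, rot_labels = rotate_batch(images)
    _, rot_logits = model(rot_images)
    loss_rot = F.cross_entropy(rot_logits, rot_labels)

    # Illustrative weighting of the three terms.
    return loss_pll + 0.5 * loss_distill + 0.1 * loss_rot
```

The sketch only shows how batch-level soft targets and rotation prediction can supplement the partial-label loss from a single network; the paper's actual weighting, target refinement, and scheduling should be taken from the original source.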
Pages: 11
Related papers
50 records in total
  • [21] Improving Spatiotemporal Self-supervision by Deep Reinforcement Learning
    Buechler, Uta
    Brattoli, Biagio
    Ommer, Bjoern
    COMPUTER VISION - ECCV 2018, PT 15, 2018, 11219 : 797 - 814
  • [22] Few-shot Learning with Online Self-Distillation
    Liu, Sihan
    Wang, Yue
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW 2021), 2021, : 1067 - 1070
  • [23] CoLES: Contrastive Learning for Event Sequences with Self-Supervision
    Babaev, Dmitrii
    Ovsov, Nikita
    Kireev, Ivan
    Ivanova, Maria
    Gusev, Gleb
    Nazarov, Ivan
    Tuzhilin, Alexander
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA (SIGMOD '22), 2022, : 1190 - 1199
  • [24] Self-supervision, surveillance and transgression
    Simon, Gail
    JOURNAL OF FAMILY THERAPY, 2010, 32 (03) : 308 - 325
  • [25] Anomalies, representations, and self-supervision
    Dillon, Barry M.
    Favaro, Luigi
    Feiden, Friedrich
    Modak, Tanmoy
    Plehn, Tilman
    SCIPOST PHYSICS CORE, 2024, 7 (03):
  • [26] Probabilistic online self-distillation
    Tzelepi, Maria
    Passalis, Nikolaos
    Tefas, Anastasios
    NEUROCOMPUTING, 2022, 493 : 592 - 604
  • [27] Self-distillation improves self-supervised learning for DNA sequence inference
    Yu, Tong
    Cheng, Lei
    Khalitov, Ruslan
    Olsson, Erland B.
    Yang, Zhirong
    NEURAL NETWORKS, 2025, 183
  • [28] Efficient One Pass Self-distillation with Zipf's Label Smoothing
    Liang, Jiajun
    Li, Linze
    Bing, Zhaodong
    Zhao, Borui
    Tang, Yao
    Lin, Bo
    Fan, Haoqiang
    COMPUTER VISION, ECCV 2022, PT XI, 2022, 13671 : 104 - 119
  • [29] Symmetries, safety, and self-supervision
    Dillon, Barry M.
    Kasieczka, Gregor
    Olischlaeger, Hans
    Plehn, Tilman
    Sorrenson, Peter
    Vogel, Lorenz
    SCIPOST PHYSICS, 2022, 12 (06):
  • [30] Self-Supervision: Psychodynamic Strategies
    Brenner, Ira
    JOURNAL OF THE AMERICAN PSYCHOANALYTIC ASSOCIATION, 2024, 72 (02)