Knowledge Distillation Meets Open-Set Semi-supervised Learning

Cited by: 0
Authors
Yang, Jing [1]
Zhu, Xiatian [2,3]
Bulat, Adrian [2]
Martinez, Brais [2]
Tzimiropoulos, Georgios [2,4]
Affiliations
[1] Univ Nottingham, Nottingham, England
[2] Samsung AI Ctr, Cambridge, England
[3] Univ Surrey, Guildford, England
[4] Queen Mary Univ London, London, England
Keywords
Knowledge distillation; Structured representational knowledge; Open-set semi-supervised learning; Out-of-distribution
DOI
10.1007/s11263-024-02192-7
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Existing knowledge distillation methods mostly focus on distilling the teacher's predictions and intermediate activations. However, structured representation, arguably one of the most critical ingredients of deep models, is largely overlooked. In this work, we propose a novel semantic representational distillation (SRD) method dedicated to distilling representational knowledge semantically from a pretrained teacher to a target student. The key idea is to leverage the teacher's classifier as a semantic critic that evaluates the representations of both teacher and student and distills semantic knowledge with high-order structured information over all feature dimensions. This is accomplished by introducing the notion of a cross-network logit, computed by passing the student's representation through the teacher's classifier. Further, viewing the set of seen classes as a combinatorial basis for the semantic space, we scale SRD to unseen classes, enabling effective exploitation of widely available, arbitrary unlabeled training data. At the problem level, this establishes an interesting connection between knowledge distillation and open-set semi-supervised learning (SSL). Extensive experiments show that our SRD significantly outperforms previous state-of-the-art knowledge distillation methods on both coarse object classification and fine-grained face recognition tasks, as well as on the less studied yet practically crucial task of binary network distillation. Under the more realistic open-set SSL settings we introduce, we reveal that knowledge distillation is generally more effective than existing out-of-distribution sample detection, and that our proposed SRD is superior to both previous distillation and SSL competitors. The source code is available at https://github.com/jingyang2017/SRD_ossl.
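The cross-network logit at the heart of SRD admits a compact implementation: the student's representation is projected into the teacher's feature space, scored by the teacher's frozen classifier, and the resulting logits are matched against the teacher's own. Below is a minimal PyTorch sketch of this idea; the connector projection, the class and argument names, and the temperature-scaled KL matching loss are illustrative assumptions rather than the authors' exact implementation (see the linked repository for that).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SRDLoss(nn.Module):
    """Sketch of cross-network logit distillation: the student's
    representation is scored by the teacher's classifier and matched
    against the teacher's own logits. Connector and loss form are
    assumptions, not the paper's exact design."""

    def __init__(self, student_dim, teacher_dim, temperature=4.0):
        super().__init__()
        # Hypothetical linear connector aligning the student's feature
        # dimension with the teacher's.
        self.connector = nn.Linear(student_dim, teacher_dim)
        self.t = temperature

    def forward(self, student_feat, teacher_feat, teacher_classifier):
        # Teacher logits from its own representation serve as targets.
        with torch.no_grad():
            t_logits = teacher_classifier(teacher_feat)
        # Cross-network logits: the teacher's classifier acts as a
        # semantic critic of the student's representation. The teacher's
        # parameters are assumed frozen (requires_grad=False), so only
        # the student and the connector receive gradients here.
        s_logits = teacher_classifier(self.connector(student_feat))
        # Soften both distributions and match them with KL divergence,
        # scaled by T^2 as in standard knowledge distillation.
        return F.kl_div(
            F.log_softmax(s_logits / self.t, dim=1),
            F.softmax(t_logits / self.t, dim=1),
            reduction="batchmean",
        ) * (self.t ** 2)

Note that this objective uses no ground-truth labels, so it can be applied unchanged to arbitrary unlabeled images, including out-of-distribution samples; treating the seen-class logits as a combinatorial basis of the semantic space is what lets SRD scale to unseen classes and is the connection to open-set SSL that the abstract draws.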
Pages: 315-334 (20 pages)