Self-Knowledge Distillation from Target-Embedding AutoEncoder for Multi-Label Classification

Cited by: 1
Authors
Pan, Qizheng [1 ]
Yan, Ming [2 ]
Li, Guoqi [3 ]
Li, Jianmin [1 ]
Ma, Ying [4 ]
Affiliations
[1] Xiamen Univ Technol, Coll Comp & Informat Engn, Xiamen, Peoples R China
[2] Agcy Sci Technol & Res, Inst High Performance Comp, Singapore, Singapore
[3] Chinese Acad Sci, Inst Automat, Beijing, Peoples R China
[4] Harbin Inst Technol, Fac Comp, Harbin, Peoples R China
Source
2022 IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG) | 2022
Funding
National Natural Science Foundation of China
Keywords
multi-label classification; target-embedding autoencoder; self-knowledge distillation;
DOI
10.1109/ICKG55886.2022.00034
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The Target-Embedding Autoencoder (TEA) has been successfully applied to Multi-Label Classification (MLC), where each instance is associated with multiple labels. However, most existing TEA-based approaches focus mainly on latent-space alignment in the encoding phase, ignoring the output bias induced by overfitting during training. To address this issue, we propose a new approach named Self-Knowledge Distillation from TEA (SKDTEA), which replaces the latent-space alignment of TEA-based solutions with self-knowledge distillation in a simple yet effective manner. Unlike conventional self-knowledge distillation in multi-class learning, SKDTEA exploits the relationship between label smoothing and knowledge distillation: an auxiliary module is designed to reconstruct the ground-truth targets, and its recovered outputs serve as knowledge in a learned multi-label smoothing manner. The whole distillation process acts as an efficient regularizer that alleviates overfitting during training. To the best of our knowledge, this is the first attempt to introduce self-knowledge distillation into TEA-based approaches for MLC. Experimental results demonstrate that the proposed method significantly outperforms well-established MLC approaches.
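The abstract gives only a high-level description. The following is a minimal, hypothetical PyTorch sketch of the general idea it describes (an auxiliary label autoencoder whose reconstructions of the ground-truth targets act as smoothed soft targets for self-distillation); it is not the authors' implementation, and the names (`SKDTEA`, `feature_net`, `skdtea_loss`), the BCE-based losses, and the weighting `alpha` are all assumptions for illustration.

```python
# Hypothetical sketch of the SKDTEA idea from the abstract (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SKDTEA(nn.Module):
    def __init__(self, in_dim, n_labels, hid=256, emb=64):
        super().__init__()
        # Main branch: maps instance features to label logits.
        self.feature_net = nn.Sequential(
            nn.Linear(in_dim, hid), nn.ReLU(), nn.Linear(hid, n_labels))
        # Auxiliary target autoencoder: reconstructs ground-truth label
        # vectors; its soft reconstructions serve as "knowledge".
        self.label_enc = nn.Sequential(nn.Linear(n_labels, emb), nn.ReLU())
        self.label_dec = nn.Linear(emb, n_labels)

    def forward(self, x, y=None):
        logits = self.feature_net(x)
        if y is None:                                  # inference path
            return torch.sigmoid(logits)
        recon = self.label_dec(self.label_enc(y))      # learned smoothing of y
        return logits, recon

def skdtea_loss(logits, recon, y, alpha=0.5):
    # Hard-target loss on the main branch (y is a 0/1 float tensor).
    cls = F.binary_cross_entropy_with_logits(logits, y)
    # Auxiliary branch learns to reconstruct the ground-truth targets.
    rec = F.binary_cross_entropy_with_logits(recon, y)
    # Self-distillation: main branch matches the smoothed reconstruction;
    # the teacher is detached so knowledge flows one way.
    soft = torch.sigmoid(recon).detach()
    kd = F.binary_cross_entropy_with_logits(logits, soft)
    return cls + rec + alpha * kd

if __name__ == "__main__":
    model = SKDTEA(in_dim=300, n_labels=20)
    x = torch.randn(8, 300)
    y = torch.randint(0, 2, (8, 20)).float()
    logits, recon = model(x, y)
    skdtea_loss(logits, recon, y).backward()
```

Detaching the reconstruction before the distillation term keeps knowledge flowing one way, so under these assumptions the auxiliary branch acts purely as a regularizing teacher rather than being pulled toward the student's errors.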
Pages: 210-216
Page count: 7