Image classification based on self-distillation

Cited by: 2
Authors
Li, Yuting [1 ]
Qing, Linbo [1 ]
He, Xiaohai [1 ]
Chen, Honggang [1 ]
Liu, Qiang [1 ]
Affiliations
[1] Sichuan Univ, Coll Elect & Informat Engn, 24 South Sect 1, Yihuan Rd, Chengdu 610065, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Image classification; Self-distillation; Attention; Fusion
DOI
10.1007/s10489-022-04008-y
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Convolutional neural networks have been widely used in a variety of application scenarios. To extend their use to areas where accuracy is critical, researchers have investigated deeper or broader network structures, which bring exponential growth in computation and storage costs as well as longer response times. In this paper, we propose a self-distillation image classification algorithm that significantly improves performance while reducing training costs. In traditional self-distillation, the student model lacks the guidance of a separate teacher model, so it must strengthen its own ability to capture global information and focus on key features. For this reason, we improve the traditional self-distillation algorithm with a positional attention module and a residual block with attention. Experimental results show that the method achieves better performance than traditional knowledge distillation methods and attention networks.
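For readers unfamiliar with the training setup the abstract refers to, the sketch below illustrates a typical self-distillation objective: the deepest classifier of a single network serves as the in-network teacher, and shallower auxiliary classifiers are trained with a cross-entropy term on the labels plus a softened KL-divergence term toward the teacher's outputs. This is a minimal illustration under assumed conventions, not the authors' implementation; the function name `self_distillation_loss`, the temperature `T`, and the weighting `alpha` are illustrative, and the paper's positional attention module and attentive residual block are not shown here.

```python
# Minimal self-distillation sketch (PyTorch); names and hyperparameters are
# illustrative assumptions, not taken from the paper.
import torch
import torch.nn.functional as F

def self_distillation_loss(aux_logits_list, deep_logits, labels, T=3.0, alpha=0.5):
    """Hard-label loss for every classifier + KL toward the deepest classifier."""
    # The deepest classifier is trained on the labels and acts as the teacher.
    loss = F.cross_entropy(deep_logits, labels)
    # Softened teacher distribution; detach so no gradient flows back into the
    # teacher through the distillation term.
    teacher_soft = F.softmax(deep_logits.detach() / T, dim=1)
    for aux_logits in aux_logits_list:
        ce = F.cross_entropy(aux_logits, labels)              # supervision from labels
        kd = F.kl_div(F.log_softmax(aux_logits / T, dim=1),   # supervision from the teacher
                      teacher_soft, reduction="batchmean") * (T * T)
        loss = loss + (1.0 - alpha) * ce + alpha * kd
    return loss

# Toy usage: 3 auxiliary branches, 10 classes, batch size 8.
aux = [torch.randn(8, 10) for _ in range(3)]
deep = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
print(self_distillation_loss(aux, deep, labels))
```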
Pages: 9396-9408
Page count: 13
Related Papers
50 records in total
  • [41] CLASS-AWARE REGULARIZED SELF-DISTILLATION LEARNING METHOD FOR LAND COVER CLASSIFICATION
    Zang, Qi
    2022 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2022), 2022, : 4603 - 4606
  • [43] Embedded Self-Distillation in Compact Multibranch Ensemble Network for Remote Sensing Scene Classification
    Zhao, Qi
    Ma, Yujing
    Lyu, Shuchang
    Chen, Lijiang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [44] Towards Elastic Image Super-Resolution Network via Progressive Self-distillation
    Yu, Xin'an
    Zhang, Dongyang
    Liu, Cencen
    Dong, Qiang
    Duan, Guiduo
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT VIII, 2025, 15038 : 137 - 150
  • [45] Understanding Self-Distillation in the Presence of Label Noise
    Das, Rudrajit
    Sanghavi, Sujay
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [46] Towards Compact Single Image Super-Resolution via Contrastive Self-distillation
    Wang, Yanbo
    Lin, Shaohui
    Qu, Yanyun
    Wu, Haiyan
    Zhang, Zhizhong
    Xie, Yuan
    Yao, Angela
    PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 1122 - 1128
  • [47] Weakly Supervised Object Detection Based on Feature Self-Distillation Mechanism
    Gao Wenlong
    Chen Ying
    Peng Yong
    LASER & OPTOELECTRONICS PROGRESS, 2023, 60 (04)
  • [48] Retinal vessel segmentation based on self-distillation and implicit neural representation
    Gu, Jia
    Tian, Fangzheng
    Oh, Il-Seok
    APPLIED INTELLIGENCE, 2023, 53 (12) : 15027 - 15044
  • [49] Simultaneous Similarity-based Self-Distillation for Deep Metric Learning
    Roth, Karsten
    Milbich, Timo
    Ommer, Bjorn
    Cohen, Joseph Paul
    Ghassemi, Marzyeh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [50] Self-Distillation for Improving CTC-Transformer-based ASR Systems
    Moriya, Takafumi
    Ochiai, Tsubasa
    Karita, Shigeki
    Sato, Hiroshi
    Tanaka, Tomohiro
    Ashihara, Takanori
    Masumura, Ryo
    Shinohara, Yusuke
    Delcroix, Marc
    INTERSPEECH 2020, 2020, : 546 - 550