Image classification based on self-distillation

Cited: 2
Authors
Li, Yuting [1 ]
Qing, Linbo [1 ]
He, Xiaohai [1 ]
Chen, Honggang [1 ]
Liu, Qiang [1 ]
Affiliations
[1] Sichuan Univ, Coll Elect & Informat Engn, 24 South Sect 1,Yihuan Rd, Chengdu 610065, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Image classification; Self-distillation; Attention; FUSION;
DOI
10.1007/s10489-022-04008-y
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional neural networks have been widely used in various application scenarios. To extend their use to areas where accuracy is critical, researchers have investigated deeper or wider network structures, which leads to exponential growth in computation and storage costs and longer response times. In this paper, we propose a self-distillation image classification algorithm that significantly improves performance while decreasing training costs. In traditional self-distillation, the student model lacks the guidance of a teacher model and therefore struggles to acquire global information and to focus on key features. To address this, we improve the traditional self-distillation algorithm with a positional attention module and a residual block with attention. Experimental results show that the method achieves better performance than traditional knowledge distillation methods and attention networks.
Pages: 9396-9408
Page count: 13
Related papers
50 in total
  • [31] Adaptive Similarity Bootstrapping for Self-Distillation based Representation Learning
    Lebailly, Tim
    Stegmueller, Thomas
    Bozorgtabar, Behzad
    Thiran, Jean-Philippe
    Tuytelaars, Tinne
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 16459 - 16468
  • [32] Dynamic image super-resolution via progressive contrastive self-distillation
    Zhang, Zhizhong
    Xie, Yuan
    Zhang, Chong
    Wang, Yanbo
    Qu, Yanyun
    Lin, Shaohui
    Ma, Lizhuang
    Tian, Qi
    PATTERN RECOGNITION, 2024, 153
  • [33] SKZC: self-distillation and k-nearest neighbor-based zero-shot classification
    Sun, Muyang
    Jia, Haitao
    Journal of Engineering and Applied Science, 2024, 71 (01):
  • [34] MaskCLIP: Masked Self-Distillation Advances Contrastive Language-Image Pretraining
    Dong, Xiaoyi
    Bao, Jianmin
    Zheng, Yinglin
    Zhang, Ting
    Chen, Dongdong
    Yang, Hao
    Zeng, Ming
    Zhang, Weiming
    Yuan, Lu
    Chen, Dong
    Wen, Fang
    Yu, Nenghai
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 10995 - 11005
  • [35] LaST: Label-Free Self-Distillation Contrastive Learning With Transformer Architecture for Remote Sensing Image Scene Classification
    Wang, Xuying
    Zhu, Jiawei
    Yan, Zhengliang
    Zhang, Zhaoyang
    Zhang, Yunsheng
    Chen, Yansheng
    Li, Haifeng
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19
  • [36] Physical-model guided self-distillation network for single image dehazing
    Lan, Yunwei
    Cui, Zhigao
    Su, Yanzhao
    Wang, Nian
    Li, Aihua
    Han, Deshuai
    FRONTIERS IN NEUROROBOTICS, 2022, 16
  • [37] Positional normalization-based mixed-image data augmentation and ensemble self-distillation algorithm
    Chen, Wenjie
    Hu, Yunbing
    Peng, Min
    Zhu, Bowen
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 252
  • [38] Self-Distillation for Randomized Neural Networks
    Hu, Minghui
    Gao, Ruobin
    Suganthan, Ponnuthurai Nagaratnam
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 35 (11) : 1 - 10
  • [39] Negative Instance Guided Self-Distillation Framework for Whole Slide Image Analysis
    Luo, Xiaoyuan
    Qu, Linhao
    Guo, Qinhao
    Song, Zhijian
    Wang, Manning
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2024, 28 (02) : 964 - 975
  • [40] Learn by Yourself: A Feature-Augmented Self-Distillation Convolutional Neural Network for Remote Sensing Scene Image Classification
    Shi, Cuiping
    Ding, Mengxiang
    Wang, Liguo
    Pan, Haizhu
    REMOTE SENSING, 2023, 15 (23)