ITERATIVE SELF KNOWLEDGE DISTILLATION - FROM POTHOLE CLASSIFICATION TO FINE-GRAINED AND COVID RECOGNITION

Cited by: 1
Authors
Peng, Kuan-Chuan [1 ]
Affiliation
[1] Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA 02139 USA
Keywords
Teacher-free knowledge distillation; iterative self knowledge distillation
DOI
10.1109/ICASSP43922.2022.9746470
Chinese Library Classification
O42 [Acoustics]
Discipline codes
070206; 082403
Abstract
Pothole classification has become an important task for road inspection vehicles, saving drivers from potential car accidents and repair bills. Given limited computational power and a fixed number of training epochs, we propose iterative self knowledge distillation (ISKD) to train lightweight pothole classifiers. Designed to improve both the teacher and student models over time, ISKD outperforms the state-of-the-art self knowledge distillation method on three pothole classification datasets across four lightweight network architectures, which supports that self knowledge distillation should be done iteratively instead of just once. The accuracy relation between the teacher and student models shows that the student model can still benefit from a moderately trained teacher model. Our results imply that better teacher models generally produce better student models, which justifies the design of ISKD. Beyond pothole classification, we also demonstrate the efficacy of ISKD on six additional datasets covering generic classification, fine-grained classification, and a medical imaging application, which supports that ISKD can serve as a general-purpose performance booster without the need for a given teacher model or extra trainable parameters.
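The abstract describes the core loop of ISKD: train a student with self knowledge distillation, then let that student serve as the teacher for the next round, repeating for several rounds. Below is a minimal numpy sketch of that loop on a toy linear softmax classifier; the loss weighting `alpha`, temperature `T`, learning rate, round count, and the "student of round k becomes teacher of round k+1" rule are illustrative assumptions for exposition, not the exact schedule or models from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax with numerical stabilization."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_round(X, y, W, W_teacher, alpha=0.5, T=2.0, lr=0.5, epochs=200):
    """One distillation round: hard-label cross-entropy plus a soft-label
    term pulling the student toward the teacher's temperature-softened outputs."""
    n = X.shape[0]
    Y = np.eye(W.shape[1])[y]                  # one-hot hard labels
    for _ in range(epochs):
        p = softmax(X @ W)                     # student predictions
        grad_ce = X.T @ (p - Y) / n            # cross-entropy gradient
        if W_teacher is None:
            grad = grad_ce                     # round 0: no teacher yet
        else:
            q = softmax(X @ W_teacher, T)      # teacher soft targets
            p_T = softmax(X @ W, T)            # student at temperature T
            grad_kd = X.T @ (p_T - q) / n      # soft-label (KL) gradient
            grad = (1 - alpha) * grad_ce + alpha * T * grad_kd
        W = W - lr * grad
    return W

# Toy linearly separable task: class = sign(x0 + x1).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

W = np.zeros((2, 2))
teacher = None
for k in range(3):                             # ISKD: repeat distillation
    W = train_round(X, y, W, teacher)
    teacher = W.copy()                         # student becomes next teacher

acc = (softmax(X @ W).argmax(axis=1) == y).mean()
print(f"accuracy after 3 ISKD rounds: {acc:.2f}")
```

Because the teacher of each round is just the previous round's student, the procedure needs no pretrained teacher model and adds no trainable parameters, matching the "teacher-free" property the keywords highlight.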
Pages: 3139 - 3143
Page count: 5
Related papers (50 in total; entries [31] - [40] shown)
  • [31] Research on plant seeds recognition based on fine-grained image classification
    Yuan, Min
    Dong, Yongkang
    Lu, Fuxiang
    Zhan, Kun
    Zhu, Liye
    Shen, Jiacheng
    Ren, Dingbang
    Hu, Xiaowen
    Lv, Ningning
    JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (05)
  • [32] Transformer with peak suppression and knowledge guidance for fine-grained image recognition
    Liu, Xinda
    Wang, Lili
    Han, Xiaoguang
    NEUROCOMPUTING, 2022, 492 : 137 - 149
  • [33] A Streamlined Attention Mechanism for Image Classification and Fine-Grained Visual Recognition
    Dakshayani Himabindu, D.
    Praveen Kumar, S.
    Brno University of Technology (27): : 59 - 67
  • [34] Deep LAC: Deep Localization, Alignment and Classification for Fine-grained Recognition
    Lin, Di
    Shen, Xiaoyong
    Lu, Cewu
    Jia, Jiaya
    2015 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2015, : 1666 - 1674
  • [35] Hierarchical classification based on coarse- to fine-grained knowledge transfer
    Qiu, Zeyu
    Hu, Minjie
    Zhao, Hong
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2022, 149 : 61 - 69
  • [36] Hierarchical Fine-Grained Visual Classification Leveraging Consistent Hierarchical Knowledge
    Liu, Yuting
    Yang, Liu
    Wang, Yu
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, PT I, ECML PKDD 2024, 2024, 14941 : 279 - 295
  • [37] EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation
    Zhang, Cheng
    Liu, Chunqing
    Gong, Huimin
    Teng, Jinlin
    PLOS ONE, 2024, 19 (02)
  • [38] FINE-GRAINED AND LAYERED OBJECT RECOGNITION
    Wu, Yang
    Zheng, Nanning
    Liu, Yuanliu
    Yuan, Zejian
    INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, 2012, 26 (02)
  • [39] SELECTIVE PARTS FOR FINE-GRAINED RECOGNITION
    Li, Dong
    Li, Yali
    Wang, Shengjin
    2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015, : 922 - 926
  • [40] Video Pose Distillation for Few-Shot, Fine-Grained Sports Action Recognition
    Hong, James
    Fisher, Matthew
    Gharbi, Michael
    Fatahalian, Kayvon
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 9234 - 9243