ITERATIVE SELF KNOWLEDGE DISTILLATION - FROM POTHOLE CLASSIFICATION TO FINE-GRAINED AND COVID RECOGNITION

Cited by: 1
Author(s):
Peng, Kuan-Chuan [1]
Affiliation(s):
[1] Mitsubishi Electric Research Laboratories (MERL), Cambridge, MA 02139 USA
Keywords:
Teacher-free knowledge distillation; iterative self knowledge distillation
DOI:
10.1109/ICASSP43922.2022.9746470
Chinese Library Classification: O42 [Acoustics]
Subject classification codes: 070206; 082403
Abstract:
Pothole classification has become an important task for road inspection vehicles, saving drivers from potential car accidents and repair bills. Given limited computational power and a fixed number of training epochs, we propose iterative self knowledge distillation (ISKD) to train lightweight pothole classifiers. Designed to improve both the teacher and student models over time, ISKD outperforms the state-of-the-art self knowledge distillation method on three pothole classification datasets across four lightweight network architectures, which supports performing self knowledge distillation iteratively instead of just once. The accuracy relation between the teacher and student models shows that the student model can still benefit from a moderately trained teacher model, and the finding that better teacher models generally produce better student models justifies the design of ISKD. Beyond pothole classification, we also demonstrate the efficacy of ISKD on six additional datasets covering generic classification, fine-grained classification, and a medical imaging application, which supports that ISKD can serve as a general-purpose performance booster without requiring a given teacher model or extra trainable parameters.
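The abstract describes ISKD only at a high level. As a concrete illustration, below is a minimal PyTorch sketch of a generic iterative self-distillation loop in which each round's trained student is frozen and reused as the next round's teacher, assuming the standard softened-KL distillation loss. All names (`distill_one_round`, `iskd`, `make_model`) and hyperparameters (temperature `T`, mixing weight `alpha`, round and epoch counts) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of iterative self knowledge distillation, assuming the
# standard softened-KL distillation loss. Not the paper's exact recipe.
import copy
import torch
import torch.nn.functional as F

def distill_one_round(student, teacher, loader, epochs,
                      T=4.0, alpha=0.5, lr=1e-3, device="cpu"):
    """Train `student` on ground truth plus, if `teacher` is given,
    a distillation term against the frozen teacher's soft targets."""
    student.to(device).train()
    if teacher is not None:
        teacher.to(device).eval()
    opt = torch.optim.SGD(student.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            logits = student(x)
            loss = F.cross_entropy(logits, y)  # supervised term
            if teacher is not None:
                with torch.no_grad():
                    t_logits = teacher(x)
                # softened KL term, scaled by T^2 as is conventional
                kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                              F.softmax(t_logits / T, dim=1),
                              reduction="batchmean") * T * T
                loss = (1 - alpha) * loss + alpha * kd
            opt.zero_grad()
            loss.backward()
            opt.step()
    return student

def iskd(make_model, loader, n_rounds=3, epochs_per_round=10, device="cpu"):
    """Iterate self-distillation: each round's student becomes the
    next round's teacher, so no pre-trained teacher is ever required."""
    teacher = None  # round 0 is plain supervised training
    for _ in range(n_rounds):
        student = make_model()  # fresh lightweight model each round
        student = distill_one_round(student, teacher, loader,
                                    epochs_per_round, device=device)
        teacher = copy.deepcopy(student).eval()  # freeze as next teacher
    return teacher
```

For instance, `make_model` could be `lambda: torchvision.models.mobilenet_v3_small(num_classes=2)` for a lightweight binary pothole classifier. Whether each round restarts from fresh weights (as sketched here) or continues training the previous model's weights is a design choice; the paper's fixed training-epoch budget would constrain how such rounds are scheduled.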
Pages: 3139-3143
Page count: 5
Related Papers (50 records in total; first 10 shown):
  • [1] Deep Ensemble Learning by Diverse Knowledge Distillation for Fine-Grained Object Classification
    Okamoto, Naoki
    Hirakawa, Tsubasa
    Yamashita, Takayoshi
    Fujiyoshi, Hironobu
    COMPUTER VISION, ECCV 2022, PT XI, 2022, 13671 : 502 - 518
  • [2] A Fine-Grained Bird Classification Method Based on Attention and Decoupled Knowledge Distillation
    Wang, Kang
    Yang, Feng
    Chen, Zhibo
    Chen, Yixin
    Zhang, Ying
ANIMALS, 2023, 13 (02)
  • [3] WebChild 2.0: Fine-Grained Commonsense Knowledge Distillation
    Tandon, Niket
    de Melo, Gerard
    Weikum, Gerhard
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017): SYSTEM DEMONSTRATIONS, 2017, : 115 - 120
  • [4] Fine-Grained Crowdsourcing for Fine-Grained Recognition
Deng, Jia
Krause, Jonathan
Fei-Fei, Li
    2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2013, : 580 - 587
  • [5] Towards Fine-Grained Recognition: Joint Learning for Object Detection and Fine-Grained Classification
    Wang, Qiaosong
    Rasmussen, Christopher
    ADVANCES IN VISUAL COMPUTING, ISVC 2019, PT II, 2019, 11845 : 332 - 344
  • [6] CROSS-MODAL KNOWLEDGE DISTILLATION FOR FINE-GRAINED ONE-SHOT CLASSIFICATION
    Zhao, Jiabao
    Lin, Xin
    Yang, Yifan
    Yang, Jing
    He, Liang
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 4295 - 4299
  • [7] ITERATIVE OBJECT AND PART TRANSFER FOR FINE-GRAINED RECOGNITION
    Shen, Zhiqiang
    Jiang, Yu-Gang
    Wang, Dequan
    Xue, Xiangyang
    2017 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2017, : 1470 - 1475
  • [8] Fine-Grained Argument Unit Recognition and Classification
    Trautmann, Dietrich
    Daxenberger, Johannes
    Stab, Christian
    Schuetze, Hinrich
    Gurevych, Iryna
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 9048 - 9056
  • [9] Multi-Stage Training with Multi-Level Knowledge Self-Distillation for Fine-Grained Image Recognition
    Yu Y.
    Wei W.
    Tang H.
    Qian J.
Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (08): 1834 - 1845
  • [10] Distill-AER: Fine-Grained Address Entity Recognition from Spoken Dialogue via Knowledge Distillation
    Wang, Yitong
    Han, Xue
    Zhou, Feng
    Wang, Yiting
    Deng, Chao
    Feng, Junlan
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT I, 2022, 13551 : 643 - 655