Improving Deep Mutual Learning via Knowledge Distillation

Cited by: 2
|
Authors
Lukman, Achmad [1 ]
Yang, Chuan-Kai [1 ]
Affiliations
[1] Natl Taiwan Univ Sci & Technol, Sch Management, Dept Informat Management, Taipei 106335, Taiwan
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 15
Keywords
image classification; knowledge distillation; mutual learning; convolutional neural network;
DOI
10.3390/app12157916
Abstract
Knowledge transfer has become very popular in recent years. It is based either on a one-way transfer, as in knowledge distillation, or on a two-way transfer, as in deep mutual learning; both adopt a teacher-student paradigm. A one-way method is simpler and more compact because it involves only an untrained low-capacity student and a high-capacity teacher network in the knowledge transfer process. In contrast, a two-way method incurs higher training costs because it trains two or more low-capacity networks from scratch simultaneously so that each network achieves better accuracy. In this paper, we propose two new approaches, namely full deep distillation mutual learning (FDDML) and half deep distillation mutual learning (HDDML), to improve convolutional neural network performance. These approaches combine three losses and use variations of existing network architectures, and experiments have been conducted on three public benchmark datasets. We evaluate our method against several existing knowledge transfer (KT) methods, showing that it outperforms related approaches.
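The abstract states that both FDDML and HDDML work with three losses that combine distillation with mutual learning. As a minimal sketch only — the specific loss terms, the weights `alpha` and `beta`, and the temperature `T` below are illustrative assumptions, not the paper's exact formulation — a per-network objective mixing cross-entropy with the ground truth, a mutual-learning KL term toward a peer network, and a distillation KL term toward a teacher could look like:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q), averaged over the batch.
    return float(np.mean(np.sum(p * np.log((p + eps) / (q + eps)), axis=-1)))

def mutual_distillation_loss(logits_a, logits_b, teacher_logits, labels,
                             T=3.0, alpha=1.0, beta=1.0):
    """Hypothetical three-term objective for network A:
    (1) cross-entropy with the ground-truth labels,
    (2) mutual-learning KL toward peer network B,
    (3) distillation KL toward the teacher at temperature T
        (scaled by T^2, as is conventional in KD)."""
    probs_a = softmax(logits_a)
    ce = float(np.mean(-np.log(probs_a[np.arange(len(labels)), labels] + 1e-12)))
    mutual = kl_div(softmax(logits_b), probs_a)
    kd = (T ** 2) * kl_div(softmax(teacher_logits, T), softmax(logits_a, T))
    return ce + alpha * mutual + beta * kd
```

In a two-way setup, network B would minimize the symmetric counterpart of this loss, with the roles of `logits_a` and `logits_b` swapped; when the peer and teacher agree with network A, the mutual and distillation terms vanish and only the cross-entropy term remains.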
Pages: 16
Related Articles
50 records
  • [11] Improving Knowledge Distillation via Head and Tail Categories
    Xu, Liuchi
    Ren, Jin
    Huang, Zhenhua
    Zheng, Weishi
    Chen, Yunwen
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (05) : 3465 - 3480
  • [12] Performance-Aware Mutual Knowledge Distillation for Improving Neural Architecture Search
    Xie, Pengtao
    Du, Xuefeng
    [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 11912 - 11922
  • [13] MDFlow: Unsupervised Optical Flow Learning by Reliable Mutual Knowledge Distillation
    Kong, Lingtong
    Yang, Jie
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (02) : 677 - 688
  • [14] IMPROVING IQA PERFORMANCE BASED ON DEEP MUTUAL LEARNING
    Yue, Guanghui
    Cheng, Di
    Wu, Honglv
    Jiang, Qiuping
    Wang, Tianfu
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 2182 - 2186
  • [15] Improving deep metric learning via self-distillation and online batch diffusion process
    Zeng, Zelong
    Yang, Fan
    Liu, Hong
    Satoh, Shin'ichi
    [J]. VISUAL INTELLIGENCE, 2 (1)
  • [16] Training Deep Face Recognition for Efficient Inference by Distillation and Mutual Learning
    Shen, Guodong
    Shen, Yao
    RiaZ, M. Naveed
    [J]. PROCEEDINGS OF THE 2018 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC), 2018, : 38 - 43
  • [17] Efficient pavement distress classification via deep patch soft selective learning and knowledge distillation
    Zhang, Shizheng
    Tang, Wenhao
    Wang, Jing
    Huang, Sheng
    [J]. ELECTRONICS LETTERS, 2022, 58 (18) : 693 - 695
  • [19] Screening obstructive sleep apnea patients via deep learning of knowledge distillation in the lateral cephalogram
    Kim, Min-Jung
    Jeong, Jiheon
    Lee, Jung-Wook
    Kim, In-Hwan
    Park, Jae-Woo
    Roh, Jae-Yon
    Kim, Namkug
    Kim, Su-Jung
    [J]. SCIENTIFIC REPORTS, 2023, 13 (01)
  • [20] Federated Learning via Augmented Knowledge Distillation for Heterogenous Deep Human Activity Recognition Systems
    Gad, Gad
    Fadlullah, Zubair
    [J]. SENSORS, 2023, 23 (01)