Shared Knowledge Distillation Network for Object Detection

Cited by: 0
Authors
Guo, Zhen [1 ,2 ]
Zhang, Pengzhou [1 ]
Liang, Peng [2 ]
Affiliations
[1] Commun Univ China, State Key Lab Media Convergence & Commun, Beijing 100024, Peoples R China
[2] China Unicom Smart City Res Inst, Beijing 100048, Peoples R China
Keywords
shared knowledge network; knowledge distillation; object detection; cross-layer distillation;
DOI
10.3390/electronics13081595
Chinese Library Classification: TP [Automation Technology, Computer Technology]
Discipline Classification Code: 0812
Abstract
Object detection based on knowledge distillation can enhance the capabilities and performance of 5G and 6G networks in domains such as autonomous vehicles, smart surveillance, and augmented reality, and the integration of object detection with knowledge distillation techniques is expected to play a pivotal role in realizing the full potential of these networks. This study presents Shared Knowledge Distillation (Shared-KD), a solution to the optimization challenges caused by disparities in cross-layer features between teacher and student networks: the large gaps in intermediate-level features between teacher and student are a considerable obstacle to effective distillation. To tackle this issue, we draw inspiration from collaborative learning in real-world education, where teachers prepare lessons together and students engage in peer learning. Building on this concept, our contributions to model construction are as follows: (1) a teacher knowledge augmentation module, which combines lower-level teacher features to facilitate knowledge transfer from the teacher to the student; (2) a student mutual learning module, which enables students to learn from each other, mimicking peer learning in collaborative education; (3) a teacher share module, which realizes the teacher knowledge augmentation by combining the lower-level teacher features; and (4) a multi-step transfer process, which breaks knowledge transfer into multiple steps, each of which is easy to optimize because the gap between the features involved in each step is minimal.
Shared-KD uses simple feature losses without additional transformation weights, yielding an efficient distillation process that can easily be combined with other methods for further improvement. The effectiveness of our approach is validated through experiments on popular tasks such as object detection and instance segmentation.
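The abstract's three ingredients (teacher feature combination, simple feature losses, and student mutual learning) can be illustrated with a minimal sketch. The code below is a hypothetical reconstruction, not the authors' implementation: feature maps are plain NumPy arrays, the "combination" of lower-level teacher features is a simple average placeholder, and the function name `shared_kd_loss` and weights `alpha`/`beta` are assumptions for illustration only.

```python
import numpy as np

def feature_mse(a, b):
    # Plain feature-imitation loss: mean squared error between two
    # already shape-matched feature maps (no extra transformation weights).
    return float(np.mean((a - b) ** 2))

def shared_kd_loss(teacher_feats, student_feats, peer_feats,
                   alpha=1.0, beta=0.5):
    """Hypothetical sketch of a Shared-KD-style objective.

    teacher_feats: list of lower-level teacher feature maps, combined
                   here by a simple average into one shared target.
    student_feats: the student feature map distilled toward the teacher.
    peer_feats:    a second student's feature map for mutual learning.
    """
    # (1) Teacher knowledge augmentation: combine lower-level teacher
    #     features into a single shared target (average as placeholder).
    shared_teacher = np.mean(np.stack(teacher_feats), axis=0)
    # (2) Teacher-to-student transfer with a simple feature loss.
    kd_term = feature_mse(shared_teacher, student_feats)
    # (3) Student mutual learning: peer students imitate each other.
    mutual_term = feature_mse(student_feats, peer_feats)
    return alpha * kd_term + beta * mutual_term
```

The loss is zero exactly when the student matches the combined teacher target and its peer matches it in turn, which mirrors the multi-step transfer idea: each step only has to close a small feature gap.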
Pages: 13