Shared Knowledge Distillation Network for Object Detection

Cited by: 0
Authors
Guo, Zhen [1 ,2 ]
Zhang, Pengzhou [1 ]
Liang, Peng [2 ]
Affiliations
[1] Commun Univ China, State Key Lab Media Convergence & Commun, Beijing 100024, Peoples R China
[2] China Unicom Smart City Res Inst, Beijing 100048, Peoples R China
Keywords
shared knowledge network; knowledge distillation; object detection; cross-layer distillation;
DOI
10.3390/electronics13081595
CLC Number
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
Object detection based on knowledge distillation can enhance the capabilities and performance of 5G and 6G networks in domains such as autonomous vehicles, smart surveillance, and augmented reality, and the integration of object detection with knowledge distillation techniques is expected to play a pivotal role in realizing the full potential of these networks. This study presents Shared Knowledge Distillation (Shared-KD) as a solution to the optimization challenges caused by disparities in cross-layer features between teacher and student networks: the significant gaps in intermediate-level features between teacher and student are a considerable obstacle to effective distillation. To tackle this issue, we draw inspiration from collaborative learning in real-world education, where teachers work together to prepare lessons and students engage in peer learning. Building on this concept, our contributions to model construction are as follows: (1) a teacher knowledge augmentation module, which combines lower-level teacher features to facilitate knowledge transfer from teacher to student; (2) a student mutual learning module, which enables students to learn from each other, mimicking the peer-learning concept in collaborative education; (3) a teacher share module, which realizes the knowledge augmentation by fusing the lower-level teacher features into a shared representation; and (4) a multi-step transfer process, which breaks knowledge transfer into stages that are easy to optimize because the gap between the features involved in each step is minimal.
Shared-KD uses simple feature losses without additional transformation weights, yielding an efficient distillation process that can be easily combined with other methods for further improvement. The effectiveness of our approach is validated through experiments on popular tasks such as object detection and instance segmentation.
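The abstract describes fusing lower-level teacher features and matching them to student features with plain feature losses. The shape of that idea can be sketched as follows; this is a minimal NumPy illustration with hypothetical function names, not the paper's implementation (the actual Shared-KD modules are learned network components, whereas here fusion is a fixed average and the loss is plain MSE).

```python
import numpy as np

def fuse_teacher_features(feats):
    # Stand-in for the "teacher share" / knowledge-augmentation idea:
    # combine several lower-level teacher feature maps of the same
    # shape into one shared target by simple averaging.
    return np.mean(np.stack(feats, axis=0), axis=0)

def feature_distillation_loss(student_feat, teacher_feats):
    # "Simple feature loss without additional transformation weights":
    # plain MSE between the student feature and the fused teacher target.
    fused = fuse_teacher_features(teacher_feats)
    return float(np.mean((student_feat - fused) ** 2))

rng = np.random.default_rng(0)
t1 = rng.standard_normal((256, 8, 8))  # lower-level teacher feature map
t2 = rng.standard_normal((256, 8, 8))  # another teacher feature map
s = rng.standard_normal((256, 8, 8))   # student feature map
loss = feature_distillation_loss(s, [t1, t2])
print(loss)
```

In the paper's multi-step formulation, a loss of this form would be applied at each stage, so that every individual gap being bridged stays small.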
Pages: 13