Knowledge Distillation based Compact Model Learning Method for Object Detection

Cited by: 0
Authors
Ko, Jong Gook [1 ]
Yoo, Wonyoung [1 ]
Affiliation
[1] ETRI Electron Telecommun Res Inst, Content Res Div, Commun & Media Res Lab, Daejeon, South Korea
Keywords
object detection; knowledge distillation; lightweight deep learning model;
DOI
10.1109/ictc49870.2020.9289463
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Recently, video analysis technology based on deep learning has been developing at a very rapid pace, and most of this technology has targeted improving recognition performance in server environments. However, beyond video analysis in the existing server environment, demand for object detection in visual image analysis has recently been increasing on low-specification embedded boards and in mobile environments such as smartphones, drones, and industrial boards. Despite significant improvements in the accuracy of existing object detectors, image processing for real-time applications often requires substantial runtime. Therefore, many studies are being conducted on lightweight object detection, and knowledge distillation is one of the solutions. Approaches such as model compression use fewer parameters, but their accuracy is significantly reduced. In this paper, we propose a method to improve the performance of lightweight MobileNet-SSD models in object detection by using knowledge transfer. We evaluate the method on the PASCAL VOC dataset, and our results show improved detection accuracy.
Pages: 1276-1278
Number of pages: 3
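The record above gives no implementation details, so the following is only a minimal, generic sketch of how knowledge distillation is typically applied to a detector's classification head when training a lightweight student such as MobileNet-SSD against a larger teacher. The function and parameter names (distillation_loss, temperature, alpha) are illustrative assumptions and are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): a generic knowledge-distillation
# loss combining the usual hard-label loss with a soft-label term derived from
# a teacher detector's classification logits.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature=2.0, alpha=0.5):
    """student_logits/teacher_logits: (N, num_classes) per-anchor class scores;
    targets: (N,) ground-truth class indices. All names are illustrative."""
    # Hard-label loss against the ground-truth classes.
    hard = F.cross_entropy(student_logits, targets)
    # Soft-label loss: KL divergence between temperature-softened distributions
    # of student and teacher, scaled by T^2 as is standard for distillation.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Weighted combination; alpha balances ground truth vs. teacher guidance.
    return alpha * hard + (1.0 - alpha) * soft
```

In an SSD-style detector this term would typically be computed over matched anchors and added to the standard localization loss; consult the published paper for the authors' exact formulation and weighting.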