Learning Efficient Object Detection Models with Knowledge Distillation

Cited: 0
Authors
Chen, Guobin [1 ,2 ]
Choi, Wongun [1 ]
Yu, Xiang [1 ]
Han, Tony [2 ]
Chandraker, Manmohan [1 ,3 ]
Affiliations
[1] NEC Labs Amer, Princeton, NJ 08540 USA
[2] Univ Missouri, Columbia, MO 65211 USA
[3] Univ Calif San Diego, La Jolla, CA 92093 USA
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Despite significant accuracy improvements in convolutional neural network (CNN) based object detectors, they often require prohibitive runtimes to process an image for real-time applications. State-of-the-art models often use very deep networks with a large number of floating-point operations. Efforts such as model compression learn compact models with fewer parameters, but at much reduced accuracy. In this work, we propose a new framework to learn compact and fast object detection networks with improved accuracy using knowledge distillation [20] and hint learning [34]. Although knowledge distillation has demonstrated excellent improvements for simpler classification setups, the complexity of detection poses new challenges in the form of regression, region proposals and less voluminous labels. We address this through several innovations, such as a weighted cross-entropy loss to address class imbalance, a teacher bounded loss to handle the regression component, and adaptation layers to better learn from intermediate teacher distributions. We conduct a comprehensive empirical evaluation with different distillation configurations over multiple datasets, including PASCAL, KITTI, ILSVRC and MS-COCO. Our results show consistent improvements in accuracy-speed trade-offs for modern multi-class detection models.
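The two detection-specific losses named in the abstract can be made concrete. Below is a minimal PyTorch-style sketch of a weighted cross-entropy distillation loss and a teacher bounded regression loss; the tensor shapes, default weight values and margin are illustrative assumptions for exposition, not the authors' exact implementation.

    import torch
    import torch.nn.functional as F

    def weighted_kd_loss(student_logits, teacher_logits,
                         bg_weight=1.5, fg_weight=1.0, T=1.0):
        """Cross entropy between teacher and student class distributions,
        with per-class weights to counter the foreground/background
        imbalance typical of detection (weight values are illustrative)."""
        p_t = F.softmax(teacher_logits / T, dim=1)          # soft teacher targets
        log_p_s = F.log_softmax(student_logits / T, dim=1)  # student log-probs
        w = torch.full((student_logits.size(1),), fg_weight,
                       device=student_logits.device)
        w[0] = bg_weight  # index 0 assumed to be the background class
        return -(w * p_t * log_p_s).sum(dim=1).mean()

    def teacher_bounded_regression_loss(student_reg, teacher_reg, target,
                                        margin=0.1):
        """Penalize the student's box regression only where its squared error
        exceeds the teacher's by more than a margin; zero otherwise."""
        err_s = ((student_reg - target) ** 2).sum(dim=1)
        err_t = ((teacher_reg - target) ** 2).sum(dim=1)
        active = (err_s + margin > err_t).float()  # student lags teacher here
        return (active * err_s).mean()

A complete objective would add the usual hard-label classification and smooth-L1 regression losses on ground truth and, for hint learning, an L2 loss between teacher feature maps and student features passed through an adaptation layer that matches their dimensions.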
Pages: 10
Related Papers
50 items in total
  • [1] Structured Knowledge Distillation for Accurate and Efficient Object Detection. Zhang, Linfeng; Ma, Kaisheng. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(12): 15706-15724.
  • [2] Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang, Hao; Jia, Tong; Wang, Qilong; Zuo, Wangmeng. IEEE Transactions on Image Processing, 2024, 33: 4796-4810.
  • [3] Towards Efficient 3D Object Detection with Knowledge Distillation. Yang, Jihan; Shi, Shaoshuai; Ding, Runyu; Wang, Zhe; Qi, Xiaojuan. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
  • [4] Hybrid Deep Learning Vision-based Models for Human Object Interaction Detection by Knowledge Distillation. Moutik, Oumaima; Tigani, Smail; Saadane, Rachid; Chehri, Abdellah. Knowledge-Based and Intelligent Information & Engineering Systems (KSE 2021), 2021, 192: 5093-5103.
  • [5] Knowledge Distillation based Compact Model Learning Method for Object Detection. Ko, Jong Gook; Yoo, Wonyoung. 11th International Conference on ICT Convergence: Data, Network, and AI in the Age of Untact (ICTC 2020), 2020: 1276-1278.
  • [6] Scalability of knowledge distillation in incremental deep learning for fast object detection. Yuwono, Elizabeth Irenne; Tjondonegoro, Dian; Sorwar, Golam; Alaei, Alireza. Applied Soft Computing, 2022, 129.
  • [7] Incremental Deep Learning Method for Object Detection Model Based on Knowledge Distillation. Fang, W.; Chen, A.; Meng, N.; Cheng, H.; Wang, Q. Gongcheng Kexue Yu Jishu/Advanced Engineering Sciences, 2022, 54(06): 59-66.
  • [8] One-stage object detection knowledge distillation via adversarial learning. Dong, Na; Zhang, Yongqiang; Ding, Mingli; Xu, Shibiao; Bai, Yancheng. Applied Intelligence, 2022, 52(04): 4582-4598.
  • [9] Dual Relation Knowledge Distillation for Object Detection. Ni, Zhen-Liang; Yang, Fukui; Wen, Shengzhao; Zhang, Gang. Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI 2023), 2023: 1276-1284.