ROBUST AND ACCURATE OBJECT DETECTION VIA SELF-KNOWLEDGE DISTILLATION

Times Cited: 0
Authors
Xu, Weipeng [1 ]
Chu, Pengzhi [1 ]
Xie, Renhao [1 ]
Xiao, Xiongziyan [1 ]
Huang, Hongcheng [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
Keywords
deep learning; object detection; adversarial robustness; knowledge distillation
DOI
10.1109/ICIP46576.2022.9898031
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Object detection has achieved promising performance on clean datasets, but how to achieve a better trade-off between adversarial robustness and clean precision remains underexplored. Adversarial training is the mainstream method for improving robustness, but most such methods sacrifice clean precision relative to standard training. In this paper, we propose Unified Decoupled Feature Alignment (UDFA), a novel fine-tuning paradigm that achieves better performance than existing methods by fully exploring the combination of self-knowledge distillation and adversarial training for object detection. Extensive experiments on the PASCAL-VOC and MS-COCO benchmarks show that UDFA surpasses standard training and state-of-the-art adversarial training methods for object detection. For example, on GFLV2 with ResNet-50 on PASCAL-VOC, our approach improves clean precision by 2.2 AP over the teacher detector; compared with state-of-the-art adversarial training methods, it improves clean precision by 1.6 AP while improving adversarial robustness by 0.5 AP. Our code is available at https://github.com/grispeut/udfa.
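The abstract does not spell out the training objective, but the idea it describes (fine-tuning a detector against a frozen copy of itself while aligning clean and adversarial features) can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the toy detector, the single-step FGSM attack, the MSE placeholder losses, and the weight `alpha` are all assumptions for the sketch, and UDFA's actual decoupling of the feature-alignment branch is not reproduced here (see the linked repository for the real code).

```python
# A minimal sketch of self-knowledge distillation combined with adversarial
# training for a detector. NOT the authors' UDFA implementation; the toy
# model, FGSM attack, MSE losses, and `alpha` are illustrative assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyDetector(nn.Module):
    """Hypothetical stand-in for a real detector such as GFLV2."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, 4, 1)  # toy per-pixel box regressor

    def forward(self, x):
        return self.head(self.backbone(x))

def fgsm(model, images, targets, eps=2 / 255):
    """Craft adversarial inputs with one FGSM step on a placeholder loss."""
    images = images.clone().detach().requires_grad_(True)
    loss = F.mse_loss(model(images), targets)
    grad = torch.autograd.grad(loss, images)[0]
    return (images + eps * grad.sign()).clamp(0, 1).detach()

def self_kd_adv_loss(student, teacher, images, targets, alpha=1.0):
    """Detection loss on adversarial inputs, plus self-distillation terms
    aligning clean and adversarial student features to the frozen teacher."""
    adv = fgsm(student, images, targets)
    with torch.no_grad():
        t_feat = teacher.backbone(images)            # clean teacher features
    s_clean, s_adv = student.backbone(images), student.backbone(adv)
    align = F.mse_loss(s_clean, t_feat) + F.mse_loss(s_adv, t_feat)
    det = F.mse_loss(student.head(s_adv), targets)
    return det + alpha * align

# Usage: the teacher is a frozen copy of the pretrained student, which is
# what makes this *self*-knowledge distillation.
student = ToyDetector()
teacher = copy.deepcopy(student).eval()
for p in teacher.parameters():
    p.requires_grad_(False)
images, targets = torch.rand(2, 3, 32, 32), torch.rand(2, 4, 32, 32)
self_kd_adv_loss(student, teacher, images, targets).backward()
```

Because the teacher is the student's own pretrained weights, the alignment terms pull fine-tuned features back toward the clean-trained representation, which is the intuition behind keeping clean precision while adversarial training adds robustness.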
Pages: 91-95
Page Count: 5
Related Papers
50 entries in total
  • [1] Self-knowledge distillation via dropout
    Lee, Hyoje
    Park, Yeachan
    Seo, Hyun
    Kang, Myungjoo
    [J]. COMPUTER VISION AND IMAGE UNDERSTANDING, 2023, 233
  • [2] Spatial likelihood voting with self-knowledge distillation for weakly supervised object detection
    Chen, Ze
    Fu, Zhihang
    Huang, Jianqiang
    Tao, Mingyuan
    Jiang, Rongxin
    Tian, Xiang
    Chen, Yaowu
    Hua, Xian-Sheng
    [J]. IMAGE AND VISION COMPUTING, 2021, 116
  • [3] Self-Knowledge Distillation via Progressive Associative Learning
    Zhao, Haoran
    Bi, Yanxian
    Tian, Shuwen
    Wang, Jian
    Zhang, Peiying
    Deng, Zhaopeng
    Liu, Kai
    [J]. ELECTRONICS, 2024, 13 (11)
  • [4] Neighbor self-knowledge distillation
    Liang, Peng
    Zhang, Weiwei
    Wang, Junhuang
    Guo, Yufeng
    [J]. INFORMATION SCIENCES, 2024, 654
  • [5] Structured Knowledge Distillation for Accurate and Efficient Object Detection
    Zhang, Linfeng
    Ma, Kaisheng
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (12) : 15706 - 15724
  • [6] SELF-KNOWLEDGE DISTILLATION VIA FEATURE ENHANCEMENT FOR SPEAKER VERIFICATION
    Liu, Bei
    Wang, Haoyu
    Chen, Zhengyang
    Wang, Shuai
    Qian, Yanmin
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 7542 - 7546
  • [7] Personalized Edge Intelligence via Federated Self-Knowledge Distillation
    Jin, Hai
    Bai, Dongshan
    Yao, Dezhong
    Dai, Yutong
    Gu, Lin
    Yu, Chen
    Sun, Lichao
    [J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (02) : 567 - 580
  • [8] Automatic Diabetic Retinopathy Grading via Self-Knowledge Distillation
    Luo, Ling
    Xue, Dingyu
    Feng, Xinglong
    [J]. ELECTRONICS, 2020, 9 (09) : 1 - 13
  • [9] Assessment of accurate self-knowledge
    Vogt, DS
    Colvin, CR
    [J]. JOURNAL OF PERSONALITY ASSESSMENT, 2005, 84 (03) : 239 - 251
  • [10] Dual teachers for self-knowledge distillation
    Li, Zheng
    Li, Xiang
    Yang, Lingfeng
    Song, Renjie
    Yang, Jian
    Pan, Zhigeng
    [J]. PATTERN RECOGNITION, 2024, 151