SHUFFLECOUNT: TASK-SPECIFIC KNOWLEDGE DISTILLATION FOR CROWD COUNTING

Cited by: 5
Authors
Jiang, Minyang [1 ]
Lin, Jianzhe [1 ]
Wang, Z. Jane [1 ]
Affiliations
[1] University of British Columbia, Department of Electrical and Computer Engineering, Vancouver, BC, Canada
Keywords
DOI
10.1109/ICIP42928.2021.9506698
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Knowledge distillation is a promising way to improve the performance of a small deep network: a smaller student model, with fewer parameters and lower computational cost, can perform comparably to a larger teacher model on specific computer vision tasks. Knowledge distillation is especially attractive for high-accuracy, real-time crowd counting, where computational resources can be limited and model efficiency is extremely important. In this paper, we propose a novel task-specific knowledge distillation framework for crowd counting, named ShuffleCount. Its main contributions are two-fold. First, unlike existing frameworks, the task-specific ShuffleCount learns from the teacher network through hierarchical feature regulation, which better avoids negative knowledge transferred from the teacher. Second, the proposed student network, an optimized ShuffleNet, shows promising performance: tested on the benchmark ShanghaiTech Part A dataset, it achieves 15% higher accuracy than the state-of-the-art MobileCount while keeping a comparably low computational cost. Our code is available online at https://github.com/JiangMinyang/CC-KD.
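To make the recipe in the abstract concrete, below is a minimal PyTorch sketch of hierarchical feature distillation for a density-regression student, assuming a frozen teacher and a compact student that each expose a list of intermediate feature maps plus a predicted density map. The class name `HierarchicalDistillationLoss`, the `feat_weight` hyperparameter, and the 1x1 adapter convolutions are illustrative assumptions, not the authors' released implementation; see https://github.com/JiangMinyang/CC-KD for the official code.

```python
# Hypothetical sketch, not the paper's implementation. A frozen teacher and a
# compact student (e.g., a ShuffleNet-style counter) each return intermediate
# feature maps and a predicted density map; the loss matches features level by
# level and regresses the ground-truth density map.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalDistillationLoss(nn.Module):
    """Per-level feature matching plus pixel-wise density regression."""

    def __init__(self, student_channels, teacher_channels, feat_weight=0.5):
        super().__init__()
        # 1x1 convs project student features into the teacher's channel space
        # so the two can be compared with an L2 distance.
        self.adapters = nn.ModuleList(
            [nn.Conv2d(s_c, t_c, kernel_size=1)
             for s_c, t_c in zip(student_channels, teacher_channels)]
        )
        self.feat_weight = feat_weight

    def forward(self, s_feats, t_feats, s_density, gt_density):
        # Hierarchical feature regulation: MSE at each selected depth.
        feat_loss = 0.0
        for adapter, s_f, t_f in zip(self.adapters, s_feats, t_feats):
            s_f = adapter(s_f)
            if s_f.shape[-2:] != t_f.shape[-2:]:
                s_f = F.interpolate(s_f, size=t_f.shape[-2:],
                                    mode="bilinear", align_corners=False)
            # detach() keeps gradients out of the frozen teacher.
            feat_loss = feat_loss + F.mse_loss(s_f, t_f.detach())
        # Task loss: regress the annotated density map directly, so the
        # student stays anchored to labels rather than only to the teacher.
        task_loss = F.mse_loss(s_density, gt_density)
        return task_loss + self.feat_weight * feat_loss


if __name__ == "__main__":
    # Smoke test with random tensors standing in for real network outputs.
    criterion = HierarchicalDistillationLoss([32, 64], [64, 128])
    s_feats = [torch.randn(1, 32, 48, 48), torch.randn(1, 64, 24, 24)]
    t_feats = [torch.randn(1, 64, 48, 48), torch.randn(1, 128, 24, 24)]
    s_density = torch.randn(1, 1, 96, 96)
    gt_density = torch.randn(1, 1, 96, 96)
    print(criterion(s_feats, t_feats, s_density, gt_density).item())
```

Keeping the ground-truth density term alongside the per-level feature terms is one simple way to damp negative transfer: where the teacher's features conflict with the annotated density, the label-driven loss still anchors the student.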
Pages: 999-1003
Page count: 5