Improved Knowledge Distillation for Crowd Counting on IoT Devices

Cited by: 1
Authors
Huang, Zuo [1 ]
Sinnott, Richard O. [1 ]
Affiliation
[1] Univ Melbourne, Sch Comp & Informat Syst, Parkville, Vic, Australia
Keywords
Crowd counting; Deep learning; Knowledge distillation
DOI
10.1109/EDGE60047.2023.00041
CLC Classification
TP [Automation technology, computer technology]
Discipline Code
0812
Abstract
Manual crowd counting for real-world problems is impossible or results in wildly inaccurate estimates. Deep learning is one approach that has been applied to address this issue. Crowd counting is a computationally intensive task, so many crowd counting models employ large-scale deep convolutional neural networks (CNNs) to achieve higher accuracy. However, this typically comes at the cost of performance and inference speed, making such approaches difficult to apply in real-world settings, e.g., on Internet-of-Things (IoT) devices. One way to tackle this problem is to compress models using pruning and quantization, or to use lightweight model backbones. However, such methods often result in a significant loss of accuracy. To address this, some studies have explored knowledge distillation methods that extract useful information from large state-of-the-art (teacher) models to guide/train smaller (student) models. However, knowledge distillation suffers from information loss caused by hint-transformers, and teacher models may have a negative impact on student models. In this work, we propose a knowledge distillation method that uses self-transformed hints and loss functions that ignore outliers to tackle challenging real-world crowd counting tasks. With our approach, we achieve an MAE of 77.24 and an MSE of 276.17 on the JHU-CROWD++ [1] test set. This is comparable to state-of-the-art deep crowd counting models, but at a fraction of the original model size and complexity, making the solution suitable for IoT devices. The source code is available at https://github.com/huangzuo/effccdistilled.
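The "loss function that ignores outliers" described in the abstract can be sketched as a trimmed hint loss: average the squared student-teacher feature discrepancies after discarding the largest ones, so a few extreme teacher activations cannot dominate training. This is a minimal illustration under stated assumptions, not the authors' released implementation; the function name, the `trim_ratio` parameter, and the trimmed-mean formulation are all hypothetical (see the linked repository for the actual code).

```python
def trimmed_hint_loss(student_feat, teacher_feat, trim_ratio=0.1):
    """Mean-squared hint loss that ignores the largest errors (outliers).

    trim_ratio: fraction of elements with the biggest student-teacher
    discrepancy to drop before averaging (an assumed hyperparameter).
    """
    # Per-element squared errors between student and teacher features.
    errs = sorted((s - t) ** 2 for s, t in zip(student_feat, teacher_feat))
    # Keep only the k smallest squared errors; the rest are outliers.
    k = max(1, int(len(errs) * (1.0 - trim_ratio)))
    kept = errs[:k]
    return sum(kept) / len(kept)
```

For example, with features `[1.0, 2.0, 3.0, 100.0]` vs. `[1.1, 2.1, 3.1, 0.0]` and `trim_ratio=0.25`, the single huge discrepancy in the last element is dropped and the loss stays near 0.01, whereas a plain MSE would exceed 2500.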
Pages: 207-214
Page count: 8
Related Papers
50 records in total
  • [31] KD-Crowd: a knowledge distillation framework for learning from crowds
    Li, Shaoyuan
    Zheng, Yuxiang
    Shi, Ye
    Huang, Shengjun
    Chen, Songcan
    FRONTIERS OF COMPUTER SCIENCE, 2025, 19 (01)
  • [32] Improved Dense Crowd Counting Method based on Residual Neural Network
    Shi J.
    Zhou L.
    Lv G.
    Lin B.
    Journal of Geo-Information Science, 2021, 23 (09): : 1537 - 1547
  • [33] Leveraging angular distributions for improved knowledge distillation
    Jeon, Eun Som
    Choi, Hongjun
    Shukla, Ankita
    Turaga, Pavan
    NEUROCOMPUTING, 2023, 518 : 466 - 481
  • [34] Improved Knowledge Distillation via Teacher Assistant
    Mirzadeh, Seyed Iman
    Farajtabar, Mehrdad
    Li, Ang
    Levine, Nir
    Matsukawa, Akihiro
    Ghasemzadeh, Hassan
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5191 - 5198
  • [35] Shakey: an improved cipher for protection of IoT devices
    Sivakumar A.
    Sriwastawa A.
    Muthalagu R.
    International Journal of Information Technology, 2023, 15 (6) : 3381 - 3390
  • [36] Unsupervised Crowd Counting
    Elassal, Nada
    Elder, James H.
    COMPUTER VISION - ACCV 2016, PT V, 2017, 10115 : 329 - 345
  • [37] Counting the Crowd at a Carnival
    Pedersen, J. B.
    Markussen, J. B.
    Philipsen, M. P.
    Jensen, M. B.
    Moeslund, T. B.
    ADVANCES IN VISUAL COMPUTING (ISVC 2014), PT II, 2014, 8888 : 706 - 715
  • [38] FeDZIO: Decentralized Federated Knowledge Distillation on Edge Devices
    Palazzo, Luca
    Pennisi, Matteo
    Bellitto, Giovanni
    Kavasidis, Isaak
    IMAGE ANALYSIS AND PROCESSING - ICIAP 2023 WORKSHOPS, PT II, 2024, 14366 : 201 - 210
  • [39] Iterative Crowd Counting
    Ranjan, Viresh
    Le, Hieu
    Hoai, Minh
    COMPUTER VISION - ECCV 2018, PT VII, 2018, 11211 : 278 - 293
  • [40] A Crowd Counting Framework Combining with Crowd Location
    Zhang, Jin
    Chen, Sheng
    Tian, Sen
    Gong, Wenan
    Cai, Guoshan
    Wang, Ying
    JOURNAL OF ADVANCED TRANSPORTATION, 2021, 2021