Improved Knowledge Distillation for Crowd Counting on IoT Devices

Cited by: 1
Authors
Huang, Zuo [1 ]
Sinnott, Richard O. [1 ]
Institutions
[1] Univ Melbourne, Sch Comp & Informat Syst, Parkville, Vic, Australia
Keywords
Crowd counting; Deep learning; Knowledge distillation
DOI
10.1109/EDGE60047.2023.00041
Chinese Library Classification (CLC)
TP [Automation technology, computer technology]
Subject Classification Code
0812
Abstract
Manual crowd counting is impractical for real-world problems and typically yields wildly inaccurate estimates. Deep learning is one approach that has been applied to address this issue. Crowd counting is a computationally intensive task, and many crowd counting models therefore employ large-scale deep convolutional neural networks (CNNs) to achieve higher accuracy. However, this typically comes at the cost of performance and inference speed, making such approaches difficult to apply in real-world settings, e.g., on Internet-of-Things (IoT) devices. One way to tackle this problem is to compress models using pruning and quantization, or to use lightweight model backbones. However, such methods often result in a significant loss of accuracy. To address this, some studies have explored knowledge distillation methods that extract useful information from large state-of-the-art (teacher) models to guide/train smaller (student) models. However, knowledge distillation methods suffer from information loss caused by hint-transformers, and teacher models may have a negative impact on student models. In this work, we propose a method based on knowledge distillation that uses self-transformed hints and loss functions that ignore outliers to tackle real-world, challenging crowd counting tasks. With our approach, we achieve an MAE of 77.24 and an MSE of 276.17 on the JHU-CROWD++ [1] test set. This is comparable to state-of-the-art deep crowd counting models, but at a fraction of the original model size and complexity, making the solution suitable for IoT devices. The source code is available at https://github.com/huangzuo/effccdistilled.
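The abstract outlines the core training idea: a small student network is supervised both by ground-truth density maps and by hints from a frozen teacher, with the hint loss made robust so that unreliable teacher predictions (outliers) do not mislead the student. The following is a minimal, hedged PyTorch sketch of that idea only; the trimmed-loss formulation, the `trim_ratio` and `alpha` values, and the use of the teacher's output density map as the hint are illustrative assumptions rather than the authors' exact design (their implementation is available at the linked repository).

```python
# Sketch of outlier-robust knowledge distillation for density-map crowd counting.
# Assumptions (not from the paper): the hint is the teacher's output density map,
# and "ignoring outliers" is implemented by trimming the largest per-element errors.
import torch
import torch.nn.functional as F

def trimmed_hint_loss(student_out, teacher_out, trim_ratio=0.1):
    """MSE between student and teacher outputs, discarding the largest
    `trim_ratio` fraction of element-wise errors (treated as outliers)."""
    err = (student_out - teacher_out).pow(2).flatten()
    keep = max(1, int(err.numel() * (1.0 - trim_ratio)))
    kept, _ = torch.topk(err, keep, largest=False)  # keep only the smallest errors
    return kept.mean()

def distillation_step(student, teacher, images, gt_density, alpha=0.5):
    """One training step: ground-truth supervision plus a trimmed hint term."""
    with torch.no_grad():           # teacher is frozen during distillation
        t_density = teacher(images)
    s_density = student(images)
    sup_loss = F.mse_loss(s_density, gt_density)         # supervised density loss
    hint_loss = trimmed_hint_loss(s_density, t_density)  # outlier-robust hint loss
    return sup_loss + alpha * hint_loss
```

Trimming the largest residuals before averaging is one simple way to keep a poorly behaved teacher from dominating the student's gradient; the paper's actual hint representation and loss may differ.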
Pages: 207 - 214
Number of pages: 8
Related Papers
50 records in total
  • [41] Crowd counting in congested scene by CNN and Transformer
    Lin, Yuanyuan
    Yang, Huicheng
    Hu, Yaocong
    Shuai, Zhen
    Li, Wenting
    PROCEEDINGS OF 2023 7TH INTERNATIONAL CONFERENCE ON ELECTRONIC INFORMATION TECHNOLOGY AND COMPUTER ENGINEERING, EITCE 2023, 2023, : 1092 - 1095
  • [42] A Lightweight Object Counting Network Based on Density Map Knowledge Distillation
    Shen, Zhilong
    Li, Guoquan
    Xia, Ruiyang
    Meng, Hongying
    Huang, Zhengwen
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2025, 35 (02) : 1492 - 1505
  • [43] Wi-Monitor: Wi-Fi Channel State Information-Based Crowd Counting with Lightweight and Low-Cost IoT Devices
    Kitagishi, Takekazu
    Hangli, Ge
    Michikata, Takashi
    Koshizuka, Noboru
    INTERNET OF THINGS, GIOTS 2022, 2022, 13533 : 135 - 148
  • [44] Virtual Classification: Modulating Domain-Specific Knowledge for Multidomain Crowd Counting
    Guo, Mingyue
    Chen, Binghui
    Yan, Zhaoyi
    Wang, Yaowei
    Ye, Qixiang
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 2958 - 2972
  • [45] CrowdDCNN: Deep convolution neural network for real-time crowd counting on IoT edge
    Chavan, Rugved
    Kanamarlapudi, Aravind
    Rani, Geeta
    Thakkar, Priyam
    Dhaka, Vijaypal Singh
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 126
  • [46] Convolutional Neural Network Method for Crowd Counting Improved using Involution Operator
    Li Zhaoxin
    Lu Shuhua
    Lan Lingqiang
    Liu Qiyuan
    LASER & OPTOELECTRONICS PROGRESS, 2022, 59 (18)
  • [47] Multimodal fusion and knowledge distillation for improved anomaly detection
    Lu, Meichen
    Chai, Yi
    Xu, Kaixiong
    Chen, Weiqing
    Ao, Fei
    Ji, Wen
    VISUAL COMPUTER, 2024,
  • [48] A Knowledge-Based Battery Controller for IoT Devices
    Canada-Bago, Joaquin
    Fernandez-Prieto, Jose-Angel
    JOURNAL OF SENSOR AND ACTUATOR NETWORKS, 2022, 11 (04)
  • [49] KNOWLEDGE DISTILLATION FOR IMPROVED ACCURACY IN SPOKEN QUESTION ANSWERING
    You, Chenyu
    Chen, Nuo
    Zou, Yuexian
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7793 - 7797
  • [50] Ensemble Knowledge Distillation for Learning Improved and Efficient Networks
    Asif, Umar
    Tang, Jianbin
    Harrer, Stefan
    ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 953 - 960