Focus-RCNet: a lightweight recyclable waste classification algorithm based on focus and knowledge distillation

Cited by: 2
Authors
Zheng, Dashun [1 ]
Wang, Rongsheng [1 ]
Duan, Yaofei [1 ]
Pang, Patrick Cheong-Iao [1 ]
Tan, Tao [1 ]
Affiliation
[1] Macao Polytech Univ, Fac Appl Sci, Rua Luis Gonzaga Gomes, Macau 999078, Peoples R China
Keywords
Waste recycling; Waste classification; Knowledge distillation; Lightweight; Attention;
DOI
10.1186/s42492-023-00146-3
CLC number
TP39 [Computer applications];
Subject classification codes
081203 ; 0835 ;
Abstract
Waste pollution is a significant environmental problem worldwide. With the continuous improvement in living standards and the increasing diversity of consumption, the amount of domestic waste generated has grown dramatically and urgently requires further treatment. The rapid development of artificial intelligence has provided an effective solution for automated waste classification; however, the high computational cost and complexity of conventional convolutional neural networks make them unsuitable for real-time embedded applications. In this paper, we propose a lightweight network architecture called Focus-RCNet, designed with reference to the sandglass structure of MobileNetV2, which uses depthwise separable convolutions to extract image features. The Focus module is introduced to the field of recyclable-waste image classification to reduce the spatial dimensionality of features while retaining relevant information. To make the model attend more closely to waste image features while keeping the number of parameters small, we introduce the SimAM attention mechanism. In addition, knowledge distillation was used to further compress the number of parameters in the model. Trained and tested on the TrashNet dataset, the Focus-RCNet model not only achieved an accuracy of 92% but also demonstrated high deployment mobility.
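For illustration, the sketch below shows minimal PyTorch versions of three components named in the abstract: the Focus slicing operation, the parameter-free SimAM attention, and a soft-label knowledge-distillation loss. The module names, layer choices, and hyperparameters (temperature, loss weight, epsilon) are assumptions for exposition, not the authors' exact Focus-RCNet implementation.

```python
# Minimal sketches of the abstract's building blocks (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Focus(nn.Module):
    """Slice the input into four pixel-interleaved sub-images, concatenate them
    along the channel axis (halving spatial resolution without discarding
    pixels), then fuse with a pointwise convolution."""
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(4 * in_ch, out_ch, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        patches = [x[..., ::2, ::2], x[..., 1::2, ::2],
                   x[..., ::2, 1::2], x[..., 1::2, 1::2]]
        return self.conv(torch.cat(patches, dim=1))


class SimAM(nn.Module):
    """Parameter-free attention: weight each activation by an energy term
    derived from its squared deviation from the per-channel spatial mean."""
    def __init__(self, eps: float = 1e-4):
        super().__init__()
        self.eps = eps  # assumed regularization constant

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.shape[2] * x.shape[3] - 1
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)
        v = d.sum(dim=(2, 3), keepdim=True) / n
        energy = d / (4 * (v + self.eps)) + 0.5
        return x * torch.sigmoid(energy)


def distillation_loss(student_logits, teacher_logits, targets,
                      T: float = 4.0, alpha: float = 0.7):
    """Blend softened teacher guidance (KL divergence) with the usual
    hard-label cross-entropy loss; T and alpha are assumed values."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```

In training, a larger pretrained teacher network would supply teacher_logits while the lightweight student is optimized with this loss; only the student is kept at inference time.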
Pages: 9