Waste Classification by Fine-Tuning Pre-trained CNN and GAN

Cited by: 5
Authors
Alsabei, Amani [1 ]
Alsayed, Ashwaq [1 ]
Alzahrani, Manar [1 ]
Al-Shareef, Sarah [1 ]
Affiliation
[1] Umm Al Qura Univ, Comp Sci Dept, Mecca, Saudi Arabia
Keywords
deep learning; image classification; convolutional neural networks; transfer learning; waste classification; recycling
DOI
10.22937/IJCSNS.2021.21.8.9
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline Classification Code: 0812
Abstract
Waste accumulation is becoming a significant challenge in most urban areas and, if it continues unchecked, is poised to have severe repercussions on the environment and public health. The massive industrialisation of our cities has been accompanied by a commensurate growth in waste creation that has become a bottleneck even for modern waste management systems. While recycling is a viable solution for waste management, accurately classifying waste material for recycling can be daunting. In this study, transfer learning models were proposed to automatically classify waste into six material categories (cardboard, glass, metal, paper, plastic, and trash). The pre-trained models tested were ResNet50, VGG16, InceptionV3, and Xception. Data augmentation was performed using a Generative Adversarial Network (GAN) with various proportions of generated images. Models based on Xception and VGG16 were found to be more robust, whereas models based on ResNet50 and InceptionV3 were sensitive to the added machine-generated images: their accuracy degraded significantly compared to training with no artificial data.
Pages: 65-70
Page count: 6