Waste Classification by Fine-Tuning Pre-trained CNN and GAN

Cited by: 5
Authors
Alsabei, Amani [1 ]
Alsayed, Ashwaq [1 ]
Alzahrani, Manar [1 ]
Al-Shareef, Sarah [1 ]
Institution
[1] Umm Al Qura Univ, Comp Sci Dept, Mecca, Saudi Arabia
Keywords
deep learning; image classification; convolutional neural networks; transfer learning; waste classification; recycling
DOI
10.22937/IJCSNS.2021.21.8.9
Chinese Library Classification (CLC): TP [Automation technology; computer technology]
Discipline Code: 0812
Abstract
Waste accumulation is becoming a significant challenge in most urban areas and, if it continues unchecked, is poised to have severe repercussions for our environment and health. The massive industrialisation of our cities has been accompanied by a commensurate growth in waste creation that has become a bottleneck even for waste management systems. While recycling is a viable solution for waste management, accurately classifying waste material for recycling can be daunting. In this study, transfer learning models were proposed to automatically classify waste into six material categories (cardboard, glass, metal, paper, plastic, and trash). The tested pre-trained models were ResNet50, VGG16, InceptionV3, and Xception. Data augmentation was performed using a Generative Adversarial Network (GAN) with various percentages of generated images. Models based on Xception and VGG16 were found to be more robust, whereas models based on ResNet50 and InceptionV3 were sensitive to the added machine-generated images: their accuracy degraded significantly compared to training without artificial data.
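The paper's own training pipeline is not reproduced here; the following is a minimal, dependency-light NumPy sketch of the core transfer-learning idea the abstract describes: freeze a pre-trained feature extractor and train only a new softmax classification head over the six waste classes. The "backbone" is simulated by a fixed random projection, and the dataset is synthetic — both are stand-ins for illustration only, not the authors' setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# The six material classes used in the paper.
CLASSES = ["cardboard", "glass", "metal", "paper", "plastic", "trash"]
N_CLASSES = len(CLASSES)

# Stand-in for a pre-trained backbone (ResNet50/VGG16/InceptionV3/Xception
# in the paper): a fixed random projection whose weights stay FROZEN --
# the essence of fine-tuning only the top layers.
IMG_DIM, FEAT_DIM = 64, 32
W_frozen = rng.normal(size=(IMG_DIM, FEAT_DIM))

def extract_features(x):
    """Frozen feature extractor: its weights are never updated."""
    return np.tanh((x @ W_frozen) / np.sqrt(IMG_DIM))

# Toy dataset: each class is a random direction in "image" space plus noise.
n_per_class = 40
class_means = rng.normal(size=(N_CLASSES, IMG_DIM))
X = np.vstack([class_means[c] + 0.3 * rng.normal(size=(n_per_class, IMG_DIM))
               for c in range(N_CLASSES)])
y = np.repeat(np.arange(N_CLASSES), n_per_class)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# "Fine-tuning": train only a new softmax head on top of the frozen
# features, by gradient descent on the cross-entropy loss.
feats = extract_features(X)
W_head = np.zeros((FEAT_DIM, N_CLASSES))
onehot = np.eye(N_CLASSES)[y]
for _ in range(300):
    probs = softmax(feats @ W_head)
    grad = feats.T @ (probs - onehot) / len(X)  # no gradient reaches W_frozen
    W_head -= 0.5 * grad

accuracy = float((softmax(feats @ W_head).argmax(axis=1) == y).mean())
print(f"training accuracy with a frozen backbone: {accuracy:.2f}")
```

In a real implementation one would instead load a pre-trained network with ImageNet weights, mark its convolutional base as non-trainable, and train a new dense softmax head; the GAN-generated images the paper evaluates would simply be mixed into the training set at the chosen percentage.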
Pages: 65-70 (6 pages)