Effects of objects and image quality on melanoma classification using deep neural networks

Cited: 12
Authors:
Gazioglu, Bilge S. Akkoca [1 ]
Kamasak, Mustafa E. [1 ]
Affiliation:
[1] Istanbul Tech Univ, Dept Comp Engn, Istanbul, Turkey
Keywords:
Melanoma classification; Deep learning performance; Image quality; Image degradations; SKIN-LESIONS; FEATURES;
DOI:
10.1016/j.bspc.2021.102530
Chinese Library Classification:
R318 [Biomedical Engineering]
Subject Classification Code:
0831
Abstract:
Melanoma is a type of skin cancer with a high mortality rate. Early and accurate diagnosis of melanoma is critical for its prognosis. Recently, deep learning models have dominated CAD systems for the classification of potential melanoma lesions. In clinical settings, capturing impeccable skin images is not always possible: images can be blurry or noisy, have low contrast, or contain additional objects. The aim of this work is to investigate the effects of external objects (ruler, hair) and image quality (blur, noise, contrast) on melanoma classification using commonly used Convolutional Neural Network (CNN) models: ResNet50, DenseNet121, VGG16 and AlexNet. We applied data augmentation, trained the four models separately, and tested them on our six datasets. In our experiments, melanoma images, unlike benign images, can be classified with higher accuracy under contrast changes, and we recommend the ResNet model when image contrast is an issue. Noise degrades the classification accuracy of melanoma significantly more than that of benign lesions, and both classes are sensitive to blur. The best accuracy on the blurred and noisy datasets is obtained with DenseNet. Images that contain a ruler decreased the accuracy, and ResNet achieved the highest accuracy on this set. The highest accuracy was obtained on hairy skin images, since they constitute the maximum number of images in the overall dataset. The evaluated accuracies are 89.22% for the hair set, 86% for the ruler set and 88.81% for the none set. We infer that DenseNet can be used for melanoma classification under image distortions and degradations.
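The paper does not publish code, but the three image-quality degradations it studies (blur, noise, contrast) are standard operations. As an illustrative sketch only, assuming Pillow and NumPy and using hypothetical degradation levels (not the paper's actual parameters), degraded copies of a lesion image could be generated like this:

```python
import numpy as np
from PIL import Image, ImageEnhance, ImageFilter

def degrade(img: Image.Image, kind: str, level: float) -> Image.Image:
    """Apply one of the degradations studied: blur, noise, or contrast."""
    if kind == "blur":
        # Gaussian blur; larger radius = stronger blur
        return img.filter(ImageFilter.GaussianBlur(radius=level))
    if kind == "contrast":
        # level < 1 reduces contrast, level > 1 increases it
        return ImageEnhance.Contrast(img).enhance(level)
    if kind == "noise":
        # Additive Gaussian noise with standard deviation `level`
        arr = np.asarray(img).astype(np.float32)
        arr += np.random.normal(0.0, level, arr.shape)
        return Image.fromarray(np.clip(arr, 0, 255).astype(np.uint8))
    raise ValueError(f"unknown degradation: {kind}")

# Placeholder image standing in for a dermoscopic lesion photo
img = Image.new("RGB", (224, 224), (180, 120, 100))
blurred = degrade(img, "blur", 2.0)
noisy = degrade(img, "noise", 25.0)
low_contrast = degrade(img, "contrast", 0.5)
```

Each degraded copy would then be fed to the trained CNNs (ResNet50, DenseNet121, VGG16, AlexNet) to compare per-class accuracy, as the abstract describes.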
Pages: 9