ABCD rule and pre-trained CNNs for melanoma diagnosis

Cited by: 18
Authors
Moura, Nayara [1 ]
Veras, Rodrigo [1 ]
Aires, Kelson [1 ]
Machado, Vinicius [1 ]
Silva, Romuere [2 ]
Araujo, Flavio [2 ]
Claro, Maila [1 ]
Affiliations
[1] Univ Fed Piaui, Teresina, PI, Brazil
[2] Univ Fed Piaui, Picos, PI, Brazil
Keywords
Medical image classification; ABCD rule; Pre-trained CNNs; Attribute selection; Multilayer perceptron; CONVOLUTIONAL NEURAL-NETWORKS; PIGMENTED SKIN-LESIONS; TEXTURAL FEATURES; CLASSIFICATION; AGREEMENT;
DOI
10.1007/s11042-018-6404-8
Chinese Library Classification (CLC)
TP [Automation technology; Computer technology];
Discipline classification code
0812 ;
Abstract
Skin cancer is the most common type of cancer, accounting for more than half of all cancer diagnoses. Melanoma is the least frequent skin cancer, but it is the most serious: it has a high potential for metastasis and can be fatal. However, melanoma is almost always curable when detected in its early stages. In this context, computational methods for processing and analyzing skin lesion images have been studied and developed. This work proposes a computational approach to assist dermatologists in classifying skin lesions in dermoscopic images as melanoma or non-melanoma. The proposed methodology classifies skin lesions using a descriptor that combines the ABCD rule (Asymmetry, Border, Color, and Diameter) with features extracted from pre-trained Convolutional Neural Networks (CNNs). The features were selected according to their gain ratios and used as input to a MultiLayer Perceptron (MLP) classifier. To validate the proposed methodology, we built a new database by joining two distinct databases from the literature. The proposed method achieved an accuracy of 94.9% and a Kappa index of 89.2%, which is considered excellent.
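The abstract's feature-selection step ranks features by their gain ratio (Quinlan's information gain normalized by split information). As a minimal sketch of that criterion, the snippet below computes the gain ratio of a single discretized feature against a melanoma / non-melanoma label; the feature name (`asymmetry`) and its binning are hypothetical illustrations, not details from the paper.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (base 2) of a list of discrete values."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def gain_ratio(feature, labels):
    """Information gain of `feature` w.r.t. `labels`, normalized by
    the feature's split information (Quinlan's gain ratio)."""
    n = len(labels)
    # Expected label entropy after partitioning the samples on the feature.
    cond = 0.0
    for v in set(feature):
        subset = [lab for f, lab in zip(feature, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    gain = entropy(labels) - cond
    split_info = entropy(feature)
    return gain / split_info if split_info > 0 else 0.0

# Toy example: a hypothetical binarized "asymmetry" descriptor vs. labels.
asymmetry = ["high", "high", "low", "low", "high", "low"]
label     = ["mel",  "mel",  "ben", "ben", "mel",  "ben"]
print(gain_ratio(asymmetry, label))  # → 1.0 (perfectly discriminative)
```

In a full pipeline, this score would be computed for every ABCD and CNN feature, and only the top-ranked features fed to the MLP.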
Pages: 6869-6888
Page count: 20
Related papers
50 records total
  • [41] Pre-trained Affective Word Representations
    Chawla, Kushal
    Khosla, Sopan
    Chhaya, Niyati
    Jaidka, Kokil
    [J]. 2019 8TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2019,
  • [42] Pre-trained transformers: an empirical comparison
    Casola, Silvia
    Lauriola, Ivano
    Lavelli, Alberto
    [J]. MACHINE LEARNING WITH APPLICATIONS, 2022, 9
  • [43] Implicit Stereotypes in Pre-Trained Classifiers
    Dehouche, Nassim
    [J]. IEEE ACCESS, 2021, 9 : 167936 - 167947
  • [44] Detecting Backdoors in Pre-trained Encoders
    Feng, Shiwei
    Tao, Guanhong
    Cheng, Siyuan
    Shen, Guangyu
    Xu, Xiangzhe
    Liu, Yingqi
    Zhang, Kaiyuan
    Ma, Shiqing
    Zhang, Xiangyu
    [J]. arXiv, 2023,
  • [45] Pre-trained Models for Sonar Images
    Valdenegro-Toro, Matias
    Preciado-Grijalva, Alan
    Wehbe, Bilal
    [J]. OCEANS 2021: SAN DIEGO - PORTO, 2021,
  • [46] Efficiently Robustify Pre-Trained Models
    Jain, Nishant
    Behl, Harkirat
    Rawat, Yogesh Singh
    Vineet, Vibhav
    [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 5482 - 5492
  • [47] Detecting Backdoors in Pre-trained Encoders
    Feng, Shiwei
    Tao, Guanhong
    Cheng, Siyuan
    Shen, Guangyu
    Xu, Xiangzhe
    Liu, Yingqi
    Zhang, Kaiyuan
    Ma, Shiqing
    Zhang, Xiangyu
    [J]. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 16352 - 16362
  • [48] Pre-Trained Language Models and Their Applications
    Wang, Haifeng
    Li, Jiwei
    Wu, Hua
    Hovy, Eduard
    Sun, Yu
    [J]. ENGINEERING, 2023, 25 : 51 - 65
  • [49] Hyperbolic Pre-Trained Language Model
    Chen, Weize
    Han, Xu
    Lin, Yankai
    He, Kaichen
    Xie, Ruobing
    Zhou, Jie
    Liu, Zhiyuan
    Sun, Maosong
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 3101 - 3112
  • [50] USING PRE-TRAINED TEMPORARY HELP
    ZITO, JM
    [J]. TRAINING AND DEVELOPMENT JOURNAL, 1968, 22 (09): 24 - &