An improved classification diagnosis approach for cervical images based on deep neural networks

Cited by: 1
Authors
Wang, Juan [1 ]
Zhao, Mengying [2 ]
Xia, Chengyi [2 ,3 ]
Affiliations
[1] Tianjin Univ Technol, Sch Elect Engn & Automat, Tianjin 300384, Peoples R China
[2] Tianjin Univ Technol, Tianjin Key Lab Intelligence Comp & Novel Software, Tianjin 300384, Peoples R China
[3] Tiangong Univ, Sch Artificial Intelligence, Tianjin 300387, Peoples R China
Keywords
Cervical lesion; Classification diagnosis; Pyramid convolution; Depth-wise separable convolution; Deep neural network; Cancer
DOI
10.1007/s10044-024-01300-0
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In order to enhance the speed and performance of cervical diagnosis, we propose an improved Residual Network (ResNet) that combines pyramid convolution with depth-wise separable convolution to obtain high-quality cervical classification. Since most cervical regions of interest are not located at the center of colposcopy images, we devise a segmentation and extraction algorithm that shifts the region of interest (ROI) toward the image center, which further enhances the classification performance. Extensive experiments indicate that our model not only remains lightweight, but also delivers strong classification predictions: for three-class classification of cervical lesions, the accuracy reaches 91.29%, the precision 89.70%, the sensitivity 88.75%, the specificity 94.98%, the missed-diagnosis rate 11.25% and the misdiagnosis rate 5.02%. Finally, after dividing the colposcopy images into four categories, our results are still better than those obtained in many previous works on cervical image classification. The current work can not only assist doctors in quickly diagnosing cervical diseases, but its classification performance can also meet some clinical requirements in practice.
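The abstract names two technical ingredients: residual blocks that combine pyramid convolution with depth-wise separable convolution, and an ROI center-shift segmentation step applied to the colposcopy images before classification. As a point of clarification on the reported metrics, the missed-diagnosis rate and the misdiagnosis rate are simply the complements of sensitivity and specificity (100% - 88.75% = 11.25% and 100% - 94.98% = 5.02%). The following is a minimal PyTorch sketch of the first ingredient only, a residual block whose parallel branches are depth-wise separable convolutions with increasing kernel sizes; the class names, kernel sizes and channel split are illustrative assumptions, not the authors' exact architecture.

```python
# Illustrative sketch only: a pyramid-convolution residual block built from
# depth-wise separable convolutions, in the spirit of the improved ResNet
# described in the abstract. Kernel sizes and channel split are assumptions.
import torch
import torch.nn as nn


class DepthwiseSeparableConv(nn.Module):
    """Depth-wise conv (one filter per channel) followed by a 1x1 point-wise conv."""

    def __init__(self, in_ch: int, out_ch: int, kernel_size: int):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2,
                                   groups=in_ch, bias=False)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.bn(self.pointwise(self.depthwise(x))))


class PyramidConvBlock(nn.Module):
    """Parallel depth-wise separable branches with different kernel sizes
    (the convolution 'pyramid'), concatenated and added back via a residual skip."""

    def __init__(self, channels: int, kernel_sizes=(3, 5, 7, 9)):
        super().__init__()
        branch_ch = channels // len(kernel_sizes)
        self.branches = nn.ModuleList(
            [DepthwiseSeparableConv(channels, branch_ch, k) for k in kernel_sizes]
        )

    def forward(self, x):
        out = torch.cat([branch(x) for branch in self.branches], dim=1)
        return out + x  # residual connection, as in ResNet


if __name__ == "__main__":
    block = PyramidConvBlock(channels=64)
    dummy = torch.randn(1, 64, 224, 224)  # a colposcopy-sized feature map
    print(block(dummy).shape)  # torch.Size([1, 64, 224, 224])
```

Using several kernel sizes in parallel lets the block see lesion textures at multiple scales, while the depth-wise separable factorization keeps the parameter count low, which is consistent with the lightweight-model claim in the abstract.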
Pages: 14
Related papers
50 records in total
  • [1] Generalizable deep neural networks for image quality classification of cervical images
    Ahmed, Syed Rakin
    Befano, Brian
    Egemen, Didem
    Rodriguez, Ana Cecilia
    Desai, Kanan T.
    Jeronimo, Jose
    Ajenifuja, Kayode O.
    Clark, Christopher
    Perkins, Rebecca
    Campos, Nicole G.
    Inturrisi, Federica
    Wentzensen, Nicolas
    Han, Paul
    Guillen, Diego
    Norman, Judy
    Goldstein, Andrew T.
    Madeleine, Margaret M.
    Donastorg, Yeycy
    Schiffman, Mark
    de Sanjose, Silvia
    Kalpathy-Cramer, Jayashree
    PAVE Study Grp
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [2] A Deep Neural Network for Cervical Cell Classification Based on Cytology Images
    Fang, Ming
    Lei, Xiujuan
    Liao, Bo
    Wu, Fang-Xiang
    IEEE ACCESS, 2022, 10 : 130968 - 130980
  • [3] A stack autoencoders based deep neural network approach for cervical cell classification in pap-smear images
    Singh, S. K.
    Goyal, A.
    Recent Advances in Computer Science and Communications, 2021, 14 (01) : 62 - 70
  • [4] Graph Neural Networks Based Approach for Interpersonal Relationship Classification in Images
    Akay, Simge
    Arica, Nafiz
    2023 31ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU, 2023,
  • [5] ON CLASSIFICATION OF DISTORTED IMAGES WITH DEEP CONVOLUTIONAL NEURAL NETWORKS
    Zhou, Yiren
    Song, Sibo
    Cheung, Ngai-Man
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 1213 - 1217
  • [6] Automated Classification of Auroral Images with Deep Neural Networks
    Shang, Zhiyuan
    Yao, Zhonghua
    Liu, Jian
    Xu, Linli
    Xu, Yan
    Zhang, Binzheng
    Guo, Ruilong
    Wei, Yong
    UNIVERSE, 2023, 9 (02)
  • [7] SIFT and Tensor Based Object Classification in Images Using Deep Neural Networks
    Najva, N.
    Bijoy, Edet K.
    PROCEEDINGS OF 2016 INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE (ICIS), 2016, : 32 - 37
  • [8] Deep Feature Extraction and Classification of Hyperspectral Images Based on Convolutional Neural Networks
    Chen, Yushi
    Jiang, Hanlu
    Li, Chunyang
    Jia, Xiuping
    Ghamisi, Pedram
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2016, 54 (10): : 6232 - 6251
  • [9] A Neuronal Morphology Classification Approach Based on Deep Residual Neural Networks
    Lin, Xianghong
    Zheng, Jianyang
    Wang, Xiangwen
    Ma, Huifang
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT IV, 2018, 11304 : 336 - 348
  • [10] Cancers classification based on deep neural networks and emotional learning approach
    Jafarpisheh, Noushin
    Teshnehlab, Mohammad
    IET SYSTEMS BIOLOGY, 2018, 12 (06) : 258 - 263