An attention-based deep learning for acute lymphoblastic leukemia classification

Cited by: 5
|
Authors
Jawahar, Malathy [1 ]
Anbarasi, L. Jani [2 ]
Narayanan, Sathiya [3 ]
Gandomi, Amir H. [4 ,5 ]
Affiliations
[1] CSIR Cent Leather Res Inst, Leather Proc Technol Div, Chennai, India
[2] Vellore Inst Technol, Sch Comp Sci & Engn, Chennai, India
[3] Vellore Inst Technol, Sch Elect Engn, Chennai, India
[4] Univ Technol Sydney, Fac Engn & Informat Technol, Sydney, NSW 2007, Australia
[5] Obuda Univ, Univ Res & Innovat Ctr EKIK, H-1034 Budapest, Hungary
Source
SCIENTIFIC REPORTS | 2024, Vol. 14, No. 01
Keywords
Convolutional neural network; Attention layer; Computer-aided diagnosis; Deep learning models; Leukemia; CELLS;
DOI
10.1038/s41598-024-67826-9
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Sciences];
Discipline Classification Code
07 ; 0710 ; 09 ;
Abstract
The bone marrow overproduces immature cells in the malignancy known as Acute Lymphoblastic Leukemia (ALL). In the United States, about 6500 cases of ALL are diagnosed each year in both children and adults, comprising nearly 25% of pediatric cancer cases. Recently, many computer-aided diagnosis (CAD) systems have been proposed to help hematologists reduce workload, produce correct results, and manage enormous volumes of data. Traditional CAD systems rely on hematologists' expertise, specialized features, and subject knowledge. Early detection of ALL can aid radiologists and doctors in making medical decisions. In this study, a Deep Dilated Residual Convolutional Neural Network (DDRNet) is presented for the classification of blood cell images, focusing on eosinophils, lymphocytes, monocytes, and neutrophils. To tackle challenges such as vanishing gradients and to enhance feature extraction, the model incorporates Deep Residual Dilated Blocks (DRDB) for faster convergence. Conventional residual blocks are strategically placed between layers to preserve original information and extract general feature maps. Global and Local Feature Enhancement Blocks (GLFEB) balance weak contributions from shallow layers for improved feature normalization. The global feature from the initial convolution layer, when combined with GLFEB-processed features, reinforces the classification representations. The Tanh function introduces non-linearity. A Channel and Spatial Attention Block (CSAB) is integrated into the network to emphasize or suppress specific feature channels, while fully connected layers transform the data. A sigmoid activation function concentrates on the features relevant for multiclass lymphoblastic leukemia classification. The model was evaluated on a Kaggle dataset (16,249 images) categorized into four classes, with an 80:20 training-to-testing split.
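The record does not detail the DRDB's internals; as a rough illustration of the general idea (a residual skip connection wrapped around a dilated convolution, so the identity path preserves the original signal and eases vanishing gradients), here is a minimal NumPy sketch. The single-channel 1-D signal, "same" padding, and ReLU activation are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """'Same'-padded 1-D dilated convolution over a single channel.

    The dilation spaces kernel taps `dilation` samples apart, which
    widens the receptive field without adding parameters.
    """
    k = len(kernel)
    pad = dilation * (k - 1) // 2
    xp = np.pad(x, pad)
    out = np.zeros_like(x, dtype=float)
    for i in range(len(x)):
        for j in range(k):
            out[i] += kernel[j] * xp[i + j * dilation]
    return out

def residual_dilated_block(x, kernel, dilation):
    """Dilated conv + ReLU, with a skip connection back to the input."""
    y = np.maximum(dilated_conv1d(x, kernel, dilation), 0.0)  # ReLU
    return x + y  # identity path preserves the original information
```

With an identity kernel `[0, 1, 0]` the convolution returns its input unchanged, so the block simply doubles a non-negative signal; with a learned kernel the skip connection lets the block refine rather than replace its input.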
Experimental results showed that the feature-discrimination ability of the DRDB, GLFEB, and CSAB blocks boosted the DDRNet model's F1 score to 0.96 with minimal computational complexity, and yielded classification accuracies of 99.86% and 91.98% on the training and testing data, respectively. The DDRNet model stands out from existing methods due to its high testing accuracy of 91.98%, F1 score of 0.96, minimal computational complexity, and enhanced feature discrimination. The strategic combination of these blocks (DRDB, GLFEB, and CSAB) is designed to address specific challenges in the classification process, leading to improved discrimination of the features crucial for accurate multi-class blood cell image identification. Their effective integration within the model contributes to DDRNet's superior performance.
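The CSAB's exact design is likewise not given in this record; a generic channel-then-spatial attention gate (in the spirit of squeeze-and-excitation-style channel gating followed by a spatial mask) can be sketched as follows. The global-average pooling and sigmoid gating here are assumed, illustrative choices, not the paper's implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(fmap):
    """fmap: (C, H, W). Scale each channel by a sigmoid gate computed
    from its globally average-pooled response, emphasizing or
    suppressing whole feature channels."""
    gate = sigmoid(fmap.mean(axis=(1, 2)))        # (C,), values in (0, 1)
    return fmap * gate[:, None, None]

def spatial_attention(fmap):
    """Scale each spatial location by a sigmoid gate computed from the
    mean across channels, highlighting informative regions."""
    gate = sigmoid(fmap.mean(axis=0))             # (H, W)
    return fmap * gate[None, :, :]

def csab(fmap):
    """Channel attention followed by spatial attention, applied in
    sequence as a Channel and Spatial Attention Block might be."""
    return spatial_attention(channel_attention(fmap))
```

Since both gates lie in (0, 1), the block can only re-weight activations, never amplify them; stacking it after feature-extraction blocks lets the classifier focus on the most discriminative channels and regions.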
Pages: 20
Related Papers
50 records in total
  • [41] Scattering Representation and Attention-Based Residual Learning for Image Classification
    Kaur, Manjeet
    Ahmad, M. Omair
    Swamy, M. N. S.
    2024 IEEE 67TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS, MWSCAS 2024, 2024, : 724 - 728
  • [42] Generalized attention-based deep multi-instance learning
    Zhao, Lu
    Yuan, Liming
    Hao, Kun
    Wen, Xianbin
    MULTIMEDIA SYSTEMS, 2023, 29 (01) : 275 - 287
  • [43] Attention-based deep learning for accurate cell image analysis
    Gao, Xiangrui
    Zhang, Fan
    Guo, Xueyu
    Yao, Mengcheng
    Wang, Xiaoxiao
    Chen, Dong
    Zhang, Genwei
    Wang, Xiaodong
    Lai, Lipeng
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [45] Attention-Based Deep Reinforcement Learning for Edge User Allocation
    Chang, Jiaxin
    Wang, Jian
    Li, Bing
    Zhao, Yuqi
    Li, Duantengchuan
    IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (01): : 590 - 604
  • [46] Attention-based Deep Learning Model for Text Readability Evaluation
    Sun, Yuxuan
    Chen, Keying
    Sun, Lin
    Hu, Chenlu
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [47] Mobile traffic prediction with attention-based hybrid deep learning
    Wang, Li
    Che, Linxiao
    Lam, Kwok-Yan
    Liu, Wenqiang
    Li, Feng
    PHYSICAL COMMUNICATION, 2024, 66
  • [48] aDFR: An Attention-Based Deep Learning Model for Flight Ranking
    Yi, Yuan
    Cao, Jian
    Tan, YuDong
    Nie, QiangQiang
    Lu, XiaoXi
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2020, PT II, 2020, 12343 : 548 - 562
  • [49] Attention-Based Deep Convolutional Capsule Network for Hyperspectral Image Classification
    Zhang, Xiaoxia
    Zhang, Xia
    IEEE ACCESS, 2024, 12 : 56815 - 56823
  • [50] Forward attention-based deep network for classification of breast histopathology image
    Roy, S.
    Jain, P. K.
    Tadepalli, K.
    Reddy, B. P.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (40) : 88039 - 88068