Deep Alternate Kernel Fused Self-Attention Model-Based Lung Nodule Classification

Cited: 0
Authors
Saritha, R. Rani [1 ]
Sangeetha, V. [1 ]
Affiliations
[1] Karpagam Acad Higher Educ, Dept Comp Sci, Coimbatore, Tamil Nadu, India
Keywords
pulmonary nodules; nodule detection; nodule classification; deep learning; convolutional neural networks; computer-aided diagnosis; medical imaging; AUTOMATIC DETECTION; SEGMENTATION;
DOI
10.12720/jait.15.11.1242-1251
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
Lung cancer is a leading cause of death when diagnosis is delayed and treatment is inadequate. There is therefore a need for a computer-aided detection method that can determine from Computerized Tomography (CT) scans whether a nodule is benign or malignant, avoiding diagnostic delays. This study proposes a novel architecture, the Deep Alternate Kernel Fused Self-Attention Model (DAKFSAM), which exploits the characteristics of the residual network in different forms and incorporates the efficiency of the attention model. The model fuses the features extracted by three kinds of alternate kernel models across a three-level process. The self-attention model takes the visual attention features of the multiple kernel flows and merges them into a single representation to improve nodule classification efficiency. Performance is assessed on the Lung Image Database Consortium-Image Database Resource Initiative (LIDC-IDRI) dataset, where the proposed DAKFSAM achieves an F1-score of 94.85%.
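The abstract describes the architecture only at a high level: multiple alternate-kernel convolutional branches whose features are fused by a self-attention block before benign/malignant classification. The sketch below is a minimal, hypothetical PyTorch illustration of that fusion idea; the class names (AlternateKernelBranch, DAKFSAMSketch), kernel-size pairs, layer widths, and the multi-head-attention fusion are assumptions for illustration and do not reproduce the authors' actual DAKFSAM implementation.

```python
# Minimal, hypothetical sketch (not the authors' code): three convolutional
# branches with alternating kernel sizes, fused by multi-head self-attention
# over the per-branch feature vectors, followed by a benign/malignant head.
# Layer widths, kernel pairs, and the fusion scheme are illustrative guesses.
import torch
import torch.nn as nn


class AlternateKernelBranch(nn.Module):
    """One branch whose blocks alternate between two kernel sizes."""

    def __init__(self, in_ch: int, out_ch: int, kernels=(3, 5)):
        super().__init__()
        layers, ch = [], in_ch
        for k in kernels:  # alternate kernel size block by block
            layers += [nn.Conv2d(ch, out_ch, k, padding=k // 2),
                       nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True)]
            ch = out_ch
        self.body = nn.Sequential(*layers)
        self.pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x):
        return self.pool(self.body(x)).flatten(1)  # (B, out_ch) descriptor


class DAKFSAMSketch(nn.Module):
    """Three alternate-kernel branches fused by multi-head self-attention."""

    def __init__(self, in_ch=1, feat=128, n_classes=2):
        super().__init__()
        self.branches = nn.ModuleList([
            AlternateKernelBranch(in_ch, feat, kernels=(3, 5)),
            AlternateKernelBranch(in_ch, feat, kernels=(5, 7)),
            AlternateKernelBranch(in_ch, feat, kernels=(3, 7)),
        ])
        self.attn = nn.MultiheadAttention(feat, num_heads=4, batch_first=True)
        self.head = nn.Linear(feat, n_classes)  # benign vs. malignant logits

    def forward(self, x):
        # Stack the branch descriptors as a 3-token sequence: (B, 3, feat).
        tokens = torch.stack([branch(x) for branch in self.branches], dim=1)
        fused, _ = self.attn(tokens, tokens, tokens)  # self-attention fusion
        return self.head(fused.mean(dim=1))           # pool tokens, classify


if __name__ == "__main__":
    logits = DAKFSAMSketch()(torch.randn(2, 1, 64, 64))  # dummy CT patches
    print(logits.shape)  # torch.Size([2, 2])
```

In this sketch each branch alternates between two kernel sizes, the three branch descriptors are treated as a three-token sequence, and self-attention weights how the branches contribute to the fused representation; the paper's actual fusion levels and kernel configurations may differ.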
Pages: 1242-1251
Number of pages: 10
Related Papers
50 records in total
  • [1] Enhanced Transformer-Based Deep Kernel Fused Self Attention Model for Lung Nodule Segmentation and Classification
    Saritha, R. Rani
    Gunasundari, R.
    ARCHIVES FOR TECHNICAL SCIENCES, 2024, (31): 175 - 191
  • [2] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu, Siqi
    Wei, Jiangshu
    Liu, Gang
    Zhou, Bei
    PeerJ Computer Science, 2023, 9
  • [3] Lung Nodule Detection Based on Spike-Driven Self-Attention YOLO
    Wei, Xiaoqing
    Lv, Yuchao
    Wang, Hui
    Yang, Peiyin
    Dong, Zheng
    Liu, Ju
    Wu, Qiang
    ADVANCES IN SWARM INTELLIGENCE, PT II, ICSI 2024, 2024, 14789 : 187 - 196
  • [4] SABDM: A self-attention based bidirectional-RNN deep model for requirements classification
    Kaur, Kamaljit
    Kaur, Parminder
    JOURNAL OF SOFTWARE-EVOLUTION AND PROCESS, 2024, 36 (02)
  • [5] Kernel Self-Attention for Weakly-supervised Image Classification using Deep Multiple Instance Learning
    Rymarczyk, Dawid
    Borowa, Adriana
    Tabor, Jacek
    Zielinski, Bartosz
    2021 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2021), 2021, : 1720 - 1729
  • [6] Question classification task based on deep learning models with self-attention mechanism
    Mondal S.
    Barman M.
    Nag A.
    Multimedia Tools and Applications, 2025, 84 (10) : 7777 - 7806
  • [7] Point Cloud Classification Segmentation Model Based on Self-Attention and Edge Convolution
    Shen, Lu
    Yang, Jiazhi
    Zhou, Guoqing
    Huo, Jiaxin
    Chen, Mengqiang
    Yu, Guangwang
    Zhang, Yuyang
    Computer Engineering and Applications, 2023, 59 (19) : 106 - 113
  • [8] A Self-attention Based LSTM Network for Text Classification
    Jing, Ran
    2019 3RD INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2019), 2019, 1207
  • [9] Web service classification based on self-attention mechanism
    Jia, Zhichun
    Zhang, Zhiying
    Dong, Rui
    Yang, Zhongxuan
    Xing, Xing
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 2164 - 2169