SwinCNN: An Integrated Swin Transformer and CNN for Improved Breast Cancer Grade Classification

Cited by: 2
Authors
Sreelekshmi, V. [1 ]
Pavithran, K. [2 ]
Nair, Jyothisha J. [1 ]
Affiliations
[1] Amrita Vishwa Vidyapeetham, Amrita Sch Comp, Dept Comp Sci & Engn, Kollam 690525, India
[2] Amrita Inst Med Sci, Dept Med Oncol, Kochi 682041, Kerala, India
Source
IEEE ACCESS | 2024 / Vol. 12
Keywords
Breast cancer; histopathology images; image processing; multi-class classification; convolutional neural network; transformers;
DOI
10.1109/ACCESS.2024.3397667
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Breast cancer is the most commonly diagnosed cancer among women globally, with higher incidence and fatality rates than other cancer types. The World Cancer Report 2020 identifies early detection and rapid treatment as the most effective interventions against this malignancy, and histopathological image analysis plays a central role in early diagnosis. Our work has significant biological and medical potential for automatically processing histopathology images to identify breast cancer and its corresponding grade. Unlike existing models, we grade breast cancer using both local and global features: the proposed model is a hybrid multi-class classifier that combines depth-wise separable convolutional networks (local features) with transformers (global features). To reduce the complexity of the transformer's self-attention module, patch merging is performed. The model classifies pathological images from public breast cancer datasets into their respective categories. It was evaluated on three publicly available datasets: BACH, BreakHis, and IDC, achieving accuracies of 97.80% on BACH, 98.13% on BreakHis, and 98.32% on IDC.
Pages: 68697 - 68710 (14 pages)
Related Papers
50 records in total
  • [1] Fine-Grained Ship Classification by Combining CNN and Swin Transformer
    Huang, Liang
    Wang, Fengxiang
    Zhang, Yalun
    Xu, Qingxia
    REMOTE SENSING, 2022, 14 (13)
  • [2] Efficient Lung Cancer Image Classification and Segmentation Algorithm Based on an Improved Swin Transformer
    Sun, Ruina
    Pang, Yuexin
    Li, Wenfa
    ELECTRONICS, 2023, 12 (04)
  • [3] An improved Swin transformer for sandstone micro-image classification
    Huo, Fengcai
    Li, Hongjiang
    Dong, Hongli
    Ren, Weijian
    GEOENERGY SCIENCE AND ENGINEERING, 2025, 247
  • [4] Mammographic Breast Composition Classification Using Swin Transformer Network
    Tsai, Kuen-Jang
    Yeh, Wei-Cheng
    Kao, Cheng-Yi
    Lin, Ming-Wei
    Hung, Chao-Ming
    Chi, Hung-Ying
    Yeh, Cheng-Yu
    Hwang, Shaw-Hwa
    SENSORS AND MATERIALS, 2024, 36 (05) : 1951 - 1957
  • [5] BTS-ST: Swin transformer network for segmentation and classification of multimodality breast cancer images
    Iqbal, Ahmed
    Sharif, Muhammad
    KNOWLEDGE-BASED SYSTEMS, 2023, 267
  • [6] Classification and Model Explanation of Traditional Dwellings Based on Improved Swin Transformer
    Miao, Shangbo
    Zhang, Chenxi
    Piao, Yushun
    Miao, Yalin
    BUILDINGS, 2024, 14 (06)
  • [7] CSI-Net: CNN Swin Transformer Integrated Network for Infrared Small Target Detection
    Choi, Lammi
    Chung, Won Young
    Park, Chan Gook
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2024, 22 (09) : 2899 - 2908
  • [8] Swin transformer-based fork architecture for automated breast tumor classification
    Uzen, Hueseyin
    Firat, Huseyin
    Atila, Orhan
    Sengur, Abdulkadir
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 256
  • [9] A Swin Transformer and Residualnetwork Combined Model for Breast Cancer Disease Multi-Classification Using Histopathological Images
    Zhuang, Jianjun
    Wu, Xiaohui
    Meng, Dongdong
    Jing, Shenghua
    INSTRUMENTATION, 2024, 11 (01) : 112 - 120
  • [10] Classification of Mobile-Based Oral Cancer Images Using the Vision Transformer and the Swin Transformer
    Song, Bofan
    Raj, Dharma K. C.
    Yang, Rubin Yuchan
    Li, Shaobai
    Zhang, Chicheng
    Liang, Rongguang
    CANCERS, 2024, 16 (05)