Tumor detection in breast cancer pathology patches using a Multi-scale Multi-head Self-attention Ensemble Network on Whole Slide Images

Cited by: 0
Authors
Ge, Ruigang [1]
Chen, Guoyue [2]
Saruta, Kazuki [2]
Terata, Yuki [2]
Affiliations
[1] Akita Prefectural Univ, Grad Sch Syst Sci & Technol, Dept Integrated Syst Sci, Akita 0150055, Japan
[2] Akita Prefectural Univ, Dept Informat & Comp Sci, Akita 0150055, Japan
Source
Machine Learning with Applications
Keywords
Convolutional neural network; Convolutional self-attention; Ensemble model; Multi-scale; Tumor detection
DOI
10.1016/j.mlwa.2024.100592
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Breast cancer (BC) is the most common cancer among women globally and one of the leading causes of cancer-related death in women. In the diagnosis of BC, histopathological assessment is the gold standard, and automated tumor detection technologies play a pivotal role in it. Utilizing Convolutional Neural Networks (CNNs) for automated analysis of image patches from Whole Slide Images (WSIs) improves detection accuracy and alleviates the workload of pathologists. However, CNNs often struggle with pathological patches because they lack sufficient contextual information and have limited feature generation capabilities. To address this, we propose a novel Multi-scale Multi-head Self-attention Ensemble Network (MMSEN), which integrates a multi-scale feature generation module, a convolutional self-attention module, and an adaptive feature integration and output module, effectively optimizing the performance of classical CNNs. This design improves the capture of key information and the comprehensive integration of features in WSI pathological patches, significantly enhancing the precision of tumor detection. Validation in a five-fold cross-validation experiment on the PatchCamelyon (PCam) dataset shows that MMSEN achieves a ROC-AUC of 99.01% +/- 0.02%, an F1-score of 98.00% +/- 0.08%, a Balanced Accuracy (B-Acc) of 98.00% +/- 0.08%, and a Matthews Correlation Coefficient (MCC) of 96.00% +/- 0.16% (p < 0.05). These results demonstrate the effectiveness and potential of MMSEN for detecting tumors in pathological patches from WSIs of BC.
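The abstract outlines MMSEN as three components layered on classical CNN backbones: multi-scale feature generation, convolutional self-attention, and adaptive feature integration with an output head. The PyTorch sketch below illustrates one plausible reading of that design for PCam-sized patches; the backbone choices (ResNet-18 and DenseNet-121), embedding size, kernel sizes, and learned softmax fusion weights are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the paper's code): multi-scale features + multi-head
# self-attention per branch, with an ensemble of CNN backbones fused by
# learned weights, for binary tumor-patch classification on PCam.
import torch
import torch.nn as nn
import torchvision.models as tvm


class MultiScaleSelfAttentionHead(nn.Module):
    """One branch: multi-scale convolutions followed by multi-head self-attention."""

    def __init__(self, in_ch: int, embed_dim: int = 256, num_heads: int = 8):
        super().__init__()
        # Multi-scale feature generation: parallel convolutions with different
        # receptive fields, concatenated along the channel axis (assumed design).
        self.scales = nn.ModuleList([
            nn.Conv2d(in_ch, embed_dim // 4, kernel_size=k, padding=k // 2)
            for k in (1, 3, 5, 7)
        ])
        # Convolutional self-attention: spatial positions are treated as tokens.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([s(x) for s in self.scales], dim=1)   # (B, D, H, W)
        tokens = feats.flatten(2).transpose(1, 2)               # (B, H*W, D)
        attended, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attended)                   # residual + norm
        return tokens.mean(dim=1)                               # global average pool


class MMSENSketch(nn.Module):
    """Ensemble of backbone branches with adaptive (learned) feature integration."""

    def __init__(self, num_classes: int = 2):
        super().__init__()
        resnet = tvm.resnet18(weights=None)
        densenet = tvm.densenet121(weights=None)
        # Keep only the convolutional trunks as feature extractors.
        self.backbones = nn.ModuleList([
            nn.Sequential(*list(resnet.children())[:-2]),        # -> 512 channels
            densenet.features,                                    # -> 1024 channels
        ])
        self.heads = nn.ModuleList([
            MultiScaleSelfAttentionHead(512),
            MultiScaleSelfAttentionHead(1024),
        ])
        # Adaptive integration: softmax-normalized learned weights over branches.
        self.branch_weights = nn.Parameter(torch.zeros(len(self.heads)))
        self.classifier = nn.Linear(256, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        branch_feats = [h(b(x)) for b, h in zip(self.backbones, self.heads)]
        w = torch.softmax(self.branch_weights, dim=0)
        fused = sum(wi * f for wi, f in zip(w, branch_feats))
        return self.classifier(fused)


if __name__ == "__main__":
    model = MMSENSketch()
    patches = torch.randn(4, 3, 96, 96)  # PCam patches are 96x96 RGB
    print(model(patches).shape)          # torch.Size([4, 2])
```

Training such a model with cross-entropy loss on PCam and reporting ROC-AUC, F1-score, balanced accuracy, and MCC over five folds would mirror the evaluation protocol described in the abstract.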
Pages: 26
Related papers
50 records in total
  • [21] Remaining Useful Life Prediction of Bearings Based on Multi-head Self-attention Mechanism, Multi-scale Temporal Convolutional Network and Convolutional Neural Network
    Wei, Hao
    Gu, Yu
    Zhang, Qinghua
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 3027 - 3032
  • [22] Non-Invasive Load Decomposition Method Based on Multi-Scale TCN and Multi-Head Self-Attention Mechanism
    Zhang, Yan
    Li, Fei
    Xiao, Yang
    Li, Kai
    Xia, Lei
    Tan, Huilei
    INTERNATIONAL JOURNAL OF MULTIPHYSICS, 2024, 18 (03) : 547 - 556
  • [23] Multi-Scale Generative Adversarial Network With Multi-Head External Attention for Image Inpainting
    Chen, Gang
    Feng, Qing
    He, Xiu
    Yao, Jian
    IEEE ACCESS, 2024, 12 : 133456 - 133468
  • [24] Drug-Target Interaction Prediction Using Multi-Head Self-Attention and Graph Attention Network
    Cheng, Zhongjian
    Yan, Cheng
    Wu, Fang-Xiang
    Wang, Jianxin
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2022, 19 (04) : 2208 - 2218
  • [25] SPEECH ENHANCEMENT USING SELF-ADAPTATION AND MULTI-HEAD SELF-ATTENTION
    Koizumi, Yuma
    Yatabe, Kohei
    Delcroix, Marc
    Masuyama, Yoshiki
    Takeuchi, Daiki
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 181 - 185
  • [26] A MULTI-SCALE SELF-ATTENTION NETWORK TO DISCRIMINATE PULMONARY NODULES
    Moreno, Alejandra
    Rueda, Andrea
    Martinez, Fabio
    2022 IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (IEEE ISBI 2022), 2022,
  • [27] Text summarization based on multi-head self-attention mechanism and pointer network
    Qiu, Dong
    Yang, Bing
    COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (01) : 555 - 567
  • [29] MSnet: Multi-Head Self-Attention Network for Distantly Supervised Relation Extraction
    Sun, Tingting
    Zhang, Chunhong
    Ji, Yang
    Hu, Zheng
    IEEE ACCESS, 2019, 7 : 54472 - 54482
  • [30] A Supervised Multi-Head Self-Attention Network for Nested Named Entity Recognition
    Xu, Yongxiu
    Huang, Heyan
    Feng, Chong
    Hu, Yue
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14185 - 14193