Deep learning supported breast cancer classification with multi-modal image fusion

Cited by: 0
Authors
Hamdy, Eman [1 ]
Zaghloul, Mohamed Saad [2 ]
Badawy, Osama [1 ]
Affiliations
[1] Arab Acad Sci Technol & Maritime Transport, Coll Comp & Informat Technol, Alex, Egypt
[2] Arab Acad Sci Technol & Maritime Transport, Coll Engn & Technol, Alex, Egypt
Keywords
Breast Neoplasms/classification; Deep learning; DenseNet; Diagnostic imaging; Multimodal Imaging;
DOI
10.1109/ACIT53391.2021.9677099
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Deep learning can support the early diagnosis of breast cancer, but relying on a single image modality risks missed tumors or false diagnoses. Combining two image modalities (mammography and ultrasound) and fusing the information they provide can significantly improve classification accuracy. Dense connections have attracted considerable interest in computer vision because they improve gradient flow and provide deep supervision throughout training. In particular, DenseNet connects every layer to subsequent layers in a feed-forward fashion, which has yielded strong performance on natural image classification tasks. The proposed DenseNet-201, with connections within the same path and across different paths, is free to learn more complex combinations of the modalities. Multi-modal images are used to collect features from diverse views and provide complementary information. The proposed method achieved 93.83% accuracy, 93.83% recall, 93.83% precision, 95.61% area under the curve, and 93.8% F1 score. These results in diagnosing breast cancer from ultrasound and mammogram images show better performance than previous methods in assisting specialists.
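As an illustration of the two-modality fusion described in the abstract, the following is a minimal PyTorch sketch assuming a late-fusion arrangement: one torchvision DenseNet-201 backbone per modality, pooled features concatenated, then a linear classifier head. The class name DualDenseNetFusion, the 224x224 input size, and the two-class head are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
from torchvision import models

class DualDenseNetFusion(nn.Module):
    # Illustrative two-branch DenseNet-201 late-fusion classifier (an assumption,
    # not the authors' exact architecture): one backbone per modality, pooled
    # features concatenated, then a linear head.
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.mammo_branch = models.densenet201(weights=None).features
        self.ultrasound_branch = models.densenet201(weights=None).features
        self.pool = nn.AdaptiveAvgPool2d(1)
        # DenseNet-201 produces 1920-channel feature maps before its classifier.
        self.classifier = nn.Linear(1920 * 2, num_classes)

    def forward(self, mammo, ultrasound):
        f1 = self.pool(self.mammo_branch(mammo)).flatten(1)
        f2 = self.pool(self.ultrasound_branch(ultrasound)).flatten(1)
        # Late fusion: concatenate per-modality features before classification.
        return self.classifier(torch.cat([f1, f2], dim=1))

if __name__ == "__main__":
    model = DualDenseNetFusion()
    mammo = torch.randn(2, 3, 224, 224)        # dummy mammogram batch
    ultrasound = torch.randn(2, 3, 224, 224)   # dummy ultrasound batch
    print(model(mammo, ultrasound).shape)      # torch.Size([2, 2])

Whether the paper fuses at the feature level (as above) or at an earlier or later stage is not specified in this record; the sketch only shows one common way to combine features from two imaging views.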
Pages: 319 - 325
Number of pages: 7
Related Papers
50 records in total
  • [41] On Multi-modal Fusion Learning in constraint propagation
    Li, Yaoyi
    Lu, Hongtao
    INFORMATION SCIENCES, 2018, 462 : 204 - 217
  • [42] Twitter Demographic Classification Using Deep Multi-modal Multi-task Learning
    Vijayaraghavan, Prashanth
    Vosoughi, Soroush
    Roy, Deb
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 2, 2017, : 478 - 483
  • [43] PowerDetector: Malicious PowerShell Script Family Classification Based on Multi-Modal Semantic Fusion and Deep Learning
    Xiuzhang Yang
    Guojun Peng
    Dongni Zhang
    Yuhang Gao
    Chenguang Li
    CHINA COMMUNICATIONS, 2023, 20 (11) : 202 - 224
  • [44] Electromagnetic signal feature fusion and recognition based on multi-modal deep learning
    Hou C.
    Zhang X.
    Chen X.
    INTERNATIONAL JOURNAL OF PERFORMABILITY ENGINEERING, 2020, 16 (06): 941 - 949
  • [45] Deep Learning Based Multi-Modal Fusion Architectures for Maritime Vessel Detection
    Farahnakian, Fahimeh
    Heikkonen, Jukka
    REMOTE SENSING, 2020, 12 (16)
  • [46] Multi-Modal Physiological Data Fusion for Affect Estimation Using Deep Learning
    Hssayeni, Murtadha D.
    Ghoraani, Behnaz
    IEEE ACCESS, 2021, 9 : 21642 - 21652
  • [47] Multi-modal Fusion Brain Tumor Detection Method Based on Deep Learning
    Yao Hong-ge
    Shen Xin-xia
    Li Yu
    Yu Jun
    Lei Song-ze
    ACTA PHOTONICA SINICA, 2019, 48 (07)
  • [48] Cardiovascular disease detection based on deep learning and multi-modal data fusion
    Zhu, Jiayuan
    Liu, Hui
    Liu, Xiaowei
    Chen, Chao
    Shu, Minglei
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 99
  • [49] Classifying Excavator Operations with Fusion Network of Multi-modal Deep Learning Models
    Kim, Jin-Young
    Cho, Sung-Bae
    14TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING MODELS IN INDUSTRIAL AND ENVIRONMENTAL APPLICATIONS (SOCO 2019), 2020, 950 : 25 - 34
  • [50] Deep-Learning-Based Multi-Modal Fusion for Fast MR Reconstruction
    Xiang, Lei
    Chen, Yong
    Chang, Weitang
    Zhan, Yiqiang
    Lin, Weili
    Wang, Qian
    Shen, Dinggang
    IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, 2019, 66 (07) : 2105 - 2114